Another example of when the “nofollow” attribute can come in handy is widget links. If you are using a third party's widget to enrich the experience of your site and engage users, check whether it contains any links that you did not intend to place on your site along with the widget. Some widgets may add links to your site that are not your editorial choice and contain anchor text that you as a webmaster may not control. If removing such unwanted links from the widget is not possible, you can always disable them with the “nofollow” attribute. If you create a widget for functionality or content that you provide, make sure to include the nofollow on links in the default code snippet.
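For instance, a widget's default embed code can ship its attribution link with nofollow already in place. The snippet below is hypothetical (the vendor, URLs, and class name are invented for illustration):

```html
<!-- Hypothetical widget embed: the attribution link carries
     rel="nofollow" so it passes no ranking credit to the vendor -->
<div class="example-weather-widget">
  <iframe src="https://widgets.example.com/weather" title="Weather widget"></iframe>
  <a href="https://widgets.example.com/" rel="nofollow">Widget by example.com</a>
</div>
```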
Planned content begins with the creative/marketing team generating ideas; once the ideas are complete, they are sent off for approval. There are two general ways of doing so. The first is where each sector approves the plan one after another: editor, brand, followed by the legal team (Brito, 2013). Sectors may differ depending on the size and philosophy of the business. The second is where each sector is given 24 hours (or a similarly designated time) to sign off or disapprove; if no action is taken within that period, the original plan is implemented. Planned content is often noticeable to customers and can be unoriginal or lack excitement, but it is also a safer option that avoids unnecessary backlash from the public.[87] Both routes for planned content are time-consuming: in the first, approval can take 72 hours. Although the second route can be significantly shorter, it also carries more risk, particularly in the legal department.
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
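As a sketch of this hierarchy, a page's headings might descend from a single h1 through h2 and h3 subsections (the topic and wording here are invented for illustration):

```html
<!-- Heading levels used in order: one h1, then h2 sections, then h3 subsections -->
<h1>Guide to Growing Tomatoes</h1>
<h2>Choosing a Variety</h2>
<h3>Heirloom Varieties</h3>
<h2>Planting and Care</h2>
<h3>Watering Schedule</h3>
```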
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). This requires pages to contain the keywords people are searching for and to rank highly enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
Mix up your official tweets about specials, discounts, and news with fun, brand-building tweets. Be sure to retweet when a customer has something nice to say about you, and don’t forget to answer people’s questions when possible. Using Twitter as a social media marketing tool revolves around dialog and communication, so be sure to interact as much as possible to nurture and build your following.

Measuring Success with Analytics — You can’t determine the success of your social media marketing strategies without tracking data. Google Analytics can be used as a great social media marketing tool that will help you measure your most triumphant social media marketing techniques, as well as determine which strategies are better off abandoned. Attach tracking tags to your social media marketing campaigns so that you can properly monitor them. And be sure to use the analytics within each social platform for even more insight into which of your social content is performing best with your audience.
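With Google Analytics, campaign links are conventionally tagged with UTM query parameters such as utm_source, utm_medium, and utm_campaign. A tagged link might look like the following (the domain and campaign name are placeholders):

```text
https://www.example.com/spring-sale?utm_source=twitter&utm_medium=social&utm_campaign=spring_sale
```

Each parameter then appears as a traffic dimension in your analytics reports, letting you compare, say, Twitter against Facebook for the same campaign.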
Engagement with the social web means that customers and stakeholders are active participants rather than passive viewers. Examples of these are consumer advocacy groups and groups that criticize companies (e.g., lobby groups or advocacy organizations). Social media use in a business or political context allows all consumers/citizens to express and share an opinion about a company's products, services, business practices, or a government's actions. Each customer, non-customer, or citizen who participates online via social media becomes part of the marketing department (or a challenge to the marketing effort), as other customers read their positive or negative comments or reviews. Getting consumers, potential consumers, or citizens to be engaged online is fundamental to successful social media marketing.[20] With the advent of social media marketing, it has become increasingly important to gain customer interest in products and services. This can eventually be translated into buying behavior, or voting and donating behavior in a political context. New online marketing concepts of engagement and loyalty have emerged which aim to build customer participation and brand reputation.[21]

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.


Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
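If you serve desktop and mobile content from separate URLs, one widely documented way to tell search engines the two versions belong together is a pair of link elements: rel="alternate" on the desktop page pointing to the mobile URL, and rel="canonical" on the mobile page pointing back. The URLs below are placeholders:

```html
<!-- On the desktop page (www.example.com/page) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the corresponding mobile page (m.example.com/page) -->
<link rel="canonical" href="https://www.example.com/page">
```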
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
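On an Apache server, one common way to point visitors at a custom 404 page is the ErrorDocument directive in your site configuration or .htaccess file (the filename here is a placeholder):

```text
# Serve /custom-404.html whenever a requested page is not found
ErrorDocument 404 /custom-404.html
```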
More than three billion people in the world are active on the Internet. Over the years, the Internet has continually gained more and more users, jumping from 738 million in 2000 all the way to 3.2 billion in 2015.[9] Roughly 81% of the current population in the United States has some type of social media profile that they engage with frequently.[10] Mobile phone usage is beneficial for social media marketing because of phones' web browsing capabilities, which allow individuals immediate access to social networking sites. Mobile phones have altered the path-to-purchase process by allowing consumers to easily obtain pricing and product information in real time.[11] They have also allowed companies to constantly remind and update their followers. Many companies are now putting QR (Quick Response) codes along with products for individuals to access the company website or online services with their smart phones. Retailers use QR codes to facilitate consumer interaction with brands by linking the code to brand websites, promotions, product information, and any other mobile-enabled content. In addition, real-time bidding use in the mobile advertising industry is high and rising due to its value for on-the-go web browsing. In 2012, Nexage, a provider of real-time bidding in mobile advertising, reported a 37% increase in revenue each month. Adfonic, another mobile advertisement publishing platform, reported an increase of 22 billion ad requests that same year.[12]
YouTube is the number one place for creating and sharing video content, and it can also be an incredibly powerful social media marketing tool. Many businesses try to create video content with the aim of having their video “go viral,” but in reality those chances are pretty slim. Instead, focus on creating useful, instructive “how-to” videos. These how-to videos also have the added benefit of ranking in the video search results of Google, so don't underestimate the power of video content!
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
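A minimal robots.txt illustrating these rules might look like this (the paths are placeholders; real directories vary by site):

```text
# robots.txt in the root directory of the domain
User-agent: *        # applies to all compliant crawlers
Disallow: /cart/     # keep shopping-cart pages out of the crawl
Disallow: /search    # keep internal search results out of the crawl
```

For excluding a single page rather than a whole directory, the per-page meta tag specific to robots serves the same purpose.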
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]