In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings by doing so; Google's new system, however, punishes sites whose content is not unique.[36]

The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it focuses primarily on spammy links[38] by gauging the quality of the sites the links come from.

The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search,' in which the system pays more attention to each word in a query in order to match pages to the meaning of the whole query rather than to a few individual words.[39] With regard to the changes made to search engine optimization for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and rely on its creators as 'trusted' authors.
To this end, companies make use of platforms such as Facebook, Twitter, YouTube, and Instagram to reach audiences much wider than through traditional print, TV, or radio advertisements alone, and at a fraction of the cost, as most social networking sites can be used at little or no cost (though some sites charge companies for premium services). This has changed how companies interact with customers, as a substantial percentage of consumer interactions now take place on online platforms with much higher visibility. Customers can post reviews of products and services, rate customer service, and ask questions or voice concerns directly to companies through social media platforms. According to Measuring Success, over 80% of consumers use the web to research products and services.[30] Social media marketing is therefore also used by businesses to build relationships of trust with consumers.[31] To this aim, companies may hire personnel specifically to handle these social media interactions, usually under the title of online community manager. Handling these interactions satisfactorily can increase consumer trust. To this end, and to repair the public's perception of a company, three steps are taken to address consumer concerns: identifying the extent of the social chatter, engaging influencers to help, and developing a proportional response.[32]
Social media often feeds the discovery of new content such as news stories, and “discovery” is a search activity. Social media can also help build links that in turn support SEO efforts. Many people also perform searches on social media sites to find social media content. Social connections may likewise affect the relevancy of some search results, either within a social media network or on a ‘mainstream’ search engine.
Social media marketing provides organizations with a way to connect with their customers. However, organizations must protect their information as well as closely watch comments and concerns on the social media they use. A flash poll of 1,225 IT executives from 33 countries revealed that social media mishaps cost organizations a combined $4.3 million in damages in 2010.[93] The top three social media incidents organizations faced during the previous year were employees sharing too much information in public forums, loss or exposure of confidential information, and increased exposure to litigation.[93] Due to the viral nature of the Internet, a mistake by a single employee has in some cases been shown to result in devastating consequences for an organization. One example is designer Kenneth Cole's Twitter mishap in 2011, when the company tweeted, "Millions are in uproar in #Cairo. Rumor is they heard our new spring collection is now available online at [Kenneth Cole's website]".[94] This reference to the 2011 Egyptian revolution was widely objected to on the Internet.[94] Kenneth Cole realized his mistake shortly afterward and responded with a statement apologizing for the tweet.[95]
The expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources can help users recognize an article's expertise. On scientific topics, it is good practice to represent well-established consensus where such consensus exists.
A navigational page is a simple page on your site that displays the structure of your website, usually as a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page to get good crawl coverage of the pages on your site, it is mainly aimed at human visitors.
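As an illustration, a navigational page can be as simple as a nested list of links. The sketch below uses hypothetical page names and URLs; it is not taken from any real site.

```html
<!-- Minimal navigational page: all names and URLs are hypothetical. -->
<!DOCTYPE html>
<html>
  <head><title>Site map</title></head>
  <body>
    <h1>Site map</h1>
    <ul>
      <li><a href="/products/">Products</a>
        <ul>
          <li><a href="/products/widgets.html">Widgets</a></li>
          <li><a href="/products/gadgets.html">Gadgets</a></li>
        </ul>
      </li>
      <li><a href="/about.html">About us</a></li>
      <li><a href="/contact.html">Contact</a></li>
    </ul>
  </body>
</html>
```

Plain, crawlable anchor links like these let both visitors and search engine crawlers reach every page in a couple of clicks.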

Paid inclusion is a search engine marketing method in its own right, but it is also a tool of search engine optimization: experts and firms can test different approaches to improving rankings and see the results, often within a couple of days instead of weeks or months. Knowledge gained this way can then be used to optimize other web pages without paying the search engine company.


Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
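For illustration, the keyword meta tag sits in a page's HTML head; the contents below are hypothetical. The first tag describes a page accurately, while the second shows the kind of excessive, irrelevant stuffing that led engines to discount the tag.

```html
<!-- Hypothetical examples of the keyword meta tag. An accurate use: -->
<meta name="keywords" content="used books, rare books, bookstore">

<!-- Keyword stuffing: padding the tag with popular but irrelevant terms
     in the hope of ranking for unrelated searches. -->
<meta name="keywords" content="books, books, books, free, cheap, mp3,
  games, weather, sports, lyrics, celebrities">
```

Because the tag is invisible to visitors, nothing forced it to match a page's visible content, which is why search engines eventually stopped trusting it for ranking.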