In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine changed the way Google updated its index so that new content would show up in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]

Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remained the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Google holds a similarly dominant market share in a number of other countries.

On April 24, 2012, many site owners noticed that Google had begun to penalize companies buying links in order to pass PageRank. The update was called Penguin. Since then, Google has rolled out several further Penguin and Panda updates. SEM, however, has nothing to do with link buying and focuses on organic SEO and PPC management. As of October 20, 2014, Google had released three official revisions of its Penguin update.
Keyword research and analysis involves three "steps": ensuring the site can be indexed in the search engines, finding the most relevant and popular keywords for the site and its products, and using those keywords on the site in a way that will generate and convert traffic. A follow-on effect of keyword analysis and research is the search perception impact.[13] Search perception impact describes the identified impact of a brand's search results on consumer perception, including title and meta tags, site indexing, and keyword focus. As online searching is often the first step for potential consumers/customers, the search perception impact shapes the brand impression for each individual.
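The first two of those steps can be roughed out programmatically. Below is a minimal Python sketch, assuming a placeholder site, page, and keyword list; it only checks robots.txt crawlability and raw keyword frequency, not meta robots tags, search volume, or conversion data.

```python
# A minimal sketch of the first two "steps": checking that a page is
# crawlable for search engines and measuring how often candidate keywords
# appear in its visible text. The URLs and keywords are placeholders.
import re
import urllib.request
import urllib.robotparser
from collections import Counter

SITE = "https://example.com"                # placeholder site
PAGE = SITE + "/products/widgets"           # placeholder page
CANDIDATE_KEYWORDS = ["widgets", "buy widgets", "cheap widgets"]

# Step 1: can a search engine crawler fetch the page at all?
robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()
print("Crawlable by Googlebot:", robots.can_fetch("Googlebot", PAGE))

# Step 2: how prominent are the candidate keywords in the page text?
html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="ignore")
text = re.sub(r"<[^>]+>", " ", html).lower()    # crude tag stripping for a sketch
counts = Counter({kw: text.count(kw.lower()) for kw in CANDIDATE_KEYWORDS})
for kw, n in counts.most_common():
    print(f"{kw!r}: {n} occurrence(s)")
```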
Engagement in social media for the purpose of a social media strategy is divided into two parts. The first is proactive, regular posting of new online content, such as digital photos, digital videos, text, and conversations, as well as the sharing of content and information from others via weblinks. The second part is reactive: conversing with social media users who reach out to your social media profiles through commenting or messaging.[22] Traditional media such as TV news shows are limited to one-way interaction with customers, or 'push and tell', where only specific information is given to the customer with few or limited mechanisms to obtain customer feedback. Traditional media such as physical newspapers do give readers the option of sending a letter to the editor, but this is a relatively slow process, as the editorial board has to review the letter and decide if it is appropriate for publication. Social media, by contrast, is participative and open; participants are able to instantly share their views on brands, products, and services. Traditional media gave control of the message to the marketer, whereas social media shifts the balance to the consumer or citizen.

There are many reasons explaining why advertisers choose the SEM strategy. First, creating an SEM account is easy and can build traffic quickly, depending on the degree of competition. Shoppers who use a search engine to find information tend to trust and focus on the links shown on the results pages. However, a large number of online sellers do not invest in search engine optimization to obtain higher organic rankings, but prefer to buy paid links. A growing number of online publishers are allowing search engines such as Google to crawl content on their pages and place relevant ads on it.[16] From an online seller's point of view, this is an extension of the payment settlement and an additional incentive to invest in paid advertising projects. As a result, it is virtually impossible for advertisers with limited budgets to maintain the highest rankings in the increasingly competitive search market.

Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
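As an illustration, here is a minimal sketch of such a custom 404 page implemented as a Flask error handler (Flask and the example paths are assumptions, not something the guide prescribes). The important details are the link back to the root page, the links to popular content, and returning a real 404 status code so the error page itself is not indexed as normal content.

```python
# A minimal sketch of a custom 404 handler using Flask (an assumed framework;
# a static 404.html configured in your web server works the same way).
from flask import Flask

app = Flask(__name__)

POPULAR_PAGES = {               # placeholder links to popular content
    "Blog": "/blog",
    "Products": "/products",
    "Contact": "/contact",
}

@app.errorhandler(404)
def page_not_found(error):
    links = "".join(f'<li><a href="{url}">{name}</a></li>'
                    for name, url in POPULAR_PAGES.items())
    body = f"""
    <h1>Sorry, we couldn't find that page.</h1>
    <p><a href="/">Return to the homepage</a> or try one of these pages:</p>
    <ul>{links}</ul>
    """
    return body, 404            # keep the 404 status code

if __name__ == "__main__":
    app.run()
```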
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
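One way to audit the metadata part of this advice is to compare what the different versions of a page actually serve. The following is a minimal sketch, assuming separate desktop and mobile URLs (placeholders) and the third-party BeautifulSoup library; it only compares the title, description meta tag, and canonical link.

```python
# A minimal sketch that checks whether the mobile and desktop versions of a
# page expose the same metadata (title, meta description, canonical link).
import urllib.request
from bs4 import BeautifulSoup

DESKTOP_URL = "https://example.com/product"     # placeholder
MOBILE_URL = "https://m.example.com/product"    # placeholder

def extract_metadata(url):
    html = urllib.request.urlopen(url).read()
    soup = BeautifulSoup(html, "html.parser")
    description = soup.find("meta", attrs={"name": "description"})
    canonical = soup.find("link", attrs={"rel": "canonical"})
    return {
        "title": soup.title.string.strip() if soup.title and soup.title.string else None,
        "description": description.get("content") if description else None,
        "canonical": canonical.get("href") if canonical else None,
    }

desktop, mobile = extract_metadata(DESKTOP_URL), extract_metadata(MOBILE_URL)
for field in desktop:
    status = "OK" if desktop[field] == mobile[field] else "MISMATCH"
    print(f"{field}: {status} (desktop={desktop[field]!r}, mobile={mobile[field]!r})")
```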
In addition to giving you insight into the search volume and competition level of keywords, most keyword research tools will also give you detailed information about the average or current estimated CPC for particular keywords. This is particularly important for businesses with smaller ad budgets, as it allows you to predict whether certain keywords will be truly beneficial to your ad campaigns or whether they'll cost too much.
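To make the budgeting point concrete, here is a minimal sketch of how an estimated CPC can be weighed against what a click is actually worth to you. All of the keywords, CPC figures, conversion rate, and profit values below are made-up placeholders, not real data.

```python
# Compare a keyword tool's estimated CPC against the most you can pay per
# click and still break even. All numbers are hypothetical.

def max_profitable_cpc(conversion_rate, profit_per_conversion):
    """Break-even CPC: the expected profit generated by a single click."""
    return conversion_rate * profit_per_conversion

# Hypothetical keyword data: (keyword, estimated CPC from a research tool)
keywords = [
    ("running shoes", 1.20),
    ("buy running shoes online", 2.80),
    ("best trail running shoes", 0.90),
]

CONVERSION_RATE = 0.02        # 2% of clicks convert (assumption)
PROFIT_PER_CONVERSION = 60.0  # profit per sale (assumption)

ceiling = max_profitable_cpc(CONVERSION_RATE, PROFIT_PER_CONVERSION)
print(f"Maximum profitable CPC: {ceiling:.2f}")
for kw, est_cpc in keywords:
    verdict = "worth bidding on" if est_cpc <= ceiling else "likely too expensive"
    print(f"{kw!r}: estimated CPC {est_cpc:.2f} -> {verdict}")
```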
Google's search engine marketing is one of the Western world's marketing leaders, and search advertising is Google's biggest source of profit.[17] Google's search advertising network is clearly ahead of the Yahoo and Bing networks. The display of unpaid search results is free, while advertisers are willing to pay for each click of an ad in the sponsored search results.
SEM is the wider discipline that incorporates SEO. SEM includes both paid search results (using tools like Google AdWords or Bing Ads, formerly known as Microsoft adCenter) and organic search results (SEO). SEM uses paid advertising with AdWords or Bing Ads, pay-per-click advertising (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click), article submissions, advertising, and making sure SEO has been done. A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time. SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices.

In addition, social media platforms have become extremely aware of their users and collect information about their viewers to connect with them in various ways. The social-networking website Facebook Inc. is quietly working on a new advertising system that would let marketers target users with ads based on the massive amounts of information people reveal on the site about themselves.[104] Some individuals may consider this feature unethical and react negatively because they believe it is an invasion of privacy. On the other hand, some individuals may enjoy this feature because their social network recognizes their interests and sends them particular advertisements pertaining to those interests. Consumers like to network with people who have interests and desires that are similar to their own.[105] Individuals who agree to have their social media profile public should be aware that advertisers can collect information about their interests and use it to send them information and advertisements that boost sales. Managers invest in social media to foster relationships and interact with customers.[106] This is an ethical way for managers to send messages about their advertisements and products to their consumers.

Leaks on the Internet and social networking sites are one of the issues facing traditional advertising. Video and print ads are often leaked to the world via the Internet earlier than they are scheduled to premiere. Social networking sites allow those leaks to go viral and be seen by many users more quickly. The time difference is also a problem facing traditional advertisers. When social events occur and are broadcast on television, there is often a time delay between airings on the east coast and west coast of the United States. Social networking sites have become a hub of comment and interaction concerning the event. This allows individuals watching the event on the west coast (time-delayed) to know the outcome before it airs. The 2011 Grammy Awards highlighted this problem. Viewers on the west coast learned who won different awards based on comments made on social networking sites by individuals watching live on the east coast.[92] Since viewers already knew who had won, many tuned out and ratings were lower. All the advertising and promotion put into the event was lost because viewers didn't have a reason to watch.[according to whom?]


Marketers target influential people on social media who are recognised as being opinion leaders and opinion formers to send messages to their target audiences and amplify the impact of their message. A social media post by an opinion leader can have a much greater impact (via the forwarding or "liking" of the post) than a social media post by a regular user. Marketers have come to the understanding that "consumers are more prone to believe in other individuals" who they trust (Sepp, Liljander, & Gummerus, 2011). OLs and OFs can also send their own messages about products and services they choose (Fill, Hughes, & De Francesco, 2013, p. 216). The reason opinion leaders or formers have such a strong following base is that their opinion is valued or trusted (Clement, Proppe, & Rott, 2007). They can review products and services for their followers, and those reviews can be positive or negative towards the brand. OLs and OFs are people who have a social status and, because of their personality, beliefs, values etc., have the potential to influence other people (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189). They usually have a large number of followers, otherwise known as their reference, membership or aspirational group (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189). When an OL or OF supports a brand's product by posting a photo, video or written recommendation on a blog, their followers may be influenced, and because they trust the OL/OF there is a high chance of the brand selling more products or building a following base. Having an OL/OF helps spread word-of-mouth talk amongst reference groups and/or membership groups, e.g. family, friends, work-friends etc. (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189).[81][82][83][84] The adjusted communication model shows the use of opinion leaders and opinion formers. The sender/source gives the message to many OLs/OFs, who pass the message on along with their personal opinion; the receivers (followers/groups) form their own opinion and send their personal message to their group (friends, family etc.) (Dahlen, Lange, & Smith, 2010, p. 39).[85]


When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
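A small Python sketch of that rule (the helper name is ours, not part of any standard): the root with and without a trailing slash is treated as the same URL, while trailing slashes on paths are deliberately left untouched because they identify different URLs.

```python
# Normalize only the root case ("" vs. "/"); preserve trailing slashes on
# paths such as "/fish" vs. "/fish/", which are different URLs.
from urllib.parse import urlsplit, urlunsplit

def normalize_root(url):
    parts = urlsplit(url)
    path = parts.path if parts.path else "/"   # treat "" and "/" as the same root
    return urlunsplit((parts.scheme, parts.netloc, path, parts.query, parts.fragment))

assert normalize_root("https://example.com") == normalize_root("https://example.com/")
assert normalize_root("https://example.com/fish") != normalize_root("https://example.com/fish/")
print("root normalized; path trailing slashes preserved")
```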
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
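A quick way to spot such blocking before it causes problems is to test the page's resources against robots.txt. Here is a minimal sketch using Python's standard robots.txt parser; the site and resource URLs are placeholders for whatever CSS, JavaScript, and image files your page actually loads.

```python
# Check whether page resources (CSS, JavaScript, images) are blocked for
# Googlebot by robots.txt. URLs below are placeholders.
import urllib.robotparser

SITE = "https://example.com"
RESOURCES = [                      # resources the page depends on (placeholders)
    SITE + "/static/css/site.css",
    SITE + "/static/js/app.js",
    SITE + "/images/hero.jpg",
]

robots = urllib.robotparser.RobotFileParser(SITE + "/robots.txt")
robots.read()

for resource in RESOURCES:
    if robots.can_fetch("Googlebot", resource):
        print(f"OK       {resource}")
    else:
        print(f"BLOCKED  {resource}  <- may prevent mobile-friendly detection")
```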
The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
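A minimal sketch of an audit for this advice follows. It flags pages whose description meta tag is missing, very short, or duplicated across pages (the cases where Google is least likely to find a useful snippet). The page URLs, the length threshold, and the BeautifulSoup dependency are assumptions for the sketch, not rules from the guide.

```python
# Flag missing, very short, or duplicated description meta tags.
import urllib.request
from collections import defaultdict
from bs4 import BeautifulSoup

PAGES = [                                # placeholder URLs
    "https://example.com/",
    "https://example.com/products",
    "https://example.com/blog",
]
MIN_LENGTH = 50                          # rough heuristic, not a Google rule

seen = defaultdict(list)
for url in PAGES:
    soup = BeautifulSoup(urllib.request.urlopen(url).read(), "html.parser")
    tag = soup.find("meta", attrs={"name": "description"})
    desc = (tag.get("content") or "").strip() if tag else ""
    if not desc:
        print(f"MISSING description: {url}")
    elif len(desc) < MIN_LENGTH:
        print(f"TOO SHORT ({len(desc)} chars): {url}")
    seen[desc].append(url)

for desc, urls in seen.items():
    if desc and len(urls) > 1:
        print(f"DUPLICATE description shared by: {', '.join(urls)}")
```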

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]

As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay per click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998. Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11]


As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia, and the Czech Republic, where respectively Baidu, Yahoo! Japan, Naver, Yandex, and Seznam are the market leaders.
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). This requires pages to contain the keywords people are looking for and to rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
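As a rough illustration of the saturation side, the sketch below counts the URLs a site advertises in its own sitemap, which serves as an upper bound on how many pages could be indexed. True saturation (pages actually indexed) and popularity (backlinks) require data from the search engines or the tools named above; the sitemap URL here is a placeholder.

```python
# Count URLs listed in a site's sitemap as a crude saturation proxy.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://example.com/sitemap.xml"   # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

xml_data = urllib.request.urlopen(SITEMAP_URL).read()
root = ET.fromstring(xml_data)
urls = [loc.text for loc in root.findall("sm:url/sm:loc", NS)]

print(f"{len(urls)} URLs listed in the sitemap (upper bound on saturation)")
for u in urls[:10]:                               # show a small sample
    print(" ", u)
```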
Smartphone - In this document, "mobile" or "mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
The platform of social media is another channel or site that businesses and brands must seek to influence the content of. In contrast with pre-Internet marketing, such as TV ads and newspaper ads, in which the marketer controlled all aspects of the ad, with social media, users are free to post comments right below an online ad or an online post by a company about its product. Companies are increasingly using their social media strategy as part of their traditional marketing effort using magazines, newspapers, radio advertisements, and television advertisements. Since, in the 2010s, media consumers are often using multiple platforms at the same time (e.g., surfing the Internet on a tablet while watching a streaming TV show), marketing content needs to be consistent across all platforms, whether traditional or new media. Heath (2006) wrote about the extent of attention businesses should give to their social media sites: it is about finding a balance between posting frequently and not over-posting. More attention needs to be paid to social media sites because people need updates to gain brand recognition. Therefore, a lot more content is needed, and this can often be unplanned content.[86]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]