You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files13.
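As a sketch of what such a file contains (the paths and user agents here are hypothetical, not from the text), a robots.txt file placed at the root of a domain might look like:

```
# Block all crawlers from these directories
User-agent: *
Disallow: /private/
Disallow: /search-results/

# Rules for a specific crawler override the general ones
User-agent: Googlebot
Disallow: /staging/
```

A subdomain such as news.example.com would need its own robots.txt at its own root, as the paragraph above notes.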
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, if each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through": website C, even though it has only one inbound link, receives that link from a highly popular site (B), while site E does not.
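The link analysis described above can be sketched as a simplified PageRank computation. The graph below is an illustrative assumption loosely matching the diagram's description (sites A, D, and E link to B; B links to C; C is assumed to link back to A so every page has an outlink), and the damping factor of 0.85 is the conventional choice, not a value from the text.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Iteratively compute rank scores. `links` maps each page to the
    list of pages it links to; scores sum to 1."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        # Every page keeps a small base rank regardless of links.
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            # A page divides its rank evenly among the pages it links to.
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical graph from the diagram: A, D, E link to B; B links to C.
links = {"A": ["B"], "B": ["C"], "C": ["A"], "D": ["B"], "E": ["B"]}
ranks = pagerank(links)
```

With this graph, B ends up with the highest score, and C, despite its single inbound link, outranks E because its one link comes from the highly ranked B, mirroring the "carry through" effect the paragraph describes.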
Facebook Ads and other social media ad platforms, for example, are pay-per-click platforms that do not fall under the SEM category. Instead of showing your ads to people who are searching for similar content, as search ads do, social media sites introduce your product to people who happen to be browsing through their feeds. These are two very different types of online advertising.

On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]

The code of ethics that is affiliated with traditional marketing can also be applied to social media. However, because social media is so personal and international, there is another list of complications and challenges that come along with being ethical online. With the invention of social media, the marketer no longer has to focus solely on the basic demographics and psychographics gleaned from television and magazines; now they can see what consumers like to hear from advertisers, how they engage online, and what their needs and wants are.[101] The general concept of being ethical while marketing on social network sites is to be honest about the intentions of the campaign, avoid false advertising, be aware of user privacy conditions (which means not using consumers' private information for gain), respect the dignity of persons in the shared online community, and claim responsibility for any mistakes or mishaps that result from your marketing campaign.[102] Most social network marketers use websites like Facebook and MySpace to try to drive traffic to another website.[103] While it is ethical to use social networking websites to spread a message to people who are genuinely interested, many people game the system with auto-friend-adding programs and spam messages and bulletins. Social networking websites are becoming wise to these practices, however, and are effectively weeding out and banning offenders.
Social networking websites are based on building virtual communities that allow consumers to express their needs, wants, and values online. Social media marketing then connects these consumers and audiences to businesses that share the same needs, wants, and values. Through social networking sites, companies can keep in touch with individual followers. This personal interaction can instill a feeling of loyalty in followers and potential customers. Also, by choosing whom to follow on these sites, products can reach a very narrow target audience.[4] Social networking sites also carry much information about what products and services prospective clients might be interested in. Through the use of new semantic analysis technologies, marketers can detect buying signals, such as content shared by people and questions posted online. An understanding of buying signals can help salespeople target relevant prospects and help marketers run micro-targeted campaigns.
In 2007, Google announced a campaign against paid links that transfer PageRank.[30] On June 15, 2009, Google disclosed that they had taken measures to mitigate the effects of PageRank sculpting by use of the nofollow attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links the same way, in order to prevent SEO service providers from using nofollow for PageRank sculpting.[31] As a result of this change, the usage of nofollow led to the evaporation of PageRank. To avoid this, SEO engineers developed alternative techniques that replace nofollowed tags with obfuscated JavaScript and thus permit PageRank sculpting. Additionally, several solutions have been suggested that include the usage of iframes, Flash, and JavaScript.[32]
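For context, the nofollow attribute is declared per link in HTML. A minimal, hypothetical example (the URL is a placeholder):

```html
<!-- The rel="nofollow" hint asks search engines not to pass
     ranking credit (PageRank) through this link -->
<a href="https://example.com/untrusted-page" rel="nofollow">a link</a>
```

PageRank sculpting consisted of applying this attribute selectively, so that a page's outgoing rank would flow only through the links a site owner wanted to boost; the 2009 change described above removed that benefit.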
Marketers target influential people on social media who are recognised as opinion leaders and opinion formers to send messages to their target audiences and amplify the impact of their message. A social media post by an opinion leader can have a much greater impact (via the forwarding or "liking" of the post) than a post by a regular user. Marketers have come to understand that "consumers are more prone to believe in other individuals" whom they trust (Sepp, Liljander, & Gummerus, 2011). OLs and OFs can also send their own messages about products and services they choose (Fill, Hughes, & De Francesco, 2013, p. 216). The reason opinion leaders and formers have such a strong following base is that their opinion is valued or trusted (Clement, Proppe, & Rott, 2007). They can review products and services for their followings, and those reviews can be positive or negative towards the brand. OLs and OFs are people who have social status and, because of their personality, beliefs, values, etc., have the potential to influence other people (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189). They usually have a large number of followers, otherwise known as their reference, membership, or aspirational group (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189). By having an OL or OF support a brand's product by posting a photo, video, or written recommendation on a blog, the following may be influenced, and because they trust the OL/OF there is a high chance of the brand selling more products or building a following base. Having an OL/OF helps spread word of mouth among reference and/or membership groups, e.g. family, friends, and work friends (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189).[81][82][83][84]
The adjusted communication model shows the use of opinion leaders and opinion formers: the sender/source gives the message to many OLs/OFs, who pass the message on along with their personal opinion; the receivers (followers/groups) form their own opinion and send their personal message to their group (friends, family, etc.) (Dahlen, Lange, & Smith, 2010, p. 39).[85]
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose regards prominence more than relevance; website developers should regard SEM with the utmost importance with respect to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in their focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, when they analyzed 2.5 million websites and found that 51.3% of the pages were loaded by a mobile device.[60] Google has been one of the companies utilizing the popularity of mobile usage by encouraging websites to use its Google Search Console Mobile-Friendly Test, which allows companies to measure their website against the search engine results and see how user-friendly it is.
Let’s say, for example, that you run a construction business that helps with home repairs after natural disasters and you want to advertise that service. The official term for the service is “fire restoration,” but keyword research may indicate that customers in your area search instead for “fire repair” or “repair fire damage to house.” By not optimizing for these two keywords, you’ll lose out on a lot of traffic and potential customers, even if “fire restoration” is technically more correct.

A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup28 when showing breadcrumbs.
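A sketch of what such breadcrumb structured data can look like, using schema.org's BreadcrumbList vocabulary in JSON-LD (the page names and URLs here are illustrative, not from the text):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Home",
     "item": "https://example.com/"},
    {"@type": "ListItem", "position": 2, "name": "Books",
     "item": "https://example.com/books"},
    {"@type": "ListItem", "position": 3, "name": "Science Fiction"}
  ]
}
```

The markup mirrors the visual breadcrumb trail: `position` runs from the most general page (usually the root) to the most specific, and the last item, being the current page, can omit its URL.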
Another reason is that if you're using an image as a link, the alt text for that image will be treated similarly to the anchor text of a text link. However, we don't recommend using too many images for links in your site's navigation when text links could serve the same purpose. Lastly, optimizing your image filenames and alt text makes it easier for image search projects like Google Image Search to better understand your images.
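A minimal example of the practice described above, with a descriptive filename and alt text (the URL and wording are hypothetical):

```html
<!-- When the image is a link, the alt text acts like anchor text -->
<a href="https://example.com/beach-guide">
  <img src="/images/beach-sunset.jpg" alt="Sunset over the beach">
</a>
```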

23snaps Amikumu aNobii AsianAve Ask.fm Badoo Cloob Cyworld Diaspora Draugiem.lv Ello Facebook Foursquare Gab Hello Hi5 Highlight Houseparty Idka Instagram IGTV IRC-Galleria Keek LiveJournal Lifeknot LockerDome Marco Polo Mastodon MeetMe Meetup Miaopai micro.blog Minds MixBit Mixi Myspace My World Nasza-klasa.pl Nextdoor OK.ru Path Peach Periscope Pinterest Pixnet Plurk Qzone Readgeek Renren Sina Weibo Slidely Snapchat SNOW Spaces Streetlife StudiVZ Swarm Tagged Taringa! Tea Party Community TikTok Tinder Tout Tuenti TV Time Tumblr Twitter Untappd Vero VK Whisper Xanga Yo

The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page30 that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors31.
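How the custom page is wired up depends on the web server. As one hedged illustration, on an nginx server it might look like this (the file path is a placeholder):

```nginx
# Serve /404.html for any "not found" error
error_page 404 /404.html;

location = /404.html {
    internal;  # only reachable as an error page, not by direct request
}
```

The `/404.html` document itself would then carry the links back to the root page and to popular content that the paragraph above recommends.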
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages index status.
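The sitemaps that webmasters submit through these tools follow the sitemaps.org XML protocol. A minimal example (the URLs and date are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-10-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/about</loc>
  </url>
</urlset>
```

Each `<url>` entry lists one page; optional fields such as `<lastmod>` give crawlers hints about when content changed.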