To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
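As a concrete sketch of this crawl-exclusion check, the snippet below uses Python's standard-library urllib.robotparser to test paths against a site's robots.txt the way a well-behaved crawler would. The example.com domain, the paths, and the sample rules shown in the comments are hypothetical placeholders.

```python
# A minimal sketch of honoring robots.txt, using Python's
# standard-library urllib.robotparser. Domain and paths are hypothetical.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetch and parse the file, as a crawler does on first visit

# A robots.txt that excludes carts and internal search results might read:
#   User-agent: *
#   Disallow: /cart
#   Disallow: /search
for path in ("/products/widget", "/search?q=widgets", "/cart"):
    allowed = rp.can_fetch("*", "https://www.example.com" + path)
    print(path, "->", "crawl" if allowed else "skip")
```

Note that this only reflects the rules at fetch time; as the paragraph above points out, a crawler working from a cached copy of the file may behave differently.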

There has been an increase in social media marketing in sport, as sports teams and clubs recognise the importance of maintaining a rapport with their fans and other audiences through social media.[112] Sports personalities such as Cristiano Ronaldo, who has 40.7 million followers on Twitter and 49.6 million on Instagram, create opportunities for endorsements.[113]


Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn whether Google is having any problems indexing their website, and it also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and tracks the indexing status of web pages.
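As an illustration of the sitemap submission mentioned above, here is a minimal sketch that generates a sitemaps.org-style XML sitemap with Python's standard library. The URLs and dates are hypothetical placeholders; only the 0.9 namespace is part of the actual protocol.

```python
# A minimal sketch of generating the kind of XML sitemap that can be
# submitted via Google's or Bing's webmaster tools. URLs are hypothetical.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"  # the standard namespace
urlset = ET.Element("urlset", xmlns=NS)

for loc, lastmod in [
    ("https://www.example.com/", "2024-01-15"),
    ("https://www.example.com/widgets/", "2024-01-10"),
]:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod  # optional freshness hint

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8",
                             xml_declaration=True)
```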
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
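One way to picture this general-to-specific structure is as a breadcrumb trail derived from the URL path, where each page links back up through its topic listing to the root page. The short sketch below assumes a hypothetical path layout.

```python
# A sketch of the root -> topic listing -> specific topic hierarchy,
# expressed as breadcrumb trails built from URL paths (hypothetical site).
def breadcrumbs(path: str) -> list[str]:
    """Build the trail from the root page down to the given page."""
    parts = [p for p in path.strip("/").split("/") if p]
    trail = ["/"]  # every trail starts at the root page
    for i in range(len(parts)):
        trail.append("/" + "/".join(parts[: i + 1]) + "/")
    return trail

# root page -> related topic listing -> specific topic
print(breadcrumbs("/baseball/cards/1900s/"))
# ['/', '/baseball/', '/baseball/cards/', '/baseball/cards/1900s/']
```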
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags[18] and better snippets for your users[19]. We also have a handy Help Center article on how to create good titles and snippets[20].
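One way to audit pages for this practice is to check each one for a description meta tag. The snippet below is a rough sketch using Python's standard-library html.parser; the sample HTML is hypothetical.

```python
# A minimal sketch of flagging pages that lack a description meta tag,
# using Python's standard-library html.parser. Sample HTML is hypothetical.
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name") == "description":
            self.description = attrs.get("content")

page = ('<html><head><meta name="description" '
        'content="Hand-made widgets, shipped worldwide."></head>'
        '<body>...</body></html>')
parser = MetaDescriptionParser()
parser.feed(page)

if parser.description is None:
    print("missing description meta tag")
else:
    print("description:", parser.description)
```

Run across a whole site, a check like this makes it easy to spot pages that would leave Google to pick its own snippet text.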
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.

As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay-per-click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998. Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11]


On April 24, 2012, many observers began to notice that Google had started to penalize companies that buy links for the purpose of passing PageRank. This Google update was called Penguin, and since then Google has rolled out several further Penguin and Panda updates. SEM, however, has nothing to do with link buying; it focuses on organic SEO and PPC management. As of October 20, 2014, Google had released three official revisions of its Penguin update.
One of the main purposes of employing social media in marketing is as a communications tool that makes companies accessible to those interested in their products and visible to those who have no knowledge of them.[26] These companies use social media to create buzz, and to learn from and target customers. It is the only form of marketing that can reach consumers at each and every stage of the consumer decision journey.[27] Marketing through social media has other benefits as well. Of the top 10 factors that correlate with a strong Google organic search, seven are social media dependent; this means that brands that are less active or inactive on social media tend to show up less often in Google searches.[28] While platforms such as Twitter, Facebook, and Google+ have larger numbers of monthly users, visual-media-sharing mobile platforms garner a higher interaction rate, have registered the fastest growth, and have changed the ways in which consumers engage with brand content. Instagram has an interaction rate of 1.46% with an average of 130 million monthly users, as opposed to Twitter's 0.03% interaction rate with an average of 210 million monthly users.[28] Unlike traditional media, which are often cost-prohibitive for many companies, a social media strategy does not require an astronomical budget.[29]
Social media can be a useful source of market information and a way to hear customer perspectives. Blogs, content communities, and forums are platforms where individuals share their reviews and recommendations of brands, products, and services. Businesses are able to tap into and analyze the customer voices and feedback generated in social media for marketing purposes;[15] in this sense, social media is a relatively inexpensive source of market intelligence which marketers and managers can use to track and respond to consumer-identified problems and to detect market opportunities. For example, the Internet erupted with videos and pictures of the iPhone 6 "bend test", which showed that the coveted phone could be bent by hand pressure. The so-called "bend gate" controversy[16] created confusion amongst customers who had waited months for the launch of the latest iteration of the iPhone. However, Apple promptly issued a statement saying that the problem was extremely rare and that the company had taken several steps to make the mobile device's case stronger and more robust. Unlike traditional market research methods such as surveys, focus groups, and data mining, which are time-consuming and costly and take weeks or even months to analyze, marketers can use social media to obtain "live" or "real-time" information about consumer behavior and viewpoints on a company's brand or products. This can be useful in the highly dynamic, competitive, fast-paced, and global marketplace of the 2010s.
Paid search advertising has not been without controversy, and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports[23][24][25] by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter[26] in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.