To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
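The mechanics above can be illustrated with a minimal robots.txt; the disallowed paths here (a shopping cart and an internal search directory) are invented for the example, not prescriptive:

```
# robots.txt, served from the root of the domain
User-agent: *
Disallow: /cart/
Disallow: /search/
```

A single page can instead opt out of indexing with the robots meta tag in its HTML head:

```html
<meta name="robots" content="noindex">
```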

While most of the links to your site will be added gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.
Smartphone - In this document, "mobile" or "mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[41] in addition to its URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
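A minimal XML Sitemap of the kind submitted through Google Search Console looks like the following; the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2019-01-01</lastmod>
  </url>
</urlset>
```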

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
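The random-surfer model can be sketched in a few lines of Python. This is an illustrative power-iteration implementation, not Google's actual system; the five-page link graph and the 0.85 damping factor (the value Page and Brin published) are assumptions for the example.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # with probability (1 - damping) the surfer jumps to a random page
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:  # otherwise the page's rank is shared among its outlinks
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical link graph: B receives links from A, C, and D; C is linked
# only from B; E has no inbound links at all.
graph = {
    "A": ["B"],
    "B": ["C"],
    "C": ["B"],
    "D": ["B"],
    "E": [],
}
ranks = pagerank(graph)
```

Running this, B (many inbound links) ends up with the highest rank, and C, whose single inbound link comes from the high-ranked B, outranks the pages with no strong inbound links, matching the intuition that some links are stronger than others.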
Imagine that you've created the definitive Web site on a subject -- we'll use skydiving as an example. Your site is so new that it's not even listed on any SERPs yet, so your first step is to submit your site to search engines like Google and Yahoo. The Web pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Even with the best information about skydiving on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up going to inferior Web sites because yours isn't in the top results.
Creating high quality content takes a significant amount of at least one of the following: time, effort, expertise, and talent/skill. Content should be factually accurate, clearly written, and comprehensive. So, for example, if you describe your page as a recipe, provide a complete recipe that is easy to follow, rather than just a set of ingredients or a basic description of the dish.
Facebook had an estimated 144.27 million views in 2016, approximately 12.9 million per month.[109] Despite this high volume of traffic, very little has been done to protect the millions of users who log on to Facebook and other social media platforms each month. President Barack Obama tried to work with the Federal Trade Commission (FTC) to regulate data mining. He proposed the Privacy Bill of Rights, which would protect the average user from having their private information downloaded and shared with third-party companies. The proposed laws would give the consumer more control over what information companies can collect.[107] President Obama was unable to pass most of these laws through Congress, and it is unclear what President Trump will do with regard to social media marketing ethics.
There are many reasons why advertisers choose the SEM strategy. First, creating a SEM account is easy and can build traffic quickly, depending on the degree of competition. Shoppers who use a search engine to find information tend to trust and focus on the links shown in the results pages. However, a large number of online sellers do not invest in search engine optimization to obtain higher rankings in the lists of search results, but prefer paid links. A growing number of online publishers are allowing search engines such as Google to crawl content on their pages and place relevant ads on it.[16] From an online seller's point of view, this is an extension of the payment settlement and an additional incentive to invest in paid advertising projects. At the same time, it is virtually impossible for advertisers with limited budgets to maintain the highest rankings in the increasingly competitive search market.
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, if each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites that receive more inbound links, or stronger links, are presumed to be more important and what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. And the links "carry through", such that website C, even though it only has one inbound link, benefits from an inbound link from a highly popular site (B), while site E does not.
Websites such as Delicious, Digg, Slashdot, Diigo, Stumbleupon, and Reddit are popular social bookmarking sites used in social media promotion. Each of these sites is dedicated to the collection, curation, and organization of links to other websites that users deem to be of good quality. This process is "crowdsourced", allowing amateur social media network members to sort and prioritize links by relevance and general category. Due to the large user bases of these websites, a link from one of them to another, smaller website may result in a flash crowd, a sudden surge of interest in the target website. In addition to user-generated promotion, these sites also offer advertisements within individual user communities and categories.[62] Because ads can be placed in designated communities with a very specific target audience and demographic, they have far greater potential for traffic generation than ads selected simply through cookie and browser history.[63] Additionally, some of these websites have also implemented measures to make ads more relevant to users by allowing users to vote on which ones will be shown on pages they frequent.[64] The ability to redirect large volumes of web traffic and target specific, relevant audiences makes social bookmarking sites a valuable asset for social media marketers.
Twitter allows companies to promote their products in short messages known as tweets, limited to 140 characters, which appear on followers' Home timelines.[33] Tweets can contain text, hashtags, photos, videos, animated GIFs, emoji, or links to the product's website and other social media profiles.[34] Twitter is also used by companies to provide customer service.[35] Some companies make support available 24/7 and answer promptly, thus improving brand loyalty and appreciation.
Sponsored radar – Radar picks up exceptional posts from the whole Tumblr community based on their originality and creativity. It is placed on the right side next to the Dashboard, and it typically earns 120 million daily impressions. Sponsored radar allows advertisers to place their posts there to have an opportunity to earn new followers, reblogs, and likes.
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
This involves tracking the volume of visits, leads, and customers to a website from the individual social channel. Google Analytics[110] is a free tool that shows the behavior and other information, such as demographics and device type used, of website visitors from social networks. This and other commercial offers can aid marketers in choosing the most effective social networks and social media marketing activities.
Organic search (SEO): When you enter a keyword or phrase into a search engine like Google or Yahoo!, the organic results are displayed in the main body of the page. When your prospects search for information about your products and services, you want to rank highly in search engine results. By “optimizing” your site, you can improve your ranking for important search terms and phrases (“keywords”). You can also improve your rank by getting other important sites to link to yours.

Paid search advertising has not been without controversy and the issue of how search engines present advertising on their search result pages has been the target of a series of studies and reports[23][24][25] by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter[26] in 2002 about the importance of disclosure of paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.

Mix up your official tweets about specials, discounts, and news with fun, brand-building tweets. Be sure to retweet when a customer has something nice to say about you, and don’t forget to answer people’s questions when possible. Using Twitter as a social media marketing tool revolves around dialog and communication, so be sure to interact as much as possible to nurture and build your following.


However, while bidding $1,000 on every keyword and ranking #1 for every relevant search sounds nice in theory, most businesses have to play a balancing game between ranking higher and paying too much for clicks. After all, if it costs $17.56 to rank in position #1, but you can only afford to pay $5.00 per click, bidding $1,000 on a keyword to guarantee yourself the #1 position would be a great way to bid yourself out of business.
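The balancing game above is break-even arithmetic: the most you can afford per click is roughly your conversion rate times your profit per conversion. A hedged Python sketch of that calculation; the 2% conversion rate and $250 profit figure are invented for illustration.

```python
def max_affordable_cpc(conversion_rate, profit_per_conversion, target_margin=0.0):
    """Highest cost-per-click that still leaves the target profit margin.

    conversion_rate: fraction of clicks that convert (e.g. 0.02 for 2%)
    profit_per_conversion: profit in dollars from one conversion
    target_margin: fraction of profit you want to keep (0.0 = break even)
    """
    return conversion_rate * profit_per_conversion * (1.0 - target_margin)

# If 2% of clicks convert and each conversion is worth $250 in profit,
# breaking even allows at most $5.00 per click -- well under the $17.56
# the example says position #1 costs.
bid_ceiling = max_affordable_cpc(0.02, 250.0)
```

Under these assumed numbers, paying $17.56 per click loses over $12 on every click on average, which is exactly the "bid yourself out of business" scenario described above.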
Popular social media such as Facebook, Twitter, LinkedIn, and other social networks can provide marketers with a hard number for how large their audience is; nevertheless, a large audience may not always translate into a large sales volume. Therefore, an effective SMM cannot be measured by a large audience but rather by vigorous audience activity such as social shares, re-tweets, etc.
Yelp consists of a comprehensive online index of business profiles. Businesses are searchable by location, similar to Yellow Pages. The website is operational in seven different countries, including the United States and Canada. Business account holders are allowed to create, share, and edit business profiles. They may post information such as the business location, contact information, pictures, and service information. The website further allows individuals to write and post reviews about businesses and rate them on a five-point scale. Messaging and talk features are further made available for general members of the website, serving to guide thoughts and opinions.[49]
In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
In May 2014, Instagram had over 200 million users. The user engagement rate of Instagram was 15 times higher than that of Facebook and 25 times higher than that of Twitter.[50] According to Scott Galloway, the founder of L2 and a professor of marketing at New York University's Stern School of Business, the latest studies estimate that 93% of prestige brands have an active presence on Instagram and include it in their marketing mix.[51] When it comes to brands and businesses, Instagram's goal is to help companies reach their respective audiences through captivating imagery in a rich, visual environment.[52] Moreover, Instagram provides a platform where user and company can communicate publicly and directly, making itself an ideal platform for companies to connect with their current and potential customers.[53]
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page[30] that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.[31]
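How the custom page gets served depends on your web server. As one common example, Apache can be pointed at a custom error page with a single directive; the /404.html path is an assumed location for your own page:

```
# Apache httpd configuration: serve a custom page for "not found" errors
ErrorDocument 404 /404.html
```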
Social media can be a useful source of market information and a way to hear customer perspectives. Blogs, content communities, and forums are platforms where individuals share their reviews and recommendations of brands, products, and services. Businesses are able to tap and analyze the customer voices and feedback generated in social media for marketing purposes;[15] in this sense social media is a relatively inexpensive source of market intelligence which can be used by marketers and managers to track and respond to consumer-identified problems and detect market opportunities. For example, the Internet erupted with videos and pictures of the iPhone 6 "bend test", which showed that the coveted phone could be bent by hand pressure. The so-called "bend gate" controversy[16] created confusion amongst customers who had waited months for the launch of the latest rendition of the iPhone. However, Apple promptly issued a statement saying that the problem was extremely rare and that the company had taken several steps to make the mobile device's case stronger and more robust. Unlike traditional market research methods such as surveys, focus groups, and data mining, which are time-consuming and costly and take weeks or even months to analyze, marketers can use social media to obtain "live" or "real-time" information about consumer behavior and viewpoints on a company's brand or products. This can be useful in the highly dynamic, competitive, fast-paced and global marketplace of the 2010s.

In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.
