Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags18 and better snippets for your users19. We also have a handy Help Center article on how to create good titles and snippets20.
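As an illustration, a description meta tag lives in the page's head element. The page and business name below are hypothetical, not taken from this document:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="utf-8">
  <title>Brandon's Baseball Cards - Buy Cards, News, Prices</title>
  <!-- A unique, accurate summary of this specific page, which Google may use as the snippet -->
  <meta name="description" content="Brandon's Baseball Cards provides a large selection of vintage and modern baseball cards for sale. We also offer daily news and card price guides.">
</head>
<body>
  ...
</body>
</html>
```

Each page should get its own description rather than a single boilerplate description repeated site-wide.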
Social media marketing, or SMM, is a form of internet marketing that involves creating and sharing content on social media networks in order to achieve your marketing and branding goals. Social media marketing includes activities like posting text and image updates, videos, and other content that drives audience engagement, as well as paid social media advertising.
Another part of SEM is social media marketing (SMM). SMM is a type of marketing that involves using social media to persuade consumers that one company's products and/or services are valuable.[22] Some of the latest theoretical advances include search engine marketing management (SEMM). SEMM relates to activities including SEO but focuses on return on investment (ROI) management instead of relevant traffic building (as is the case of mainstream SEO). SEMM also integrates organic SEO, which tries to achieve top ranking without paid means, with pay-per-click (PPC) SEO. For example, some of the attention is placed on the web page layout design and how content and information is displayed to the website visitor. SEO and SEM are two pillars of one marketing job, and they run side by side to produce much better results than focusing on only one pillar.
Smartphone - In this document, "mobile" or "mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
Blogging website Tumblr first launched ad products on May 29, 2012.[69] Rather than relying on simple banner ads, Tumblr requires advertisers to create a Tumblr blog so the content of those blogs can be featured on the site.[70] Within one year, Tumblr had created four native ad formats across web and mobile and had more than 100 brands advertising on the platform, with 500 cumulative sponsored posts.
We love paid social advertising because it's a highly cost-effective way to expand your reach. If you play your cards right, you can get your content and offers in front of a huge audience at a very low cost. Most social media platforms offer incredibly granular targeting capabilities, allowing you to focus your budget on exactly the types of people that are most likely to be interested in your business. Below are some tips and resources for getting started with paid social media marketing:
Social networking websites allow individuals, businesses and other organizations to interact with one another and build relationships and communities online. When companies join these social channels, consumers can interact with them directly.[3] That interaction can be more personal to users than traditional methods of outbound marketing and advertising.[4] Social networking sites act as word of mouth, or more precisely, e-word of mouth. The Internet's ability to reach billions across the globe has given online word of mouth a powerful voice and far reach. The ability to rapidly influence buying patterns and product or service acquisition among a growing number of consumers is defined as an influence network.[5] Social networking sites and blogs allow followers to "retweet" or "repost" comments made by others about a product being promoted, which occurs quite frequently on some social media sites.[6] By repeating the message, the user's connections are able to see the message, therefore reaching more people. Because the information about the product is being put out there and is getting repeated, more traffic is brought to the product/company.[4]
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010 a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
Google is one of the Western world's marketing leaders, and search engine marketing is its biggest source of profit.[17] Google's search network is clearly ahead of the Yahoo and Bing networks. The display of organic search results is free, while advertisers are willing to pay for each click of an ad in the sponsored search results.
Look at your short- and long-term goals to choose whether to focus on organic or paid search (or both). It takes time to improve your organic search rankings, but you can launch a paid search campaign tomorrow. However, there are other considerations: the amount of traffic you need, your budget, and your marketing objectives. Once you’ve reviewed the pros and cons, you can select the search strategy that’s right for you.
Sponsored radar – Radar picks up exceptional posts from the whole Tumblr community based on their originality and creativity. It is placed on the right side next to the Dashboard, and it typically earns 120 million daily impressions. Sponsored radar allows advertisers to place their posts there to have an opportunity to earn new followers, reblogs, and likes.
While most of the links to your site will be added gradually, as people discover your content through search or other ways and link to it, Google understands that you'd like to let others know about the hard work you've put into your content. Effectively promoting your new content will lead to faster discovery by those who are interested in the same subject. As with most points covered in this document, taking these recommendations to an extreme could actually harm the reputation of your site.

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[61] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[62] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[63] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
When Googlebot crawls a page, it should see the page the same way an average user does15. For optimal rendering and indexing, always allow Googlebot access to the JavaScript, CSS, and image files used by your website. If your site's robots.txt file disallows crawling of these assets, it directly harms how well our algorithms render and index your content. This can result in suboptimal rankings.
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.3
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files13.
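For illustration, a robots.txt file sits at the root of each host (including each subdomain, as noted above). The paths below are hypothetical, and the Allow lines reflect the separate advice in this document to keep CSS, JavaScript, and image resources crawlable:

```text
# robots.txt for https://example.com/ - each subdomain needs its own copy
User-agent: *
Disallow: /search-results/   # internal result pages not useful in a search engine's results
Disallow: /tmp/

# Keep page resources crawlable so Googlebot can render pages as users see them
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/

Sitemap: https://example.com/sitemap.xml
```

Note that robots.txt controls crawling, not indexing; a disallowed URL can still appear in results if other pages link to it.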
LinkedIn, a professional business-related networking site, allows companies to create professional profiles for themselves as well as their business to network and meet others.[41] Through the use of widgets, members can promote their various social networking activities, such as a Twitter stream or blog entries of their product pages, on their LinkedIn profile page.[42] LinkedIn provides its members the opportunity to generate sales leads and business partners.[43] Members can use "Company Pages", similar to Facebook pages, to create an area that allows business owners to promote their products or services and interact with their customers.[44] Due to the spread of spam mail sent to job seekers, leading companies prefer to use LinkedIn for recruitment rather than a separate job portal. Additionally, companies have voiced a preference for the amount of information that can be gleaned from a LinkedIn profile, versus a limited email.[45]
Measuring Success with Analytics — You can’t determine the success of your social media marketing strategies without tracking data. Google Analytics can be used as a great social media marketing tool that will help you measure your most successful social media marketing techniques, as well as determine which strategies are better off abandoned. Attach tracking tags to your social media marketing campaigns so that you can properly monitor them. And be sure to use the analytics within each social platform for even more insight into which of your social content is performing best with your audience.
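Tracking tags of the kind mentioned above are commonly implemented as UTM query parameters appended to campaign URLs, which Google Analytics uses to attribute traffic. A minimal sketch of a tagging helper, with hypothetical campaign values:

```python
from urllib.parse import urlencode

def tag_campaign_url(base_url, source, medium, campaign):
    """Append standard UTM parameters so analytics tools can attribute the visit."""
    params = urlencode({
        "utm_source": source,      # e.g. the social network the link is posted on
        "utm_medium": medium,      # e.g. "social" or "paid-social"
        "utm_campaign": campaign,  # your internal campaign name
    })
    # Use "&" if the URL already carries a query string, "?" otherwise
    separator = "&" if "?" in base_url else "?"
    return f"{base_url}{separator}{params}"

# Hypothetical link for a spring sale promoted on Facebook:
url = tag_campaign_url("https://example.com/sale", "facebook", "social", "spring_sale")
print(url)  # https://example.com/sale?utm_source=facebook&utm_medium=social&utm_campaign=spring_sale
```

Posting the tagged URL instead of the bare one lets you see in your analytics exactly which platform and campaign drove each visit.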
However, while bidding $1,000 on every keyword and ranking #1 for every relevant search sounds nice in theory, most businesses have to play a balancing game between ranking higher and paying too much for clicks. After all, if it costs $17.56 to rank in position #1, but you can only afford to pay $5.00 per click, bidding $1,000 on a keyword to guarantee yourself the #1 position would be a great way to bid yourself out of business.
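The balancing game above can be sketched numerically. Assuming a hypothetical conversion rate and revenue per sale (neither figure comes from this document), the highest click price you can afford falls out directly:

```python
def max_profitable_cpc(conversion_rate, revenue_per_conversion, target_margin=0.0):
    """Highest cost per click that still meets the target profit margin.

    Expected revenue per click is conversion_rate * revenue_per_conversion;
    paying more than (1 - target_margin) of that loses money on average.
    """
    return conversion_rate * revenue_per_conversion * (1.0 - target_margin)

# Hypothetical shop: 2% of clicks convert, each sale is worth $250,
# and we want to keep a 30% margin on ad spend.
cap = max_profitable_cpc(0.02, 250.0, target_margin=0.30)
print(f"Max affordable bid: ${cap:.2f}")  # $3.50 - so a $17.56 CPC for position #1 is a losing bid
```

With those assumptions, chasing the #1 position would cost roughly five times what each click is worth, which is exactly the "bid yourself out of business" trap described above.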

Facebook and LinkedIn are leading social media platforms where users can hyper-target their ads. Hypertargeting not only uses public profile information but also information users submit but hide from others.[17] There are several examples of firms initiating some form of online dialog with the public to foster relations with customers. According to Constantinides, Lorenzo and Gómez Borja (2008) "Business executives like Jonathan Swartz, President and CEO of Sun Microsystems, Steve Jobs CEO of Apple Computers, and McDonalds Vice President Bob Langert post regularly in their CEO blogs, encouraging customers to interact and freely express their feelings, ideas, suggestions, or remarks about their postings, the company or its products".[15] Using customer influencers (for example, popular bloggers) can be a very efficient and cost-effective method to launch new products or services.[18] Among the political leaders in office, Prime Minister Narendra Modi has the highest number of followers at 40 million, and President Donald Trump ranks second with 25 million followers.[19] Modi employed social media platforms to circumvent traditional media channels to reach out to the young and urban population of India, which is estimated to be 200 million.


Small businesses also use social networking sites to develop their own market research on new products and services. By encouraging their customers to give feedback on new product ideas, businesses can gain valuable insights on whether a product may be accepted by their target market enough to merit full production, or not. In addition, customers will feel the company has engaged them in the process of co-creation—the process in which the business uses customer feedback to create or modify a product or service that fills a need of the target market. Such feedback can take various forms, such as surveys, contests, polls, etc.

When I started out, I pitched SEO packages to local business owners that I met through networking, which is a good way to start building result-oriented business case studies that show the ROI (return on investment) that has been generated from your efforts. Once you have those and you can prove you consistently get results, you’ll be completely indispensable because nearly every online business succeeds or fails based on the quality of their digital marketing (and people who are really good at it are rare).
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]
Facebook had an estimated 144.27 million views in 2016, approximately 12.9 million per month.[109] Despite this high volume of traffic, very little has been done to protect the millions of users who log on to Facebook and other social media platforms each month. President Barack Obama tried to work with the Federal Trade Commission (FTC) to attempt to regulate data mining. He proposed the Privacy Bill of Rights, which would protect the average user from having their private information downloaded and shared with third-party companies. The proposed laws would give the consumer more control over what information companies can collect.[107] President Obama was unable to pass most of these laws through Congress, and it is unclear what President Trump will do with regards to social media marketing ethics.

Websites such as Delicious, Digg, Slashdot, Diigo, Stumbleupon, and Reddit are popular social bookmarking sites used in social media promotion. Each of these sites is dedicated to the collection, curation, and organization of links to other websites that users deem to be of good quality. This process is "crowdsourced", allowing amateur social media network members to sort and prioritize links by relevance and general category. Due to the large user bases of these websites, any link from one of them to another, smaller website may cause a flash crowd, a sudden surge of interest in the target website. In addition to user-generated promotion, these sites also offer advertisements within individual user communities and categories.[62] Because ads can be placed in designated communities with a very specific target audience and demographic, they have far greater potential for traffic generation than ads selected simply through cookie and browser history.[63] Additionally, some of these websites have also implemented measures to make ads more relevant to users by allowing users to vote on which ones will be shown on pages they frequent.[64] The ability to redirect large volumes of web traffic and target specific, relevant audiences makes social bookmarking sites a valuable asset for social media marketers.


The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links[41] in addition to their URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
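An XML Sitemap of the kind submitted through Google Search Console is a plain XML file listing the URLs you want crawled. A minimal hypothetical example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-06-04</lastmod>
  </url>
  <url>
    <!-- Pages not discoverable by following links benefit most from being listed here -->
    <loc>https://example.com/orphan-landing-page.html</loc>
  </url>
</urlset>
```

A sitemap is a hint rather than a guarantee: it helps crawlers find pages but does not by itself affect ranking.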