Social media marketing involves the use of social networks, consumers' online brand-related activities (COBRAs), and electronic word of mouth (eWOM)[75][76] to advertise online successfully. Social networks such as Facebook and Twitter provide advertisers with information about the likes and dislikes of their consumers.[61] This technique is crucial because it provides businesses with a "target audience".[61] With social networks, information relevant to a user's likes is available to businesses, which then advertise accordingly. Activities such as uploading a picture of your "new Converse sneakers to Facebook"[75] are examples of COBRAs.[75][76] Electronic recommendations and appraisals are a convenient way to promote a product via "consumer-to-consumer interactions".[75] An example of eWOM is an online hotel review;[77] the hotel company can see two possible outcomes depending on its service. Good service results in a positive review, which earns the hotel free advertising via social media; poor service results in a negative review, which can potentially harm the company's reputation.[78]
Link text (also called anchor text) is the visible text inside a link. This text tells users and Google something about the page you're linking to. Links on your page may be internal, pointing to other pages on your site, or external, leading to content on other sites. In either case, the better your anchor text is, the easier it is for users to navigate and for Google to understand what the page you're linking to is about.
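For example, a descriptive anchor is more useful than a generic one. A minimal illustration (the URL and page topic here are hypothetical):

    <!-- Vague: tells users and Google nothing about the target page -->
    <a href="https://example.com/articles/1234">click here</a>

    <!-- Descriptive: the anchor text summarizes the linked page -->
    <a href="https://example.com/articles/1234">guide to writing descriptive link text</a>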
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
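A minimal robots.txt is a plain-text file placed at the root of the host it applies to (for example, https://example.com/robots.txt); the directory names below are hypothetical placeholders:

    # Apply to all crawlers
    User-agent: *
    # Block crawling of internal search results and a private area
    Disallow: /search/
    Disallow: /private/
    # Everything else remains crawlable

Because a robots.txt file only governs the host it is served from, a subdomain such as news.example.com needs its own file at news.example.com/robots.txt, which matches the separate-file requirement noted above.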

Social media can be used not only as public relations and direct marketing tools but also as communication channels targeting very specific audiences with social media influencers and social media personalities, and as effective customer engagement tools.[15] Technologies predating social media, such as broadcast TV and newspapers, can also provide advertisers with a fairly targeted audience, given that an ad placed during a sports game broadcast or in the sports section of a newspaper is likely to be read by sports fans. However, social media websites can target niche markets even more precisely. Using digital tools such as Google AdSense, advertisers can target their ads to very specific demographics, such as people who are interested in social entrepreneurship, political activism associated with a particular political party, or video gaming. Google AdSense does this by looking for keywords in social media users' online posts and comments. It would be hard for a TV station or paper-based newspaper to provide ads that are this targeted (though not impossible, as can be seen with "special issue" sections on niche topics, which newspapers can use to sell targeted ads).
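The keyword-matching idea is straightforward to sketch. The following Python snippet is an illustrative toy, not a description of how Google AdSense actually works; the category names, keywords, and post text are invented:

    # Toy contextual targeting: pick ad categories whose keywords
    # appear in a user's post. Real ad systems use far richer signals.
    AD_CATEGORIES = {
        "social_entrepreneurship": {"nonprofit", "social enterprise", "impact"},
        "video_gaming": {"gamer", "console", "esports", "speedrun"},
    }

    def match_categories(post: str) -> list[str]:
        """Return ad categories whose keywords appear in the post text."""
        text = post.lower()
        return [
            category
            for category, keywords in AD_CATEGORIES.items()
            if any(keyword in text for keyword in keywords)
        ]

    print(match_categories("Watched an esports final on my new console!"))
    # ['video_gaming']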
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density toward a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
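To see why term density was so easy to game, consider a naive density score. This Python sketch is purely illustrative and is not any engine's actual ranking formula:

    # Naive keyword density: the fraction of words on a page that match
    # the query term. Repeating the term inflates the score, which is why
    # early engines that leaned on this signal were easy to manipulate.
    def keyword_density(text: str, term: str) -> float:
        words = text.lower().split()
        if not words:
            return 0.0
        return words.count(term.lower()) / len(words)

    honest = "We sell running shoes and offer free returns on every order"
    stuffed = "shoes shoes shoes buy shoes cheap shoes best shoes shoes"

    print(keyword_density(honest, "shoes"))   # ~0.09
    print(keyword_density(stuffed, "shoes"))  # 0.7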