When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
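In markup terms, this means adding a `rel="nofollow"` attribute to the anchor tags wrapping user-submitted links; the URL and anchor text below are illustrative:

```html
<!-- A user-submitted link in a blog comment (URL is illustrative).
     The rel="nofollow" attribute tells search engines not to pass
     this page's reputation to the linked site. -->
<a href="https://example.com/some-page" rel="nofollow">check out my site</a>
```

Many blog platforms add this attribute to comment links automatically.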
More than three billion people in the world are active on the Internet. Over the years, the Internet has continually gained more users, jumping from 738 million in 2000 to 3.2 billion in 2015.[9] Roughly 81% of the current population in the United States has some type of social media profile that they engage with frequently.[10] Mobile phones are beneficial for social media marketing because their web browsing capabilities give individuals immediate access to social networking sites. Mobile phones have altered the path-to-purchase process by allowing consumers to easily obtain pricing and product information in real time.[11] They have also allowed companies to constantly remind and update their followers. Many companies now put QR (Quick Response) codes on products so that individuals can access the company website or online services with their smartphones. Retailers use QR codes to facilitate consumer interaction with brands by linking the code to brand websites, promotions, product information, and other mobile-enabled content. In addition, real-time bidding is widely used in the mobile advertising industry, and its use is rising because of its value for on-the-go web browsing. In 2012, Nexage, a provider of real-time bidding in mobile advertising, reported a 37% increase in revenue each month. Adfonic, another mobile advertisement publishing platform, reported an increase of 22 billion ad requests that same year.[12]
Facebook Ads and other social media ad platforms, for example, are pay-per-click platforms that do not fall under the SEM category. Instead of showing your ads to people who are searching for related content, as search ads do, social media sites introduce your product to people who happen to be browsing through their feeds. These are two very different types of online advertising.
Planned content begins with the creative/marketing team generating ideas; once the ideas are complete, they are sent off for approval. There are two general ways of doing so. The first is where each sector approves the plan one after another: editor, brand, followed by the legal team (Brito, 2013). Sectors may differ depending on the size and philosophy of the business. The second is where each sector is given 24 hours (or some designated time) to sign off or disapprove; if no action is taken within that period, the original plan is implemented. Planned content is often noticeable to customers and can be unoriginal or lack excitement, but it is also a safer option for avoiding unnecessary backlash from the public.[87] Both routes for planned content are time-consuming: in the example above, the first route to approval takes 72 hours. Although the second route can be significantly shorter, it also carries more risk, particularly in the legal department.
This involves tracking the volume of visits, leads, and customers a website receives from each individual social channel. Google Analytics[110] is a free tool that shows the behavior and other information, such as demographics and device type, of website visitors arriving from social networks. This and other commercial offerings can aid marketers in choosing the most effective social networks and social media marketing activities.
Unplanned content is an 'in the moment' idea, "a spontaneous, tactical reaction" (Cramer, 2014, p. 6). The content could be trending and not have the time to take the planned content route. Unplanned content is posted sporadically and is not scheduled on a calendar (Deshpande, 2014).[88][89] Issues with unplanned content revolve around legal questions and whether the message being sent out represents the business/brand accordingly. If a company sends out a Tweet or Facebook message too hurriedly, it may unintentionally use insensitive language or messaging that alienates some consumers. For example, celebrity chef Paula Deen was criticized after she made a social media post commenting about HIV-AIDS and South Africa; her message was deemed offensive by many observers. The main difference between planned and unplanned content is the time to approve it. Unplanned content must still be approved by marketing managers, but in a much more rapid manner, e.g. one to two hours or less. Sectors may miss errors because of being hurried. When using unplanned content, Brito (2013) says, "be prepared to be reactive and respond to issues when they arise."[87] Brito (2013) writes about having a "crisis escalation plan", because "It will happen". The plan involves breaking the issue down into topics and classifying it into groups. Colour coding potential risks ("identify and flag potential risks") also helps to organise an issue. The problem can then be handled by the correct team and resolved more effectively, rather than by whichever person is at hand trying to solve the situation.[87]
Facebook pages are far more detailed than Twitter accounts. They allow a product to provide videos, photos, longer descriptions, and testimonials, and followers can comment on the product pages for others to see. Facebook can link back to the product's Twitter page, as well as send out event reminders. As of May 2015, 93% of business marketers use Facebook to promote their brand.[36] A study from 2011 attributed 84% of "engagement" (clicks and likes that link back) to Facebook advertising.[37] By 2014, Facebook had restricted the content published from business and brand pages. Adjustments in Facebook algorithms reduced the audience for non-paying business pages (that have at least 500,000 "Likes") from 16% in 2012 down to 2% in February 2014.[38][39][40]

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
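This advisory nature can be seen with Python's standard-library robots.txt parser: a compliant crawler checks the rules before fetching, but nothing on the server side enforces them. The paths and user-agent names below are illustrative:

```python
import urllib.robotparser

# Hypothetical robots.txt content -- the /private/ path is illustrative.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler asks first and stays out of disallowed paths:
print(rp.can_fetch("GoodBot", "https://example.com/private/report.html"))
# A rogue crawler simply never performs this check; the server will
# still serve /private/report.html to any client that requests it.
```

The check happens entirely on the client side, which is exactly why robots.txt cannot substitute for authentication or server-side access control.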
We love paid social advertising because it's a highly cost-effective way to expand your reach. If you play your cards right, you can get your content and offers in front of a huge audience at a very low cost. Most social media platforms offer incredibly granular targeting capabilities, allowing you to focus your budget on exactly the types of people that are most likely to be interested in your business. Below are some tips and resources for getting started with paid social media marketing:
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated other attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
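A keyword meta tag of this era looked like the fragment below (the keyword list is illustrative); because nothing required the declared keywords to match the visible page content, engines eventually stopped trusting it:

```html
<!-- Typical 1990s-style keyword meta tag; the content is illustrative.
     The keywords were self-declared by the webmaster and could bear no
     relation to what the page actually contained. -->
<meta name="keywords" content="cheap flights, travel deals, hotels, vacations">
```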