Instagram has proven itself a powerful platform for marketers to reach their customers and prospects through sharing pictures and brief messages. According to a study by Simply Measured, 71% of the world's largest brands are now using Instagram as a marketing channel.[58] For companies, Instagram can be used as a tool to connect and communicate with current and potential customers. The company can present a more personal picture of its brand, and in doing so convey a better and truer picture of itself. The idea behind Instagram pictures is that they are taken on the go, giving a sense that the event is happening right now, and that adds another layer to the personal and accurate picture of the company. In fact, Thomas Rankin, co-founder and CEO of the program Dash Hudson, stated that when he approves a blogger's Instagram post before it is posted on behalf of a brand his company represents, his only negative feedback is if it looks too posed. "It's not an editorial photo," he explained, "We're not trying to be a magazine. We're trying to create a moment."[57] Instagram also gives companies the opportunity to reflect a true picture of the brand from the perspective of the customers, for instance by encouraging user-generated content through hashtags.[59] Beyond the filter and hashtag functions, Instagram's 15-second videos and the recently added ability to send private messages between users have opened new opportunities for brands to connect with customers to a new extent, further promoting effective marketing on Instagram.
Facebook had an estimated 144.27 million views in 2016, roughly 12 million per month.[109] Despite this high volume of traffic, very little has been done to protect the millions of users who log on to Facebook and other social media platforms each month. President Barack Obama tried to work with the Federal Trade Commission (FTC) to attempt to regulate data mining. He proposed the Privacy Bill of Rights, which would protect the average user from having their private information downloaded and shared with third-party companies. The proposed laws would give the consumer more control over what information companies can collect.[107] President Obama was unable to pass most of these laws through Congress, and it is unclear what President Trump will do with regard to social media marketing ethics.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
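The advisory nature of robots.txt can be demonstrated with Python's standard-library parser: a well-behaved crawler consults the rules before fetching, but nothing in the protocol stops a client that simply ignores them. The domain and paths below are purely illustrative:

```python
from urllib.robotparser import RobotFileParser

# Parse an illustrative robots.txt that tries to hide a "private" directory.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A compliant crawler checks the rules and skips the blocked path...
print(rp.can_fetch("*", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))           # True

# ...but robots.txt imposes no server-side restriction: a rogue crawler or an
# ordinary browser can request the "blocked" URL directly, and the server will
# serve it unless real access control (authentication, ACLs) is in place.
```

Note that the `Disallow` line itself advertises the path it is trying to hide, which is exactly the "curious user" problem described above.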

Another excellent guide is Google’s “Search Engine Optimization Starter Guide,” a free PDF download that covers basic tips that Google provides to its own employees on how to get listed. Also well worth checking out are Moz’s “Beginner’s Guide To SEO” and the SEO Success Pyramid from Small Business Search Marketing.


By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
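To see why term density was so easy to game, note that the metric is trivial to compute, and just as trivial to inflate by repeating a keyword. The sketch below uses a generic textbook definition (occurrences of the keyword divided by total word count), not any search engine's actual formula, and the sample sentences are invented:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A natural sentence versus a keyword-stuffed one (both nine words long):
natural = "Our shop sells handmade leather boots and repairs shoes"
stuffed = "boots boots cheap boots buy boots best boots boots"

print(round(keyword_density(natural, "boots"), 2))  # 0.11
print(round(keyword_density(stuffed, "boots"), 2))  # 0.67
```

A ranking function leaning heavily on this number would score the stuffed page six times higher, which is precisely the manipulation that pushed engines toward signals outside the webmaster's direct control.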