Instagram has proven itself a powerful platform for marketers to reach their customers and prospects through sharing pictures and brief messages. According to a study by Simply Measured, 71% of the world's largest brands now use Instagram as a marketing channel.[58] For companies, Instagram can be used as a tool to connect and communicate with current and potential customers. The company can present a more personal picture of its brand, and in doing so convey a better and truer picture of itself. Instagram pictures are built around a sense of on-the-go immediacy, a feeling that the event is happening right now, and that adds another layer to the personal and accurate picture of the company. In fact, Thomas Rankin, co-founder and CEO of the program Dash Hudson, stated that when he approves a blogger's Instagram post before it is posted on behalf of a brand his company represents, his only negative feedback is if it looks too posed. "It's not an editorial photo," he explained. "We're not trying to be a magazine. We're trying to create a moment."[57] Instagram also gives companies the opportunity to reflect a true picture of the brand from the perspective of customers, for instance by using the user-generated content encouraged through hashtags.[59] Beyond the filter and hashtag functions, Instagram's 15-second videos and the recently added ability to send private messages between users have opened new opportunities for brands to connect with customers on a new level, further promoting effective marketing on Instagram.

Since social media marketing first came to be, strategists and marketers have been getting smarter and more careful with the way they go about collecting information and distributing advertisements. With the presence of data-collecting companies, there is no longer a need for marketers to research their target audiences themselves. This can be seen as a large ethical gray area. For many users, this is a breach of privacy, but there are no laws that prevent these companies from using the information provided on their websites. Companies like Equifax, Inc., TransUnion Corp, and LexisNexis Group thrive on collecting and sharing the personal information of social media users.[107] In 2012, Facebook purchased information on 70 million households from a third-party company called Datalogix. Facebook later revealed that it purchased the information in order to create a more efficient advertising service.[108]
Search engines use complex mathematical algorithms to interpret which websites a user seeks. In this diagram, where each bubble represents a website, programs sometimes called spiders examine which sites link to which other sites, with arrows representing these links. Websites that receive more inbound links, or stronger links, are presumed to be more important and more likely to be what the user is searching for. In this example, since website B is the recipient of numerous inbound links, it ranks more highly in a web search. The links also "carry through": website C, even though it has only one inbound link, benefits because that link comes from a highly popular site (B), while site E has no such link. Note: Percentages are rounded.
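The "inbound links carry weight" idea described above can be sketched with a minimal PageRank-style computation. The graph below (sites A–E and their links) is hypothetical, chosen to mirror the diagram's story rather than taken from it:

```python
# Hypothetical link graph: three sites link to B, and B links to C.
links = {          # site -> sites it links out to
    "A": ["B"],
    "C": ["B"],
    "D": ["B"],
    "B": ["C"],
    "E": [],       # E has no outbound links (a "dangling" page)
}

def pagerank(links, damping=0.85, iterations=50):
    """Iteratively redistribute each site's rank along its outbound links."""
    sites = list(links)
    n = len(sites)
    rank = {s: 1.0 / n for s in sites}
    for _ in range(iterations):
        # Every site gets a small "teleport" share, plus link shares.
        new = {s: (1 - damping) / n for s in sites}
        for s, outs in links.items():
            if outs:
                share = damping * rank[s] / len(outs)
                for t in outs:
                    new[t] += share
            else:  # dangling page: spread its rank evenly over all sites
                for t in sites:
                    new[t] += damping * rank[s] / n
        rank = new
    return rank

ranks = pagerank(links)
# B, with three inbound links, ends up ranked highest; C outranks E
# because C's single inbound link comes from the popular site B.
print(sorted(ranks, key=ranks.get, reverse=True))
```

The damping factor (0.85 here, a conventional choice) models a user who occasionally jumps to a random page instead of following links, which keeps rank from pooling in closed loops.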
Think about the words that a user might search for to find a piece of your content. Users who know a lot about the topic might use different keywords in their search queries than someone who is new to the topic. For example, a long-time football fan might search for [fifa], an acronym for the Fédération Internationale de Football Association, while a new fan might use a more general query like [football playoffs]. Anticipating these differences in search behavior and accounting for them while writing your content (using a good mix of keyword phrases) could produce positive results. Google Ads provides a handy Keyword Planner[34] that helps you discover new keyword variations and see the approximate search volume for each keyword. Also, Google Search Console provides you with the top search queries your site appears for and the ones that led the most users to your site in the Performance Report.[35]
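The "good mix of keyword phrases" advice can be illustrated with a toy matcher: a page whose content mixes expert and novice phrasing is found by both kinds of query. The page names, keyword sets, and queries below are invented for illustration and involve no real search API:

```python
# Invented pages and keyword sets; naive substring matching stands in
# for a real search engine's query interpretation.
pages = {
    "worldcup-guide": {"fifa", "football playoffs", "world cup schedule"},
    "local-news": {"city council", "election results"},
}

def pages_for_query(query, pages):
    """Return pages whose keyword mix covers the query."""
    q = query.lower().strip()
    return [page for page, keywords in pages.items()
            if any(q in kw or kw in q for kw in keywords)]

print(pages_for_query("fifa", pages))               # expert query
print(pages_for_query("football playoffs", pages))  # novice query
```

Both queries surface the same guide only because its keyword set deliberately spans both registers; a page written solely in expert jargon would miss the novice query.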
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website and also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and track the web pages' index status.
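The sitemaps submitted through these tools follow the Sitemaps XML protocol. A minimal sketch of generating one with the Python standard library; the URLs are placeholders:

```python
# Build a minimal sitemap per the Sitemaps protocol
# (namespace http://www.sitemaps.org/schemas/sitemap/0.9).
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a sitemap XML string listing the given page URLs."""
    ET.register_namespace("", NS)  # serialize with a default namespace
    urlset = ET.Element(f"{{{NS}}}urlset")
    for loc in urls:
        url = ET.SubElement(urlset, f"{{{NS}}}url")
        ET.SubElement(url, f"{{{NS}}}loc").text = loc
    return ET.tostring(urlset, encoding="unicode", xml_declaration=True)

# Placeholder URLs; a real sitemap lists the site's actual pages.
sitemap_xml = build_sitemap(["https://example.com/", "https://example.com/about"])
print(sitemap_xml)
```

The resulting file would be uploaded to the site and its location submitted via the webmaster tools mentioned above.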
Leaks on the Internet and social networking sites are one of the issues facing traditional advertising. Video and print ads are often leaked to the world via the Internet earlier than they are scheduled to premiere. Social networking sites allow those leaks to go viral and be seen by many users more quickly. The time difference is also a problem facing traditional advertisers. When social events occur and are broadcast on television, there is often a time delay between airings on the east coast and west coast of the United States. Social networking sites have become a hub of comment and interaction concerning the event. This allows individuals watching the event on the west coast (time-delayed) to know the outcome before it airs. The 2011 Grammy Awards highlighted this problem. Viewers on the west coast learned who won different awards based on comments made on social networking sites by individuals watching live on the east coast.[92] Since viewers already knew who won, many tuned out and ratings were lower. All the advertising and promotion invested in the event was wasted because viewers had no reason to watch.[according to whom?]
Sponsored radar – Radar picks up exceptional posts from the whole Tumblr community based on their originality and creativity. It is placed on the right side next to the Dashboard, and it typically earns 120 million daily impressions. Sponsored radar allows advertisers to place their posts there to have an opportunity to earn new followers, reblogs, and likes.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
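The spider-and-indexer split described above can be sketched with the standard library's HTML parser. In this toy version the page is an inline string; a real spider would download pages (e.g. with urllib) and queue the extracted links for later crawling:

```python
# Toy spider/indexer: parse a page, record each word's positions
# (the indexer's job) and collect outgoing links (the spider's job).
from html.parser import HTMLParser

class PageParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links, self.words = [], []

    def handle_starttag(self, tag, attrs):
        if tag == "a":  # collect hrefs for the crawl scheduler
            self.links += [v for k, v in attrs if k == "href"]

    def handle_data(self, data):
        self.words += data.lower().split()

def index_page(html):
    """Return (word -> positions, outgoing links) for one page."""
    parser = PageParser()
    parser.feed(html)
    positions = {}
    for i, word in enumerate(parser.words):
        positions.setdefault(word, []).append(i)
    return positions, parser.links

# Inline sample page standing in for a downloaded document.
html = '<html><body><h1>Hello crawler</h1><a href="/next">next page</a></body></html>'
index, links = index_page(html)
print(index)
print(links)
```

The word-position map is the kind of per-page record an indexer stores, and the link list is what would be handed to the scheduler for crawling at a later date.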