By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density toward a more holistic process for scoring semantic signals. Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.
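Keyword density itself is just a simple ratio, which is part of why it was so easy to manipulate. A minimal sketch of how it might be computed, assuming plain text and a naive word tokenizer (both assumptions for illustration, not any particular engine's method):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# A keyword-stuffed page of the kind early engines rewarded:
page = "cheap flights cheap hotels cheap deals on cheap travel"
print(round(keyword_density(page, "cheap"), 2))  # 0.44
```

A score like this depends entirely on text the webmaster controls, which is exactly the weakness the more holistic ranking signals were meant to address.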
Social networking websites allow individuals, businesses and other organizations to interact with one another and build relationships and communities online. When companies join these social channels, consumers can interact with them directly. That interaction can be more personal to users than traditional methods of outbound marketing and advertising. Social networking sites act as word of mouth, or more precisely, e-word of mouth. The Internet's ability to reach billions across the globe has given online word of mouth a powerful voice and far reach. The ability of a message to rapidly change the buying patterns and product or service acquisitions of a growing number of consumers is what defines an influence network. Social networking sites and blogs allow followers to "retweet" or "repost" comments made by others about a product being promoted, which occurs quite frequently on some social media sites. By repeating the message, all of the user's connections are able to see it, therefore reaching more people. Because the information about the product is being put out there and repeated, more traffic is brought to the product or company.
Inclusion in Google's search results is free and easy; sites do not even need to be submitted to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to its index. In fact, the vast majority of sites listed in its results are not manually submitted for inclusion, but are found and added automatically when Google crawls the web.
In 2012 during Hurricane Sandy, Gap sent out a tweet to its followers telling them to stay safe, but encouraging them to shop online and offering free shipping. The tweet was deemed insensitive, and Gap eventually took it down and apologized. Numerous additional online marketing mishaps exist. Examples include a YouTube video of a Domino's Pizza employee violating health code standards, which went viral on the Internet and later resulted in felony charges against two employees; a Twitter hashtag posted by McDonald's in 2012 that attracted attention due to numerous complaints and negative events customers had experienced at the chain; and a 2011 tweet posted by a Chrysler Group employee claiming that no one in Detroit knows how to drive. When the Link REIT opened a Facebook page to recommend old-style restaurants, the page was flooded by furious comments criticizing the REIT for having forced a lot of restaurants and stores to shut down; it had to terminate its campaign early amid further deterioration of its corporate image.
Measuring Success with Analytics — You can’t determine the success of your social media marketing strategies without tracking data. Google Analytics is a great social media marketing tool that will help you measure your most successful social media marketing techniques, as well as determine which strategies are better off abandoned. Attach tracking tags to your social media marketing campaigns so that you can properly monitor them. And be sure to use the analytics within each social platform for even more insight into which of your social content is performing best with your audience.
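The "tracking tags" mentioned above are typically UTM query parameters appended to campaign links so Google Analytics can attribute traffic to a source, medium, and campaign. A minimal sketch of building such a link (the URL and campaign names are hypothetical placeholders):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit, parse_qsl

def add_utm_tags(url: str, source: str, medium: str, campaign: str) -> str:
    """Append Google Analytics UTM parameters, preserving any existing query string."""
    parts = urlsplit(url)
    query = parse_qsl(parts.query)  # keep parameters already on the URL
    query += [("utm_source", source), ("utm_medium", medium), ("utm_campaign", campaign)]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(query), parts.fragment))

print(add_utm_tags("https://example.com/sale", "twitter", "social", "spring_launch"))
# https://example.com/sale?utm_source=twitter&utm_medium=social&utm_campaign=spring_launch
```

Using a distinct source/medium/campaign combination per post is what lets the analytics reports separate, say, Twitter-driven visits from Facebook-driven ones.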
Website saturation and popularity, or how much presence a website has on search engines, can be analyzed through the number of pages of the site that are indexed by search engines (saturation) and how many backlinks the site has (popularity). This requires pages to contain the keywords people are searching for and to rank high enough in search engine results. Most search engines include some form of link popularity in their ranking algorithms. The following are major tools measuring various aspects of saturation and link popularity: Link Popularity, Top 10 Google Analysis, and Marketleap's Link Popularity and Search Engine Saturation.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up more quickly on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..." Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.
Twitter allows companies to promote their products in short messages known as tweets, limited to 140 characters, which appear on followers' Home timelines. Tweets can contain text, hashtags, photos, videos, animated GIFs, emoji, or links to the product's website and other social media profiles. Twitter is also used by companies to provide customer service. Some companies make support available 24/7 and answer promptly, thus improving brand loyalty and appreciation.
Facebook and LinkedIn are leading social media platforms where users can hyper-target their ads. Hyper-targeting not only uses public profile information but also information that users submit but hide from others. There are several examples of firms initiating some form of online dialog with the public to foster relations with customers. According to Constantinides, Lorenzo and Gómez Borja (2008), "Business executives like Jonathan Swartz, President and CEO of Sun Microsystems, Steve Jobs CEO of Apple Computers, and McDonalds Vice President Bob Langert post regularly in their CEO blogs, encouraging customers to interact and freely express their feelings, ideas, suggestions, or remarks about their postings, the company or its products". Using customer influencers (for example, popular bloggers) can be a very efficient and cost-effective method to launch new products or services. Among the political leaders in office, Prime Minister Narendra Modi has the highest number of followers at 40 million, and President Donald Trump ranks second with 25 million followers. Modi employed social media platforms to circumvent traditional media channels to reach out to the young and urban population of India, which is estimated to be 200 million.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
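The spider/indexer/scheduler pipeline described above can be sketched in miniature. This toy version crawls an in-memory "web" (a dict of page text, an assumption standing in for real HTTP fetches, with `[[link]]` markup standing in for hyperlinks) and builds an inverted index of word positions while queuing discovered links:

```python
import re
from collections import deque

# Toy "web": URL -> page text; [[target]] marks a hyperlink (an illustrative stand-in).
PAGES = {
    "a.html": "search engines index pages see [[b.html]]",
    "b.html": "spiders crawl pages and extract links",
}

def crawl(start: str) -> dict:
    index = {}                  # word -> list of (url, position) postings
    scheduler = deque([start])  # URLs queued for crawling at a later date
    seen = set()
    while scheduler:
        url = scheduler.popleft()
        if url in seen or url not in PAGES:
            continue
        seen.add(url)
        text = PAGES[url]                                  # the "spider" downloads the page
        links = re.findall(r"\[\[(.+?)\]\]", text)         # extract links to other pages
        body = re.sub(r"\[\[.+?\]\]", " ", text).lower()
        for pos, word in enumerate(re.findall(r"[a-z]+", body)):
            index.setdefault(word, []).append((url, pos))  # the "indexer" records word locations
        scheduler.extend(links)                            # discovered links go to the scheduler
    return index

idx = crawl("a.html")
print(idx["pages"])  # [('a.html', 3), ('b.html', 2)]
```

A real engine would also weight terms (for example by position or markup) and fetch over the network, but the division of labor among spider, indexer, and scheduler is the same as described above.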