Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "search engine marketing" was popularized by Danny Sullivan in 2001[12] to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals.
Popular social media platforms such as Facebook, Twitter, LinkedIn, and other social networks can provide marketers with a hard number for how large their audience is; nevertheless, a large audience does not always translate into large sales volumes. Therefore, effective SMM is measured not by the size of the audience but by vigorous audience activity such as social shares, re-tweets, etc.
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to its index. In fact, the vast majority of sites listed in its results aren't manually submitted for inclusion, but are found and added automatically as Google crawls the web. Learn how Google discovers, crawls, and serves web pages.[3]
YouTube is the number one place for creating and sharing video content, and it can also be an incredibly powerful social media marketing tool. Many businesses try to create video content with the aim of having their video “go viral,” but in reality those chances are pretty slim. Instead, focus on creating useful, instructive “how-to” videos. These how-to videos also have the added benefit of ranking in Google's video search results, so don't underestimate the power of video content!
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
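To make the crawl-and-index loop described above concrete, here is a minimal, single-threaded sketch in Python. All names, the in-memory index, and the simple queue standing in for the scheduler are illustrative assumptions; real search engines distribute this work, respect robots.txt, throttle requests, and persist both the index and the crawl schedule.

```python
# Simplified illustration of the spider/indexer/scheduler loop described above.
# A hypothetical sketch, not how any real search engine is implemented.

from collections import defaultdict, deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class PageParser(HTMLParser):
    """Collects visible words and outgoing links from one HTML page."""

    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.words = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(urljoin(self.base_url, value))

    def handle_data(self, data):
        self.words.extend(w.lower() for w in data.split() if w.isalpha())


def crawl(seed_urls, max_pages=10):
    """Spider + indexer: download pages, index their words, queue their links."""
    frontier = deque(seed_urls)      # stand-in for the scheduler: URLs to crawl later
    seen = set(seed_urls)
    index = defaultdict(dict)        # word -> {url: positions of that word on the page}
    crawled = 0

    while frontier and crawled < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue                 # skip pages that fail to download
        crawled += 1

        parser = PageParser(url)
        parser.feed(html)

        # Indexer step: record each word and where on the page it occurs.
        for position, word in enumerate(parser.words):
            index[word].setdefault(url, []).append(position)

        # Extracted links go back into the frontier for a later crawl.
        for link in parser.links:
            if link not in seen:
                seen.add(link)
                frontier.append(link)

    return index


if __name__ == "__main__":
    idx = crawl(["https://example.com/"], max_pages=3)
    print(sorted(idx)[:20])          # a few of the indexed words
```

In this sketch the frontier queue plays the role of the scheduler, and word positions stand in for the per-word weighting a real indexer would compute.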