Some search engines have also reached out to the SEO industry and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn whether Google is having any problems indexing their website; it also provides data on Google traffic to the site.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, lets users determine the "crawl rate", and tracks the index status of web pages.
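Both tools accept sitemaps in the standard sitemaps.org XML format. A minimal sketch, with a placeholder URL and date:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

Only the loc element is required; lastmod, changefreq, and priority are optional hints that crawlers may use or ignore.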
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results for any given search, poor-quality or irrelevant results could drive users to other search sources. Search engines responded by developing more complex ranking algorithms that took into account additional factors that were more difficult for webmasters to manipulate. In 2005, AIRWeb (Adversarial Information Retrieval on the Web), an annual conference, was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
When I started out, I pitched SEO packages to local business owners I met through networking. That's a good way to start building results-oriented case studies that show the ROI (return on investment) your efforts have generated. Once you have those and can prove you consistently get results, you'll be practically indispensable: nearly every online business succeeds or fails based on the quality of its digital marketing, and people who are really good at it are rare.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random surfer.
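PageRank is usually written as the following recurrence (this is the common probability-normalized form; the damping factor d, conventionally around 0.85, models the chance that the surfer keeps clicking rather than jumping to a random page):

```latex
PR(p_i) = \frac{1 - d}{N} + d \sum_{p_j \in M(p_i)} \frac{PR(p_j)}{L(p_j)}
```

Here N is the total number of pages, M(p_i) is the set of pages linking to p_i, and L(p_j) is the number of outbound links on page p_j. A link from a high-PageRank page with few outbound links therefore passes more weight than one from a low-PageRank page with many.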
On April 24, 2012, many began to notice that Google had started penalizing companies that buy links for the purpose of passing PageRank. This Google update was called Penguin, and since then Google has rolled out several further Penguin and Panda updates. SEM, however, has nothing to do with link buying; it focuses on organic SEO and PPC management. As of October 20, 2014, Google had released three official revisions of its Penguin update.

Facebook Ads and other social media ad platforms, for example, are pay-per-click platforms that do not fall under the SEM category. Instead of showing your ads to people who are actively searching for related content, as search ads do, social media sites introduce your product to people who happen to be browsing through their feeds. These are two very, very different types of online advertising.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
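For illustration, a minimal robots.txt (the path is a placeholder): the Disallow rule only asks compliant crawlers to stay away; it does not protect anything.

```
User-agent: *
Disallow: /private-reports/

# Compliant crawlers will skip /private-reports/, but the server
# still serves those pages to anyone who requests them directly,
# and this file itself reveals the path to curious readers.
```

For genuinely sensitive content, use server-side access controls such as authentication instead.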
Since heading tags typically make the text they contain larger than normal text on the page, they are a visual cue to users that this text is important and can help them understand something about the type of content under the heading. Multiple heading sizes, used in order, create a hierarchical structure for your content, making it easier for users to navigate through your document.
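For example, a hypothetical article page might nest its headings like this (the indentation is only for readability; it has no effect in HTML):

```html
<h1>Coffee Brewing Guide</h1>        <!-- the page's single main topic -->
  <h2>Equipment</h2>                 <!-- major section -->
    <h3>Choosing a Grinder</h3>      <!-- detail within Equipment -->
  <h2>Brewing Methods</h2>           <!-- next major section -->
```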
In addition to giving you insight into the search volume and competition level of keywords, most keyword research tools will also give you detailed information about the average or currently estimated CPC for particular keywords. This is particularly important for businesses with smaller ad budgets, as it allows you to predict whether certain keywords will be truly beneficial to your ad campaigns or whether they'll cost too much.
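To make that concrete, here is a quick back-of-the-envelope check with hypothetical numbers (a $300 monthly budget, a $1.50 average CPC, and a 2% conversion rate):

```latex
\text{clicks} \approx \frac{\text{budget}}{\text{CPC}} = \frac{\$300}{\$1.50} = 200,
\qquad
\text{cost per conversion} = \frac{\text{CPC}}{\text{conversion rate}} = \frac{\$1.50}{0.02} = \$75
```

If a conversion is worth less than $75 to your business, that keyword is too expensive at that CPC, no matter how attractive its search volume looks.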
LinkedIn, a professional business-oriented networking site, allows companies to create professional profiles for themselves and their businesses in order to network and meet others.[41] Through the use of widgets, members can promote their various social networking activities, such as a Twitter stream or blog entries about their products, on their LinkedIn profile page.[42] LinkedIn provides its members the opportunity to generate sales leads and business partners.[43] Members can use "Company Pages", similar to Facebook pages, to create an area that allows business owners to promote their products or services and interact with their customers.[44] Due to the spread of spam email sent to job seekers, leading companies prefer to use LinkedIn for recruitment rather than a job portal. Additionally, companies have voiced a preference for the amount of information that can be gleaned from a LinkedIn profile versus a limited email.[45]
Organic search (SEO): When you enter a keyword or phrase into a search engine like Google or Yahoo!, the organic results are displayed in the main body of the page. When your prospects search for information about your products and services, you want to rank highly in the search results. By "optimizing" your site, you can improve your ranking for important search terms and phrases ("keywords"). You can also improve your rank by getting other important sites to link to yours.
Facebook and LinkedIn are leading social media platforms where users can hyper-target their ads. Hypertargeting uses not only public profile information but also information users submit and hide from others.[17] There are several examples of firms initiating some form of online dialog with the public to foster relations with customers. According to Constantinides, Lorenzo and Gómez Borja (2008), "Business executives like Jonathan Swartz, President and CEO of Sun Microsystems, Steve Jobs CEO of Apple Computers, and McDonalds Vice President Bob Langert post regularly in their CEO blogs, encouraging customers to interact and freely express their feelings, ideas, suggestions, or remarks about their postings, the company or its products".[15] Using customer influencers (for example, popular bloggers) can be a very efficient and cost-effective method of launching new products or services.[18] Among political leaders in office, Prime Minister Narendra Modi has the highest number of followers, at 40 million, and President Donald Trump ranks second, with 25 million.[19] Modi employed social media platforms to circumvent traditional media channels and reach the young and urban population of India, estimated to be 200 million.