Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters, through tools that work with log files, to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. EBSCO uses three major tools: (a) a log-file analysis tool, WebTrends by NetIQ; (b) a tag-based analytics tool, WebSideStory's HitBox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
Great Social Content — Consistent with other areas of online marketing, content reigns supreme when it comes to social media marketing. Make sure you post regularly and offer truly valuable information that your ideal customers will find helpful and interesting. The content that you share on your social networks can include social media images, videos, infographics, how-to guides and more.

Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
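Structured data is most commonly embedded as JSON-LD in a page's HTML. Here is a minimal sketch in Python that builds a schema.org Product description and wraps it in the script tag a page would carry; the product name, description, and price are hypothetical placeholders.

```python
import json

# Hypothetical schema.org "Product" structured data for an example page.
structured_data = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Example Widget",
    "description": "A sample product used to illustrate structured data.",
    "offers": {
        "@type": "Offer",
        "priceCurrency": "USD",
        "price": "19.99",
    },
}

# Embed the JSON-LD in a <script> tag, typically placed in the page's <head>.
snippet = '<script type="application/ld+json">{}</script>'.format(
    json.dumps(structured_data)
)
print(snippet)
```

Search engines read the `application/ld+json` block without it affecting what human visitors see on the page.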
Social networks are, in many cases, viewed as a great tool for avoiding costly market research. They are known for providing a short, fast, and direct way to reach an audience through a person who is widely known. For example, an athlete endorsed by a sporting goods company brings along a support base of millions of people who are interested in what they do or how they play and who now want to be part of the athlete's story through that endorsement. At one point consumers would visit stores to view products associated with famous athletes, but now you can view the latest apparel of a famous athlete, such as Cristiano Ronaldo, online with the click of a button. He advertises it to you directly through his Twitter, Instagram, and Facebook accounts.
Social media can be used not only as public relations and direct marketing tools but also as communication channels targeting very specific audiences with social media influencers and social media personalities and as effective customer engagement tools.[15] Technologies predating social media, such as broadcast TV and newspapers, can also provide advertisers with a fairly targeted audience, given that an ad placed during a sports game broadcast or in the sports section of a newspaper is likely to be read by sports fans. However, social media websites can target niche markets even more precisely. Using digital tools such as Google AdSense, advertisers can target their ads to very specific demographics, such as people who are interested in social entrepreneurship, political activism associated with a particular political party, or video gaming. Google AdSense does this by looking for keywords in social media users' online posts and comments. It would be hard for a TV station or paper-based newspaper to provide ads that are this targeted (though not impossible, as can be seen with "special issue" sections on niche issues, which newspapers can use to sell targeted ads).
All sites have a home or "root" page, which is usually the most frequented page on the site and the starting place of navigation for many visitors. Unless your site has only a handful of pages, you should think about how visitors will go from a general page (your root page) to a page containing more specific content. Do you have enough pages around a specific topic area that it would make sense to create a page describing these related pages (for example, root page -> related topic listing -> specific topic)? Do you have hundreds of different products that need to be classified under multiple category and subcategory pages?
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review.[40] Google offers Google Search Console, through which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links,[41] in addition to its URL submission console.[42] Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click;[43] however, this practice was discontinued in 2009.
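An XML Sitemap is a plain list of a site's URLs in the sitemaps.org format. The sketch below generates one in Python; the example URLs are hypothetical, and a real feed would be saved as `sitemap.xml` and submitted through Search Console.

```python
import xml.etree.ElementTree as ET

# Namespace defined by the sitemaps.org protocol.
SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Build a minimal <urlset> document listing each page URL."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for page in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://www.example.com/",
    "https://www.example.com/products/widgets",
])
print(sitemap)
```

Optional per-URL elements such as `<lastmod>` can be added the same way with `ET.SubElement`.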
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, which helps them achieve good crawl coverage of the pages on your site, it is mainly aimed at human visitors.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]

In the social sphere, things change fast. New networks emerge, while others go through significant demographic shifts. Your business will go through periods of change as well. All of this means that your social media strategy should be a living document that you look at regularly and adjust as needed. Refer to it often to keep you on track, but don’t be afraid to make changes so that it better reflects new goals, tools, or plans.
An SEO technique is considered white hat if it conforms to the search engines' guidelines and involves no deception. As the search engine guidelines[18][19][52] are not written as a series of rules or commandments, this is an important distinction to note. White hat SEO is not just about following guidelines but is about ensuring that the content a search engine indexes and subsequently ranks is the same content a user will see. White hat advice is generally summed up as creating content for users, not for search engines, and then making that content easily accessible to the online "spider" algorithms, rather than attempting to divert the algorithm from its intended purpose. White hat SEO is in many ways similar to web development that promotes accessibility,[53] although the two are not identical.
Facebook had an estimated 144.27 million views in 2016, approximately 12.9 million per month.[109] Despite this high volume of traffic, very little has been done to protect the millions of users who log on to Facebook and other social media platforms each month. President Barack Obama tried to work with the Federal Trade Commission (FTC) to attempt to regulate data mining. He proposed the Privacy Bill of Rights, which would protect the average user from having their private information downloaded and shared with third-party companies. The proposed laws would give the consumer more control over what information companies can collect.[107] President Obama was unable to pass most of these laws through Congress, and it is unclear what President Trump will do with regard to social media marketing ethics.
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
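In HTML, this means adding `rel="nofollow"` to the anchor tags that wrap user-submitted links. A minimal sketch, assuming a hypothetical helper that a blog's comment renderer might call:

```python
from html import escape

def render_comment_link(href, text):
    """Render a user-submitted link with rel="nofollow" so the page's
    reputation is not passed on to the linked site."""
    return '<a href="{}" rel="nofollow">{}</a>'.format(
        escape(href, quote=True), escape(text)
    )

print(render_comment_link("https://example.com/some-page", "check this out"))
```

Escaping the href and link text also guards against comment authors injecting markup into the page.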
SEO techniques can be classified into two broad categories: techniques that search engine companies recommend as part of good design ("white hat"), and those techniques of which search engines do not approve ("black hat"). The search engines attempt to minimize the effect of the latter, among them spamdexing. Industry commentators have classified these methods, and the practitioners who employ them, as either white hat SEO, or black hat SEO.[50] White hats tend to produce results that last a long time, whereas black hats anticipate that their sites may eventually be banned either temporarily or permanently once the search engines discover what they are doing.[51]

With the development of this system, prices have risen under the high level of competition. Many advertisers prefer to expand their activities, advertising on more search engines and adding more keywords. The more advertisers are willing to pay for clicks, the higher the ad's ranking, which leads to higher traffic.[15] PPC comes at a cost: the top position may cost $5 per click for a given keyword, and the third position $4.50. The third advertiser pays 10% less than the top advertiser but may receive 50% less traffic.[15] Advertisers must consider their return on investment and determine whether the increase in traffic is worth the increase in cost.
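The trade-off above can be sketched with a quick calculation using the text's figures ($5.00 per click for the top position, $4.50 for the third, and half the traffic in the third position); the click volume and revenue-per-click figure are hypothetical.

```python
# Cost-per-click figures from the text; click volume is a hypothetical example.
top_cpc, third_cpc = 5.00, 4.50
top_clicks = 1000                 # assumed monthly clicks in the top position
third_clicks = top_clicks * 0.5   # third position receives ~50% less traffic

top_cost = top_cpc * top_clicks       # total spend in the top position
third_cost = third_cpc * third_clicks # total spend in the third position

# Whether the extra traffic is worth the extra spend depends on what a
# click is worth; e.g. at an assumed $6.00 of revenue per click:
revenue_per_click = 6.00
top_profit = (revenue_per_click - top_cpc) * top_clicks
third_profit = (revenue_per_click - third_cpc) * third_clicks
print(top_cost, third_cost, top_profit, third_profit)
```

Under these assumptions the top position costs more than twice as much but also yields more total profit, which is exactly the return-on-investment comparison the text describes.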

Google's search engine marketing is one of the Western world's marketing leaders, and search advertising is Google's biggest source of profit.[17] Google's search network is clearly ahead of the Yahoo! and Bing networks. The display of organic (unpaid) search results is free, while advertisers are willing to pay for each click of an ad in the sponsored search results.
Let’s say, for example, that you run a construction business that helps with home repairs after natural disasters and you want to advertise that service. The official term for the service is “fire restoration,” but keyword research may indicate that customers in your area search instead for “fire repair” or “repair fire damage to house.” By not optimizing for these two keywords, you’ll lose out on a lot of traffic and potential customers, even if “fire restoration” is technically more correct.
Mix up your official tweets about specials, discounts, and news with fun, brand-building tweets. Be sure to retweet when a customer has something nice to say about you, and don’t forget to answer people’s questions when possible. Using Twitter as a social media marketing tool revolves around dialog and communication, so be sure to interact as much as possible to nurture and build your following.

YouTube is the number one place for creating and sharing video content, and it can also be an incredibly powerful social media marketing tool. Many businesses try to create video content with the aim of having their video “go viral,” but in reality those chances are pretty slim. Instead, focus on creating useful, instructive “how-to” videos. These how-to videos also have the added benefit of ranking in the video search results of Google, so don't underestimate the power of video content!


Many customers turn to social media to express their appreciation of, or frustration with, brands, products, or services. Marketers can therefore measure the frequency with which customers discuss their brand and judge how effective their SMM strategies are. In recent studies, 72% of people surveyed said that they expected a response to their complaints on Twitter within an hour.[111]
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites have copied content from one another and benefited in search engine rankings by engaging in this practice. However, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than a few words.[39] With regard to the changes made to search engine optimization, for content publishers and writers, Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.