The world is mobile today. Most people searching on Google are using a mobile device, and the desktop version of a site can be difficult to view and use on one. As a result, a mobile-ready site is critical to your online presence. In fact, in late 2016 Google began experiments to primarily use the mobile version of a site's content[42] for ranking, parsing structured data, and generating snippets.

When I started out, I pitched SEO packages to local business owners I met through networking. That is a good way to start building results-oriented case studies that show the ROI (return on investment) your efforts have generated. Once you have those case studies and can prove you consistently get results, you become indispensable: nearly every online business succeeds or fails on the quality of its digital marketing, and people who are really good at it are rare.

Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow a site's success to be measured. They range from simple traffic counters to tools that work with log files to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. EBSCO uses three major tools: (a) a log-file analyzer, NetIQ's WebTrends; (b) a tag-based analytics tool, WebSideStory's HitBox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring sites meet W3C code standards. Use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
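The log-file approach mentioned above can be sketched in a few lines. The snippet below parses entries in the common Apache log format and tallies successful page requests per path; it is a minimal illustration of the idea, not how WebTrends itself works, and the sample log lines are invented.

```python
import re
from collections import Counter

# Minimal Common Log Format parser: host, ident, user, [time], "request", status, bytes
LOG_PATTERN = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+'
)

def page_views(log_lines):
    """Count successful (2xx) GET requests per path."""
    counts = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and m.group("method") == "GET" and m.group("status").startswith("2"):
            counts[m.group("path")] += 1
    return counts

# Invented sample entries for demonstration.
sample = [
    '1.2.3.4 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '1.2.3.4 - - [10/Oct/2023:13:55:40 +0000] "GET /about.html HTTP/1.1" 200 512',
    '5.6.7.8 - - [10/Oct/2023:13:56:01 +0000] "GET /index.html HTTP/1.1" 200 2326',
    '5.6.7.8 - - [10/Oct/2023:13:56:05 +0000] "GET /missing.html HTTP/1.1" 404 198',
]
print(page_views(sample).most_common(1))  # most-requested page
```

Tag-based tools like HitBox work from the other direction: instead of reading server logs after the fact, a small script on each page reports every view to a collection server as it happens.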
Black hat SEO attempts to improve rankings in ways that search engines disapprove of or that involve deception. One black hat technique uses hidden text: text colored to match the background, placed in an invisible div, or positioned off screen. Another, known as cloaking, serves a different page depending on whether the request comes from a human visitor or a search engine. A third category sometimes used is grey hat SEO. This sits between the black hat and white hat approaches: the methods employed avoid getting the site penalized but do not produce the best content for users. Grey hat SEO is focused entirely on improving search engine rankings.
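To make the definition of cloaking concrete, the sketch below shows the pattern in its simplest form: a server branching on the User-Agent header to show crawlers different content than humans. This is purely illustrative of the technique search engines penalize (and detect, by fetching pages with both crawler and browser user agents and comparing); the crawler token list is an invented example.

```python
# Illustration only: this is the pattern search engines penalize, shown
# to make the definition of cloaking concrete, not a technique to use.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")  # example crawler substrings

def serve_page(user_agent: str) -> str:
    """A cloaking server branches on the User-Agent request header."""
    if any(token in user_agent.lower() for token in CRAWLER_TOKENS):
        return "<p>Keyword-stuffed page shown only to crawlers</p>"
    return "<p>Normal page shown to human visitors</p>"

print(serve_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))
print(serve_page("Mozilla/5.0 (Windows NT 10.0)"))
```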
Social media can be a useful source of market information and a way to hear customer perspectives. Blogs, content communities, and forums are platforms where individuals share their reviews and recommendations of brands, products, and services. Businesses can tap and analyze the customer voices and feedback generated in social media for marketing purposes;[15] in this sense social media is a relatively inexpensive source of market intelligence that marketers and managers can use to track and respond to consumer-identified problems and detect market opportunities. For example, the Internet erupted with videos and pictures of the iPhone 6 "bend test", which showed that the coveted phone could be bent by hand pressure. The so-called "bend gate" controversy[16] created confusion among customers who had waited months for the launch of the latest iPhone. Apple promptly issued a statement saying the problem was extremely rare and that the company had taken several steps to make the device's case stronger and more robust. Unlike traditional market research methods such as surveys, focus groups, and data mining, which are time-consuming and costly and take weeks or even months to analyze, social media gives marketers "live", real-time information about consumer behavior and viewpoints on a company's brand or products. This can be useful in the highly dynamic, competitive, fast-paced, and global marketplace of the 2010s.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could misrepresent the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10][dubious – discuss] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
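The keywords meta tag those early engines trusted is just an attribute in the page's HTML head, which is why it was so easy for webmasters to game. A minimal sketch of how an indexer could read it, using Python's standard-library HTML parser (the sample page string is invented):

```python
from html.parser import HTMLParser

class MetaKeywordParser(HTMLParser):
    """Collects the content of <meta name="keywords"> tags, the
    webmaster-supplied signal that early engines indexed."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            if d.get("name", "").lower() == "keywords" and d.get("content"):
                self.keywords = [k.strip() for k in d["content"].split(",")]

page = '<html><head><meta name="keywords" content="seo, search, ranking"></head></html>'
parser = MetaKeywordParser()
parser.feed(page)
print(parser.keywords)  # ['seo', 'search', 'ranking']
```

Nothing ties the declared keywords to the visible page text, which is exactly the weakness keyword stuffing exploited and why engines moved to ranking signals derived from the content itself.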