Another ethical controversy associated with search marketing has been the issue of trademark infringement. The debate as to whether third parties should have the right to bid on their competitors' brand names has been underway for years. In 2009, Google changed its policy, which formerly prohibited these tactics, allowing third parties to bid on branded terms as long as their landing page in fact provides information on the trademarked term.[27] Though the policy has been changed, this continues to be a source of heated debate.[28]
In 2012, during Hurricane Sandy, Gap sent out a tweet to its followers telling them to stay safe but encouraging them to shop online, with an offer of free shipping. The tweet was deemed insensitive, and Gap eventually took it down and apologized.[96] Numerous additional online marketing mishaps exist. Examples include a YouTube video of a Domino's Pizza employee violating health code standards, which went viral on the Internet and later resulted in felony charges against two employees;[93][97] a Twitter hashtag posted by McDonald's in 2012 that attracted attention due to the numerous complaints and negative experiences customers shared about the chain; and a 2011 tweet posted by a Chrysler Group employee claiming that no one in Detroit knows how to drive.[98] When the Link REIT opened a Facebook page to recommend old-style restaurants, the page was flooded with furious comments criticizing the REIT for having forced many restaurants and stores to shut down; it had to terminate its campaign early amid further deterioration of its corporate image.[99]
Some search engines have also reached out to the SEO industry, and are frequent sponsors and guests at SEO conferences, webchats, and seminars. Major search engines provide information and guidelines to help with website optimization.[18][19] Google has a Sitemaps program to help webmasters learn if Google is having any problems indexing their website, and it also provides data on Google traffic to the website.[20] Bing Webmaster Tools provides a way for webmasters to submit a sitemap and web feeds, allows users to determine the "crawl rate", and lets them track the index status of their web pages.
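To illustrate what such a sitemap submission contains, the sketch below generates a minimal XML sitemap with Python's standard library. The URLs are hypothetical examples, and real sitemaps often add optional fields such as last-modification dates.

```python
from xml.etree.ElementTree import Element, SubElement, tostring

def make_sitemap(urls):
    """Build a minimal XML sitemap suitable for submission to webmaster tools."""
    # The sitemaps.org namespace identifies the document as a sitemap.
    urlset = Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc in urls:
        url = SubElement(urlset, "url")
        SubElement(url, "loc").text = loc  # each <url> needs at least a <loc>
    return tostring(urlset, encoding="unicode")

# Hypothetical site pages.
xml = make_sitemap(["https://example.com/", "https://example.com/about"])
print(xml)
```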
Social networking websites are based on building virtual communities that allow consumers to express their needs, wants, and values online. Social media marketing then connects these consumers and audiences to businesses that share the same needs, wants, and values. Through social networking sites, companies can keep in touch with individual followers. This personal interaction can instill a feeling of loyalty in followers and potential customers. Also, by choosing whom to follow on these sites, products can reach a very narrow target audience.[4] Social networking sites also include much information about what products and services prospective clients might be interested in. Through the use of new semantic analysis technologies, marketers can detect buying signals, such as content shared by people and questions posted online. An understanding of buying signals can help sales people target relevant prospects and marketers run micro-targeted campaigns.
Since heading tags typically make the text they contain larger than normal text on the page, they are a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Using multiple heading sizes in order creates a hierarchical structure for your content, making it easier for users to navigate through your document.
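The hierarchy that headings create can be extracted programmatically. Here is a minimal sketch using Python's standard html.parser; the sample page markup is a made-up example, and the indentation in the output mirrors the heading levels.

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collects h1-h6 headings so a page's hierarchy can be inspected."""
    def __init__(self):
        super().__init__()
        self.outline = []          # list of (level, text) tuples
        self._level = None         # heading level currently being read
        self._buffer = []

    def handle_starttag(self, tag, attrs):
        if len(tag) == 2 and tag[0] == "h" and tag[1].isdigit():
            self._level = int(tag[1])
            self._buffer = []

    def handle_data(self, data):
        if self._level is not None:
            self._buffer.append(data)

    def handle_endtag(self, tag):
        if self._level is not None and tag == f"h{self._level}":
            self.outline.append((self._level, "".join(self._buffer).strip()))
            self._level = None

# Hypothetical page fragment.
page = "<h1>Guide</h1><h2>Setup</h2><h3>Install</h3><h2>Usage</h2>"
parser = HeadingOutline()
parser.feed(page)
for level, text in parser.outline:
    print("  " * (level - 1) + text)
```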
Social Media for Content Promotion — Social media marketing is a perfect channel for sharing your best site and blog content with readers. Once you build a loyal following on social media, you'll be able to post all your new content and make sure your readers can find new stuff right away. Plus, great blog content will help you build more followers. It's a surprising way that content marketing and social media marketing benefit each other.
Social media marketing, or SMM, is a form of internet marketing that involves creating and sharing content on social media networks in order to achieve your marketing and branding goals. Social media marketing includes activities like posting text and image updates, videos, and other content that drives audience engagement, as well as paid social media advertising.
Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
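The idea behind PageRank can be illustrated with a short sketch: each page's rank is divided among its outgoing links and the ranks are recomputed iteratively until they stabilize. This is a simplified teaching version, not Google's actual implementation, and the three-page link graph is hypothetical.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}          # start with uniform rank
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            if not outs:                        # dangling page: spread rank evenly
                for q in pages:
                    new[q] += damping * rank[p] / n
            else:                               # share rank among outgoing links
                for q in outs:
                    new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Hypothetical link graph: A links to B and C, B to C, C back to A.
graph = {"A": ["B", "C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
print(ranks)
```

Note how C outranks B even though both are one link from A: C also collects all of B's rank, which is the "off-page" signal that on-page keyword stuffing cannot fake directly.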
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
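A navigational page of this kind can be generated from a flat list of URL paths. The sketch below nests the paths into a tree and renders a hierarchical listing; the example paths are invented.

```python
def build_tree(paths):
    """Nest URL paths into a dict-of-dicts tree for a hierarchical listing."""
    tree = {}
    for path in paths:
        node = tree
        for part in path.strip("/").split("/"):
            node = node.setdefault(part, {})
    return tree

def render(tree, depth=0):
    """Render the tree as an indented bullet list, one line per page."""
    lines = []
    for name in sorted(tree):
        lines.append("  " * depth + "- " + name)
        lines.extend(render(tree[name], depth + 1))
    return lines

# Hypothetical site structure.
paths = ["/about", "/products/widgets", "/products/gadgets"]
print("\n".join(render(build_tree(paths))))
```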

The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).
Keyword research and analysis involves three "steps": ensuring the site can be indexed in the search engines, finding the most relevant and popular keywords for the site and its products, and using those keywords on the site in a way that will generate and convert traffic. A follow-on effect of keyword analysis and research is the search perception impact.[13] Search perception impact describes the identified impact of a brand's search results on consumer perception, including title and meta tags, site indexing, and keyword focus. As online searching is often the first step for potential consumers/customers, the search perception impact shapes the brand impression for each individual.
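A first pass at the keyword-research step described above — finding which terms a page actually emphasizes — can be sketched with a simple frequency count. The stopword list and sample copy below are illustrative assumptions, not a complete SEO tool; real keyword research also weighs search volume and competition.

```python
import re
from collections import Counter

# A tiny illustrative stopword list; real tools use much larger ones.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "on", "is", "our", "your"}

def keyword_counts(text, min_length=3):
    """Count candidate keywords, ignoring stopwords and very short words."""
    words = re.findall(r"[a-z]+", text.lower())
    return Counter(w for w in words if w not in STOPWORDS and len(w) >= min_length)

# Hypothetical page copy.
copy = ("Organic coffee beans, roasted daily. Our organic coffee "
        "subscription delivers fresh coffee to your door.")
counts = keyword_counts(copy)
print(counts.most_common(3))
```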
Popular social media such as Facebook, Twitter, LinkedIn, and other social networks can provide marketers with a hard number for how large their audience is; nevertheless, a large audience may not always translate into large sales volumes. Therefore, effective SMM cannot be measured by audience size alone, but rather by vigorous audience activity such as social shares, re-tweets, etc.
Measuring Success with Analytics — You can’t determine the success of your social media marketing strategies without tracking data. Google Analytics can be used as a great social media marketing tool that will help you measure your most triumphant social media marketing techniques, as well as determine which strategies are better off abandoned. Attach tracking tags to your social media marketing campaigns so that you can properly monitor them. And be sure to use the analytics within each social platform for even more insight into which of your social content is performing best with your audience.
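In practice, attaching tracking tags usually means appending UTM parameters to campaign URLs so analytics can attribute each visit. Here is a minimal sketch using Python's standard urllib.parse; the URL and campaign names are hypothetical.

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

def add_utm_tags(url, source, medium, campaign):
    """Append standard UTM parameters, preserving any existing query string."""
    parts = urlparse(url)
    query = dict(parse_qsl(parts.query))
    query.update({
        "utm_source": source,      # which site sent the traffic
        "utm_medium": medium,      # e.g. social, email, cpc
        "utm_campaign": campaign,  # which campaign the link belongs to
    })
    return urlunparse(parts._replace(query=urlencode(query)))

# Hypothetical campaign link.
link = add_utm_tags("https://example.com/blog/post", "twitter", "social", "spring_launch")
print(link)
```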

As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay per click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998. Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11]
Creating the link between SEO and PPC represents an integral part of the SEM concept. Sometimes, especially when separate teams work on SEO and PPC and the efforts are not synced, the positive results of aligning their strategies can be lost. The aim of both SEO and PPC is to maximize visibility in search, and thus their actions should be centrally coordinated. Both teams can benefit from setting shared goals and combined metrics, evaluating data together to determine future strategy, or discussing which of the tools works better to get traffic for selected keywords in national and local search results. In this way, search visibility can be increased while optimizing both conversions and costs.[21]
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
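The voluntary nature of the Robots Exclusion Standard is visible in how a compliant crawler consults robots.txt before fetching: it asks permission rather than being physically blocked. A minimal sketch with Python's standard urllib.robotparser (the rules and URLs are examples):

```python
from urllib.robotparser import RobotFileParser

# Example robots.txt content; normally this is fetched from the site root.
rules = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler checks before fetching; a rogue one simply doesn't ask.
print(rp.can_fetch("MyCrawler", "https://example.com/private/report.html"))  # False
print(rp.can_fetch("MyCrawler", "https://example.com/public/index.html"))    # True
```

Nothing in this exchange stops the server from serving /private/ to a browser or a non-compliant bot, which is exactly why robots.txt is not an access-control mechanism.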
Unplanned content is an 'in the moment' idea, "a spontaneous, tactical reaction" (Cramer, 2014, p. 6). The content could be trending and not have the time to take the planned content route. Unplanned content is posted sporadically and is not calendar/date/time arranged (Deshpande, 2014).[88][89] Issues with unplanned content revolve around legal issues and whether the message being sent out represents the business/brand accordingly. If a company sends out a Tweet or Facebook message too hurriedly, the company may unintentionally use insensitive language or messaging that could alienate some consumers. For example, celebrity chef Paula Deen was criticized after she made a social media post commenting about HIV-AIDS and South Africa; her message was deemed offensive by many observers. The main difference between planned and unplanned content is the time to approve the content. Unplanned content must still be approved by marketing managers, but in a much more rapid manner, e.g. 1–2 hours or less. Teams may miss errors because they are hurried. When using unplanned content, Brito (2013) says, "be prepared to be reactive and respond to issues when they arise."[87] Brito (2013) writes about having a "crisis escalation plan", because "It will happen". The plan involves breaking down the issue into topics and classifying the issue into groups. Colour coding potential risks ("identify and flag potential risks") also helps to organise an issue. The problem can then be handled by the correct team and resolved more effectively, rather than any person at hand trying to solve the situation.[87]
In 2007, U.S. advertisers spent US $24.6 billion on search engine marketing.[3] In Q2 2015, Google (73.7%) and the Yahoo/Bing (26.3%) partnership accounted for almost 100% of U.S. search engine spend.[4] As of 2006, SEM was growing much faster than traditional advertising and even other channels of online marketing.[5] Managing search campaigns is either done directly with the SEM vendor or through an SEM tool provider. It may also be self-serve or through an advertising agency. As of October 2016, Google leads the global search engine market with a market share of 89.3%. Bing comes second with a market share of 4.36%, Yahoo comes third with a market share of 3.3%, and Chinese search engine Baidu is fourth globally with a share of about 0.68%.[6]
Mobile devices have become increasingly popular, with 5.7 billion people using them worldwide.[13] This has played a role in the way consumers interact with media and has many further implications for TV ratings, advertising, mobile commerce, and more. Mobile media consumption such as mobile audio streaming and mobile video is on the rise – in the United States, more than 100 million users are projected to access online video content via mobile device. Mobile video revenue consists of pay-per-view downloads, advertising, and subscriptions. As of 2013, worldwide mobile phone Internet user penetration was 73.4%. In 2017, figures suggest that more than 90% of Internet users will access online content through their phones.[14]


Search engines reward you when other sites link to yours: they assume your site must be valuable, and you'll rank higher in search results. And the higher the "rank" of the sites that link to you, the more they count toward your own ranking. You want links from popular industry authorities, recognized directories, and reputable companies and organizations.
Social media platforms are another channel whose content businesses and brands must seek to influence. In contrast with pre-Internet marketing, such as TV ads and newspaper ads, in which the marketer controlled all aspects of the ad, with social media, users are free to post comments right below an online ad or an online post by a company about its product. Companies are increasingly using their social media strategy as part of their traditional marketing effort using magazines, newspapers, radio advertisements, and television advertisements. Since, in the 2010s, media consumers often use multiple platforms at the same time (e.g., surfing the Internet on a tablet while watching a streaming TV show), marketing content needs to be consistent across all platforms, whether traditional or new media. Heath (2006) wrote about the extent of attention businesses should give to their social media sites: it is about finding a balance between posting frequently and over-posting. Considerable attention must be paid to social media sites because people need updates to gain brand recognition. Therefore, much more content is needed, and this can often be unplanned content.[86]

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
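The indexer step described above — recording which words appear on which pages, and at which positions — can be sketched as a small inverted index. The page texts below are invented examples, and real indexers also store per-word weights and link data.

```python
import re

def build_index(pages):
    """pages: dict mapping URL -> page text.
    Returns an inverted index: word -> {url: [word positions]}."""
    index = {}
    for url, text in pages.items():
        for pos, word in enumerate(re.findall(r"[a-z]+", text.lower())):
            # Record where each word occurs so queries can weigh position.
            index.setdefault(word, {}).setdefault(url, []).append(pos)
    return index

# Hypothetical crawled pages.
pages = {
    "/a": "search engines index the web",
    "/b": "web crawlers feed the index",
}
index = build_index(pages)
print(sorted(index["index"]))  # pages containing the word "index"
```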