Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
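As a minimal sketch, a robots.txt file that keeps page resources crawlable might look like the following (the /assets/ and /private/ paths are hypothetical):

```
# Hypothetical example: let crawlers fetch CSS, JavaScript, and images
# while keeping an unrelated private area off-limits.
User-agent: *
Allow: /assets/css/
Allow: /assets/js/
Allow: /assets/images/
Disallow: /private/
```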
Planned content begins with the creative/marketing team generating ideas; once the ideas are complete, they are sent off for approval. There are two general ways of doing so. The first is where each sector approves the plan one after another: editor, brand, followed by the legal team (Brito, 2013). Sectors may differ depending on the size and philosophy of the business. The second is where each sector is given 24 hours (or a similarly designated time) to sign off or disapprove; if no action is taken within the 24-hour period, the original plan is implemented. Planned content is often noticeable to customers and can be unoriginal or lack excitement, but it is also a safer option that avoids unnecessary backlash from the public.[87] Both routes for planned content are time-consuming: in the first, approval can take 72 hours. Although the second route can be significantly shorter, it also carries more risk, particularly in the legal department.

Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
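A simple illustration of that last point, as a hypothetical sketch: robots.txt is itself a public file, so every rule in it advertises the very path it tries to hide.

```
# This file is publicly readable at https://example.com/robots.txt,
# so the Disallow line tells a curious user exactly where to look.
User-agent: *
Disallow: /secret-reports/
```

For genuinely confidential content, server-side access control (such as password protection) is the appropriate tool.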


A breadcrumb is a row of internal links at the top or bottom of the page that allows visitors to quickly navigate back to a previous section or the root page. Many breadcrumbs have the most general page (usually the root page) as the first, leftmost link and list the more specific sections out to the right. We recommend using breadcrumb structured data markup when showing breadcrumbs.
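As a minimal sketch of that markup, using schema.org's BreadcrumbList vocabulary in JSON-LD (the page names and URLs are hypothetical):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Books", "item": "https://example.com/books" },
    { "@type": "ListItem", "position": 2, "name": "Science Fiction", "item": "https://example.com/books/sciencefiction" }
  ]
}
</script>
```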
Notice that each of these accounts has a consistent voice, tone, and style. Consistency is key to helping your followers understand what to expect from your brand. They’ll know why they should continue to follow you and what value they will get from doing so. It also helps keep your branding consistent even when you have multiple people working on your social team.

Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
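As a hypothetical sketch, a page might establish that hierarchy like this:

```html
<!-- Heading sizes descend with specificity: one h1 for the page topic,
     h2 for major sections, h3 for subsections. -->
<h1>How to Brew Coffee at Home</h1>
<h2>Choosing Your Beans</h2>
<h2>Brewing Methods</h2>
<h3>French Press</h3>
<h3>Pour-Over</h3>
```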

One of the main purposes of employing social media in marketing is as a communications tool that makes companies accessible to those interested in their product and visible to those who have no knowledge of their products.[26] These companies use social media to create buzz, and to learn from and target customers. It is the only form of marketing that can touch consumers at each and every stage of the consumer decision journey.[27] Marketing through social media has other benefits as well. Of the top 10 factors that correlate with a strong Google organic search, seven are social media dependent. This means that brands that are less active or inactive on social media tend to show up less in Google searches.[28] While platforms such as Twitter, Facebook, and Google+ have a larger number of monthly users, the visual-media-sharing mobile platforms garner a higher interaction rate, have registered the fastest growth, and have changed the ways in which consumers engage with brand content. Instagram has an interaction rate of 1.46% with an average of 130 million monthly users, as opposed to Twitter's 0.03% interaction rate with an average of 210 million monthly users.[28] Unlike traditional media, which are often cost-prohibitive to many companies, a social media strategy does not require astronomical budgeting.[29]


To this end, companies make use of platforms such as Facebook, Twitter, YouTube, and Instagram to reach audiences much wider than through traditional print/TV/radio advertisements alone, at a fraction of the cost, as most social networking sites can be used at little or no cost (however, some websites charge companies for premium services). This has changed the way companies interact with customers, as a substantial percentage of consumer interactions are now carried out over online platforms with much higher visibility. Customers can now post reviews of products and services, rate customer service, and ask questions or voice concerns directly to companies through social media platforms. According to Measuring Success, over 80% of consumers use the web to research products and services.[30] Thus social media marketing is also used by businesses to build relationships of trust with consumers.[31] To this aim, companies may also hire personnel to specifically handle these social media interactions, who usually report under the title of online community manager. Handling these interactions in a satisfactory manner can result in an increase of consumer trust. To this aim, and to repair the public's perception of a company, three steps are taken to address consumer concerns: identifying the extent of the social chatter, engaging the influencers to help, and developing a proportional response.[32]
Social media can be a useful source of market information and a way to hear customer perspectives. Blogs, content communities, and forums are platforms where individuals share their reviews and recommendations of brands, products, and services. Businesses are able to tap into and analyze the customer voices and feedback generated in social media for marketing purposes;[15] in this sense, social media is a relatively inexpensive source of market intelligence which can be used by marketers and managers to track and respond to consumer-identified problems and detect market opportunities. For example, the Internet erupted with videos and pictures of the iPhone 6 "bend test", which showed that the coveted phone could be bent by hand pressure. The so-called "bend gate" controversy[16] created confusion amongst customers who had waited months for the launch of the latest rendition of the iPhone. However, Apple promptly issued a statement saying that the problem was extremely rare and that the company had taken several steps to make the mobile device's case stronger and more robust. Unlike traditional market research methods such as surveys, focus groups, and data mining, which are time-consuming and costly and can take weeks or even months to analyze, marketers can use social media to obtain "live" or "real time" information about consumer behavior and viewpoints on a company's brand or products. This can be useful in the highly dynamic, competitive, fast-paced, and global marketplace of the 2010s.

Disney/Pixar's Monsters University: Created a Tumblr account, MUGrumblr, saying that the account is maintained by a 'Monstropolis transplant' and 'self-diagnosed coffee addict' who is currently a sophomore at Monsters University.[73] A "student" from Monsters University uploaded memes, animated GIFs, and Instagram-like photos related to the film.
The line between pay-per-click advertising and paid inclusion is often debatable. Some have lobbied for any paid listings to be labeled as an advertisement, while defenders insist they are not actually ads since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion proves particularly useful for cases where pages are dynamically generated and frequently modified.
Neil Patel is the co-founder of Neil Patel Digital. The Wall Street Journal calls him a top influencer on the web, Forbes says he is one of the top 10 marketers, and Entrepreneur Magazine says he created one of the 100 most brilliant companies. He is a New York Times bestselling author and was recognized as a top 100 entrepreneur under the age of 30 by President Obama and as a top 100 entrepreneur under the age of 35 by the United Nations.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. Its purpose concerns prominence more than relevance; website developers should regard SEM with the utmost importance with respect to visibility, as most users navigate to the primary listings of their search.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking the use of desktops, as shown by StatCounter in October 2016, which analyzed 2.5 million websites and found that 51.3% of pages were loaded on a mobile device.[60] Google has been one of the companies capitalizing on the popularity of mobile usage by encouraging websites to use its Google Search Console and Mobile-Friendly Test, which allow companies to check how their website performs in search results and how user-friendly it is.

The world is mobile today. Most people are searching on Google using a mobile device. The desktop version of a site might be difficult to view and use on a mobile device. As a result, having a mobile-ready site is critical to your online presence. In fact, starting in late 2016, Google began experiments to primarily use the mobile version of a site's content for ranking, parsing structured data, and generating snippets.
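One common building block of a mobile-ready page, shown here as a minimal sketch, is a responsive viewport declaration in the page's head:

```html
<!-- Tells mobile browsers to render the page at the device's width
     instead of as a scaled-down desktop page. -->
<meta name="viewport" content="width=device-width, initial-scale=1">
```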
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
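In HTML, this means adding rel="nofollow" to the link's anchor tag; the URL below is a hypothetical stand-in for the offending site:

```html
<!-- nofollow tells search engines not to pass your site's reputation
     through this link. -->
<a href="https://example.com/spammy-site" rel="nofollow">the site that spammed my comments</a>
```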
Reddit and similar social media platforms, such as StumbleUpon and Digg, are ideal for sharing compelling content. With over 2 billion page views a month, Reddit has incredible social media marketing potential, but marketers should be warned that only truly unique, interesting content will be welcomed. Posting on Reddit is playing with fire: submit spammy or overtly sales-focused content and your business could get berated by this extremely tech-savvy community.
In addition, social media platforms have become extremely aware of their users and collect information about their viewers to connect with them in various ways. The social-networking website Facebook Inc. is quietly working on a new advertising system that would let marketers target users with ads based on the massive amounts of information people reveal on the site about themselves.[104] Whether this feature is ethical is disputed. Some people may react negatively because they believe it is an invasion of privacy. On the other hand, some individuals may enjoy this feature because their social network recognizes their interests and sends them particular advertisements pertaining to those interests. Consumers like to network with people who have interests and desires that are similar to their own.[105] Individuals who agree to make their social media profile public should be aware that advertisers can take information about their interests and use it to send them targeted information and advertisements to boost sales. Managers invest in social media to foster relationships and interact with customers.[106] This is an ethical way for managers to send messages about their advertisements and products to their consumers.
In some contexts, the term SEM is used exclusively to mean pay per click advertising,[2] particularly in the commercial advertising and marketing communities which have a vested interest in this narrow definition. Such usage excludes the wider search marketing community that is engaged in other forms of SEM such as search engine optimization and search retargeting.
SEM is the wider discipline that incorporates SEO. SEM includes both paid search results (using tools like Google AdWords or Bing Ads, formerly known as Microsoft adCenter) and organic search results (SEO). SEM uses paid advertising with AdWords or Bing Ads, pay-per-click (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click), article submissions, advertising, and making sure SEO has been done. A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time. SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices.
Traditional advertising techniques include print and television advertising. The Internet has already overtaken television as the largest advertising market.[90] Websites often include banner or pop-up ads. Social networking sites don't always have ads; in exchange, products have entire pages and are able to interact with users. Television commercials often end with a spokesperson asking viewers to check out the product website for more information. For a brief period, print ads included QR codes, which can be scanned by cell phones and computers, sending viewers to the product website. Advertising is beginning to move viewers from the traditional outlets to the electronic ones.[citation needed]

Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files, to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) a log-file analyzing tool, WebTrends by NetiQ; (b) a tag-based analytics tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
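As a minimal sketch of page tagging, a tag-based analytics tool typically has you embed a small snippet on every page; the hostname and parameter below are hypothetical:

```html
<!-- Hypothetical tracking pixel: the browser requests this 1x1 image
     from the analytics server, which logs the page view. -->
<img src="https://analytics.example.com/track?page=homepage" width="1" height="1" alt="">
```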
By relying so heavily on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density toward a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine are determined by its ability to produce the most relevant results to any given search, poor-quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, AIRWeb (Adversarial Information Retrieval on the Web), an annual conference, was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]