A navigational page is a simple page on your site that displays the structure of your website, usually as a hierarchical listing of the pages on your site. Visitors may go to this page if they are having trouble finding pages on your site. While search engines will also visit this page, gaining good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
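As an illustration, such a page is often just a nested list of links mirroring the site's hierarchy; the site sections and URLs below are hypothetical:

```html
<h1>Site Map</h1>
<ul>
  <li><a href="/about/">About Us</a></li>
  <li><a href="/articles/">Articles</a>
    <!-- A nested list conveys the hierarchy to both visitors and crawlers -->
    <ul>
      <li><a href="/articles/news/">News</a></li>
      <li><a href="/articles/guides/">Guides</a></li>
    </ul>
  </li>
  <li><a href="/contact/">Contact</a></li>
</ul>
```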

Inclusion in Google's search results is free and easy; you don't even need to submit your site to Google. Google is a fully automated search engine that uses web crawlers to explore the web constantly, looking for sites to add to our index. In fact, the vast majority of sites listed in our results aren't manually submitted for inclusion, but found and added automatically when we crawl the web. Learn how Google discovers, crawls, and serves web pages.


Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose instead to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags and better snippets for your users. We also have a handy Help Center article on how to create good titles and snippets.
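As a sketch, a description meta tag sits in the `<head>` of a page alongside the title tag; the site name and wording here are illustrative, not a recommendation for your content:

```html
<!DOCTYPE html>
<html>
<head>
  <title>Brandon's Baseball Cards - Buy Cards, News, Prices</title>
  <!-- Google might use this text as the snippet shown under the page title
       in search results, if it matches the user's query well -->
  <meta name="description" content="Brandon's Baseball Cards provides a
    large selection of vintage and modern baseball cards for sale.
    We also offer daily baseball news and events.">
</head>
<body>
  ...
</body>
</html>
```

Each page should get its own description that accurately summarizes that page, rather than one description duplicated across the site.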
In March 2006, KinderStart filed a lawsuit against Google over search engine rankings. KinderStart's website was removed from Google's index prior to the lawsuit, and the amount of traffic to the site dropped by 70%. On March 16, 2007, the United States District Court for the Northern District of California (San Jose Division) dismissed KinderStart's complaint without leave to amend, and partially granted Google's motion for Rule 11 sanctions against KinderStart's attorney, requiring him to pay part of Google's legal expenses.[70][71]

In 2012, during Hurricane Sandy, Gap sent out a tweet to its followers telling them to stay safe, but encouraged them to shop online and offered free shipping. The tweet was deemed insensitive, and Gap eventually took it down and apologized.[96] Numerous other online marketing mishaps exist: a YouTube video of a Domino's Pizza employee violating health code standards went viral on the Internet and later resulted in felony charges against two employees;[93][97] a Twitter hashtag posted by McDonald's in 2012 attracted attention due to numerous complaints and negative experiences customers had at the chain's stores; and a 2011 tweet posted by a Chrysler Group employee claimed that no one in Detroit knows how to drive.[98] When the Link REIT opened a Facebook page to recommend old-style restaurants, the page was flooded by furious comments criticizing the REIT for having forced many restaurants and stores to shut down; it had to terminate its campaign early amid further deterioration of its corporate image.[99]

Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
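A minimal sketch of such a hierarchy in HTML (the page topic and section names are hypothetical):

```html
<body>
  <h1>Brandon's Baseball Cards</h1>   <!-- one top-level heading: the page topic -->
  <h2>Vintage Cards</h2>              <!-- a major section -->
  <h3>Pre-war Cards</h3>              <!-- a subsection of Vintage Cards -->
  <h2>News</h2>                       <!-- the next major section -->
</body>
```

Using heading levels in order (an `<h3>` under an `<h2>` under an `<h1>`) keeps the document outline meaningful, rather than choosing tags purely for their default font size.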
Engagement with the social web means that customers and stakeholders are active participants rather than passive viewers. Examples of these participants are consumer advocacy groups and groups that criticize companies (e.g., lobby groups or advocacy organizations). Social media use in a business or political context allows all consumers/citizens to express and share an opinion about a company's products, services, business practices, or a government's actions. Each customer, non-customer, or citizen who participates online via social media becomes part of the marketing department (or a challenge to the marketing effort), as other customers read their positive or negative comments or reviews. Getting consumers, potential consumers, or citizens engaged online is fundamental to successful social media marketing.[20] With the advent of social media marketing, it has become increasingly important to gain customer interest in products and services, which can eventually be translated into buying behavior, or voting and donating behavior in a political context. New online marketing concepts of engagement and loyalty have emerged which aim to build customer participation and brand reputation.[21]

WhatsApp was founded by Jan Koum and Brian Acton. WhatsApp joined Facebook in 2014, but continues to operate as a separate app with a laser focus on building a messaging service that works fast and reliably anywhere in the world. WhatsApp started as an alternative to SMS; it now supports sending and receiving a variety of media including text, photos, videos, documents, and location, as well as voice calls. WhatsApp messages and calls are secured with end-to-end encryption, meaning that no third party, including WhatsApp, can read or listen to them. WhatsApp has a customer base of 1 billion people in over 180 countries.[46][47] It is used to send personalised promotional messages to individual customers, and it has several advantages over SMS, including the ability to track how a message broadcast performs using WhatsApp's blue-tick read receipts. It allows sending messages to Do Not Disturb (DND) customers, and businesses can send a series of bulk messages to their targeted customers using the broadcast option. Companies have adopted it to a large extent because it is a cost-effective promotional option that spreads a message quickly. Still, WhatsApp doesn't allow businesses to place ads in its app.[48]
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
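A hypothetical robots.txt file, served at the root of the site (e.g. `https://example.com/robots.txt`), might look like this; the paths are illustrative:

```
# Applies to all well-behaved crawlers
User-agent: *
Disallow: /search        # keep internal search result pages out of crawling
Disallow: /private/      # keep a directory from being crawled
Allow: /
```

Remember the caveat above: this only asks compliant crawlers to stay away; it does not stop the server from serving those URLs, and it publicly lists the paths you named.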
Search engine optimization (SEO) is often about making small modifications to parts of your website. When viewed individually, these changes might seem like incremental improvements, but when combined with other optimizations, they could have a noticeable impact on your site's user experience and performance in organic search results. You're likely already familiar with many of the topics in this guide, because they're essential ingredients for any web page, but you may not be making the most out of them.
Reddit, or similar social media platforms such as StumbleUpon or Digg, are ideal for sharing compelling content. With over 2 billion page views a month, Reddit has incredible social media marketing potential, but marketers should be warned that only truly unique, interesting content will be welcomed. Posting on Reddit is playing with fire—submit spammy or overtly sales-focused content and your business could get berated by this extremely tech-savvy community.
Google recommends that all websites use https:// when possible. The hostname is where your website is hosted, commonly using the same domain name that you'd use for email. Google differentiates between the "www" and "non-www" version (for example, "www.example.com" or just "example.com"). When adding your website to Search Console, we recommend adding both http:// and https:// versions, as well as the "www" and "non-www" versions.
As the number of sites on the Web increased in the mid-to-late 1990s, search engines started appearing to help people find information quickly. Search engines developed business models to finance their services, such as pay per click programs offered by Open Text[7] in 1996 and then Goto.com[8] in 1998. Goto.com later changed its name[9] to Overture in 2001, was purchased by Yahoo! in 2003, and now offers paid search opportunities for advertisers through Yahoo! Search Marketing. Google also began to offer advertisements on search results pages in 2000 through the Google AdWords program. By 2007, pay-per-click programs proved to be primary moneymakers[10] for search engines. In a market dominated by Google, in 2009 Yahoo! and Microsoft announced the intention to forge an alliance. The Yahoo! & Microsoft Search Alliance eventually received approval from regulators in the US and Europe in February 2010.[11]
Organic search (SEO): When you enter a keyword or phrase into a search engine like Google or Yahoo!, the organic results are displayed in the main body of the page. When your prospects search for information about your products and services, you want to rank highly in search engine results. By "optimizing" your site, you can improve your ranking for important search terms and phrases ("keywords"). You can also improve your rank by getting other important sites to link to yours.
In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web, and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher PageRank page is more likely to be reached by the random web surfer.
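The random-surfer idea above can be sketched as a short power-iteration computation. This is a simplified illustration of the published PageRank formulation, not Google's actual implementation; the damping factor and the three-page graph are assumptions for the example:

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start from a uniform distribution
    for _ in range(iterations):
        # With probability (1 - damping) the surfer jumps to a random page
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                # Each outlink receives an equal share of this page's rank
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical three-page site: A and C link to B, B links back to A.
graph = {"A": ["B"], "B": ["A"], "C": ["B"]}
ranks = pagerank(graph)
```

On this toy graph, B ends up with the highest score because two pages link to it, and C with the lowest because nothing links to it: exactly the "quantity and strength of inbound links" behavior described above.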