Traditional advertising techniques include print and television advertising. The Internet has already overtaken television as the largest advertising market.[90] Websites often include banner or pop-up ads. Social networking sites do not always carry traditional ads; instead, products may maintain dedicated pages through which they interact with users. Television commercials often end with a spokesperson asking viewers to check out the product website for more information. Print ads, for a brief period, included QR codes, which can be scanned by cell phones and computers to send viewers to the product website. Advertising is beginning to move viewers from traditional outlets to electronic ones.[citation needed]

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a page with a higher PageRank is more likely to be reached by the random web surfer.
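The random-surfer model described above can be sketched as a short power-iteration computation. This is a minimal illustration only, not Google's actual implementation; the link graph, damping factor, and iteration count are arbitrary example values.

```python
def pagerank(links, damping=0.85, iters=50):
    """Compute PageRank for a small link graph.

    links: dict mapping each page to the list of pages it links to.
    Returns a dict mapping each page to its rank; ranks sum to 1.
    """
    nodes = list(links)
    n = len(nodes)
    rank = {p: 1.0 / n for p in nodes}  # random surfer starts anywhere
    for _ in range(iters):
        new = {p: (1 - damping) / n for p in nodes}  # chance of a random jump
        for p, outs in links.items():
            if outs:
                # a page passes its rank evenly along its outbound links
                share = damping * rank[p] / len(outs)
                for q in outs:
                    new[q] += share
            else:
                # dangling page: surfer jumps to a random page
                for q in nodes:
                    new[q] += damping * rank[p] / n
        rank = new
    return rank

# Hypothetical graph: pages A and B both link to C, and C links back to A.
ranks = pagerank({"A": ["C"], "B": ["C"], "C": ["A"]})
```

In this example C ends up with the highest rank: it has the most inbound links, and one of them comes from A, itself a well-ranked page, illustrating that links are weighted by the rank of the page they come from.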

AdWords is recognized as a web-based advertising tool, since it uses keywords to deliver ads specifically to web users looking for information about a particular product or service. It is flexible and provides customizable options such as Ad Extensions and access to non-search sites, leveraging the display network to help increase brand awareness. The service hinges on cost-per-click (CPC) pricing, in which the advertiser chooses a maximum cost per day for the campaign and pays only when an ad is clicked. SEM companies have embarked on AdWords projects as a way to publicize their SEM and SEO services. One of the most successful approaches to this strategy was to ensure that PPC advertising funds were prudently invested. Moreover, SEM companies have described AdWords as a practical tool for increasing a consumer's return on investment in Internet advertising. Conversion tracking and Google Analytics were deemed practical tools for showing clients the performance of their campaigns from click to conversion. AdWords has enabled SEM companies to train their clients on the tool and to deliver better campaign performance. AdWords campaigns reportedly contributed to growth in web traffic for a number of consumer websites, by as much as 250% in only nine months.[30]
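The CPC pricing model above can be made concrete with a small worked example. The figures here are hypothetical; in practice AdWords runs an auction, so the effective cost per click varies and is usually below the maximum bid.

```python
# Hypothetical campaign settings, for illustration only.
daily_budget = 50.00  # advertiser-chosen maximum spend per day, in dollars
max_cpc = 0.50        # maximum bid the advertiser will pay for one click

# Under CPC pricing the advertiser pays only when the ad is clicked,
# so the daily budget caps how many clicks the campaign can buy per day.
max_clicks_per_day = int(daily_budget / max_cpc)
```

With these example numbers, the campaign stops serving ads after at most 100 paid clicks in a day; impressions that receive no clicks cost nothing.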
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
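The distinction above, that robots.txt is advisory rather than enforced, can be illustrated with Python's standard urllib.robotparser: a compliant crawler consults the file before fetching a URL, but nothing in the protocol stops a client from simply requesting the page anyway. The file contents, user-agent name, and URLs below are hypothetical.

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt asking all crawlers to stay out of /private/.
robots_txt = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved crawler checks before fetching and honors the answer:
rp.can_fetch("GoodBot", "https://example.com/private/report.html")  # False
rp.can_fetch("GoodBot", "https://example.com/index.html")           # True

# A non-compliant client never performs this check; if it requests
# /private/report.html directly, the server will still serve the page.
```

This is why the paragraph above recommends real access controls (authentication, server-side restrictions) rather than robots.txt for anything sensitive.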

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
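The spider/indexer split described above can be sketched with Python's standard html.parser: one pass over a downloaded page extracts the outbound links (which feed the crawl scheduler) and the words with their positions (the kind of record an indexer stores). The page content and link target are hypothetical, and real indexers do far more, such as weighting words by placement.

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Collects outbound links and visible words from one HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []  # hrefs to hand to the crawl scheduler
        self.words = []  # visible words, in document order

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

# Hypothetical page a spider has downloaded.
html_doc = '<p>search engine basics</p><a href="/next">next page</a>'
idx = PageIndexer()
idx.feed(html_doc)

# Build a word -> positions map, the kind of entry an indexer records.
positions = {}
for i, word in enumerate(idx.words):
    positions.setdefault(word.lower(), []).append(i)
```

Here `idx.links` holds `"/next"` for later crawling, while `positions` records where each word occurs, which is what lets the engine later match and rank queries against the page.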