Yelp consists of a comprehensive online index of business profiles. Businesses are searchable by location, similar to the Yellow Pages. The website operates in seven countries, including the United States and Canada. Business account holders can create, share, and edit business profiles, posting information such as the business's location, contact information, pictures, and service details. The website also allows individuals to write and post reviews of businesses and rate them on a five-point scale. Messaging and talk features are also available to general members of the website, helping members share thoughts and opinions.[49]
Social media itself is a catch-all term for sites that may provide radically different social actions. For instance, Twitter is a social site designed to let people share short messages or “updates” with others. Facebook, in contrast, is a full-blown social networking site that supports sharing updates and photos, joining events, and a variety of other activities.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and not want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others about the site, so you include a link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
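As a sketch, this policy can be automated server-side by tagging user-submitted links before they are published. The hypothetical helper below (the function name and sample comment are invented for illustration) assumes comments arrive as HTML strings and adds rel="nofollow" to any anchor tag that lacks a rel attribute; a production system should use a real HTML parser rather than a regex:

```python
import re

def add_nofollow(html: str) -> str:
    """Add rel="nofollow" to every <a> tag that lacks a rel attribute.

    Simplified sketch: a regex misses edge cases (odd quoting, existing
    rel values to merge), so real comment systems should parse the HTML.
    """
    # Match an opening <a> tag only when no rel= appears before its '>'.
    return re.sub(r'<a\s+(?![^>]*\brel=)', '<a rel="nofollow" ', html)

# Hypothetical user comment containing a link you don't want to endorse.
comment = '<p>Visit <a href="http://spam.example.com">my site</a>!</p>'
print(add_nofollow(comment))
# → <p>Visit <a rel="nofollow" href="http://spam.example.com">my site</a>!</p>
```

With the attribute in place, search engines that honor nofollow will not count the link toward the target site's reputation.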

The code of ethics that applies to traditional marketing can also be applied to social media. However, because social media is so personal and international, it brings another list of complications and challenges for behaving ethically online. With the advent of social media, marketers no longer have to focus solely on the basic demographics and psychographics gathered from television and magazines; they can now see what consumers like to hear from advertisers, how they engage online, and what their needs and wants are.[101] The general concept of being ethical while marketing on social network sites is to be honest about the intentions of the campaign, avoid false advertising, be aware of user privacy conditions (which means not using consumers' private information for gain), respect the dignity of persons in the shared online community, and claim responsibility for any mistakes or mishaps that result from your marketing campaign.[102] Most social network marketers use websites like Facebook and MySpace to try to drive traffic to another website.[103] While it is ethical to use social networking websites to spread a message to people who are genuinely interested, many people game the system with auto-friend-adding programs and spam messages and bulletins. Social networking websites are becoming wise to these practices, however, and are effectively weeding out and banning offenders.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites copied content from one another and benefited in search engine rankings by doing so. However, Google implemented a new system that punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of "conversational search", where the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to the changes made to search engine optimization, for content publishers and writers Hummingbird is intended to resolve issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content from "trusted" authors.
Traditional advertising techniques include print and television advertising. The Internet has already overtaken television as the largest advertising market.[90] Websites often include banner or pop-up ads. Social networking sites don't always have ads; in exchange, products have entire pages and are able to interact with users. Television commercials often end with a spokesperson asking viewers to check out the product website for more information. For a brief period, print ads included QR codes, which can be scanned by cell phones and computers to send viewers to the product website. Advertising is beginning to move viewers from the traditional outlets to the electronic ones.[citation needed]

Search engines reward you when other sites link to yours – they take those links as a signal that your site must be valuable, and you'll rank higher in search results. And the higher the “rank” of the sites that link to you, the more those links count toward your own ranking. You want links from popular industry authorities, recognized directories, and reputable companies and organizations.

The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).

In 1998, two graduate students at Stanford University, Larry Page and Sergey Brin, developed "Backrub", a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links.[22] PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random web surfer.
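The random-surfer model above can be sketched in a few lines of Python. This is an illustrative power-iteration version with an invented three-page graph and a conventional 0.85 damping factor, not Google's production algorithm:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Compute PageRank by power iteration.

    links maps each page to the list of pages it links out to.
    damping is the probability the surfer follows a link rather
    than jumping to a random page.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}               # start uniform
    for _ in range(iterations):
        # Base probability of arriving by a random jump.
        new = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:                          # dangling page:
                for p in pages:                       # spread its rank evenly
                    new[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new[target] += share
        rank = new
    return rank

# Invented example graph: a links to b and c, b links to c, c links to a.
graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(graph)
# "c" ends up highest: it is linked to by both "a" and "b".
```

Note how a page's rank is divided among its outgoing links: a link from a high-PageRank page with few outlinks passes more weight than a link from a low-ranked or link-heavy page, which is exactly why "some links are stronger than others".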