Sharing Curated Links — While using social media for marketing is a great way to leverage your own unique, original content to gain followers, fans, and devotees, it is also an opportunity to link to outside articles. If other sources provide valuable information you think your target audience will enjoy, don't be shy about linking to them. Curating and linking to outside sources builds trust and credibility, and you may even get some links in return.
Imagine that you've created the definitive Web site on a subject; we'll use skydiving as an example. Your site is so new that it's not yet listed on any search engine results pages (SERPs), so your first step is to submit your site to search engines like Google and Yahoo. The pages on your skydiving site include useful information, exciting photographs and helpful links guiding visitors to other resources. Yet even with the best skydiving information on the Web, your site may not crack the top page of results on major search engines. When people search for the term "skydiving," they could end up on inferior sites because yours isn't in the top results.
Blogging website Tumblr first launched ad products on May 29, 2012. Rather than relying on simple banner ads, Tumblr requires advertisers to create a Tumblr blog so the content of those blogs can be featured on the site. Within a year, Tumblr had introduced four native ad formats across web and mobile and had more than 100 brands advertising on the platform, with 500 cumulative sponsored posts.
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically, websites had copied content from one another and benefited in search engine rankings from the practice; Panda introduced a system that punishes sites whose content is not unique. The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings. Although Penguin has been presented as an algorithm aimed at fighting web spam generally, it focuses on spammy links by gauging the quality of the sites those links come from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing falls under the newly recognized term "conversational search," in which the system pays attention to every word in the query, matching pages to the meaning of the whole query rather than to a few isolated words. For content publishers and writers, Hummingbird is intended to resolve these issues by filtering out irrelevant content and spam, allowing Google to surface high-quality content from authors it treats as "trusted."
Twitter allows companies to promote their products in short messages, known as tweets, limited to 140 characters, which appear on followers' Home timelines. Tweets can contain text, hashtags, photos, videos, animated GIFs, emoji, or links to the product's website and other social media profiles. Twitter is also used by companies to provide customer service; some make support available 24/7 and answer promptly, improving brand loyalty and appreciation.
Social media itself is a catch-all term for sites that may provide radically different social actions. For instance, Twitter is a social site designed to let people share short messages or "updates" with others. Facebook, in contrast, is a full-blown social networking site that allows for sharing updates and photos, joining events, and a variety of other activities.
Paid search advertising has not been without controversy, and the question of how search engines present advertising on their result pages has been the target of a series of studies and reports by Consumer Reports WebWatch. The Federal Trade Commission (FTC) also issued a letter in 2002 about the importance of disclosing paid advertising on search engines, in response to a complaint from Commercial Alert, a consumer advocacy group with ties to Ralph Nader.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages proved unreliable, however, because the webmaster's choice of keywords in the meta tag could be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches. Web content providers also manipulated other attributes within the HTML source of a page in an attempt to rank well. By 1997, search engine designers recognized that webmasters were making efforts to rank well in their engines, and that some were manipulating rankings by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.
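To make the weakness concrete, here is a minimal sketch (not any real engine's code) of meta-keyword extraction using Python's standard-library HTML parser. The sample page is hypothetical; the point is that the keywords tag is whatever the webmaster claims, regardless of what the page body actually contains:

```python
from html.parser import HTMLParser


class MetaKeywordParser(HTMLParser):
    """Collects the contents of <meta name="keywords"> tags —
    the webmaster-supplied signal early engines indexed on."""

    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "keywords":
            content = attrs.get("content") or ""
            self.keywords.extend(
                kw.strip() for kw in content.split(",") if kw.strip()
            )


def extract_keywords(html: str) -> list:
    parser = MetaKeywordParser()
    parser.feed(html)
    return parser.keywords


# The tag claims one topic while the body is about another —
# an indexer trusting only the tag would rank this page for "skydiving".
page = """<html><head>
<meta name="keywords" content="skydiving, parachute, tandem jump">
<title>Skydiving 101</title>
</head><body>An article entirely about knitting.</body></html>"""

print(extract_keywords(page))
```

An indexer built on this signal alone has no defense against the mismatch shown above, which is why engines moved toward analyzing the page content itself.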