Engagement in social media for the purpose of a social media strategy is divided into two parts. The first is proactive, regular posting of new online content: digital photos, digital videos, text, and conversations, as well as sharing content and information from others via weblinks. The second is reactive conversation with social media users, responding to those who reach out to your social media profiles through commenting or messaging.[22] Traditional media such as TV news shows are limited to one-way interaction with customers, or 'push and tell', where only specific information is given to the customer with few or limited mechanisms to obtain customer feedback. Traditional media such as physical newspapers do give readers the option of sending a letter to the editor, but this is a relatively slow process, as the editorial board has to review the letter and decide whether it is appropriate for publication. Social media, by contrast, is participative and open; participants are able to instantly share their views on brands, products, and services. Traditional media gave control of the message to the marketer, whereas social media shifts the balance toward the consumer or citizen.

Snapchat is a popular messaging and picture-exchanging application created in 2011 by three Stanford University students: Evan Spiegel, Bobby Murphy, and Reggie Brown. The application was first developed to let users message back and forth and send photographs that remain viewable for only 1–10 seconds before they disappear. The app was an instant hit with social media users, and today up to 158 million people use Snapchat every day.[60] It is also estimated that Snapchat users open the application approximately 18 times per day, spending about 25–30 minutes per day in the app.[60]
Since heading tags typically make the text they contain larger than normal text on the page, this is a visual cue to users that the text is important and can help them understand something about the type of content underneath the heading. Multiple heading sizes, used in order, create a hierarchical structure for your content, making it easier for users to navigate through your document.
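As a rough sketch of this idea (the page topic and wording are invented for illustration), headings used in order might be structured like this:

```html
<!-- One main heading describes the whole page -->
<h1>Coffee Brewing Basics</h1>
<p>Introductory text about the page topic...</p>

<!-- Second-level headings mark the major subtopics -->
<h2>Choosing a Grind Size</h2>
<p>Details about the first subtopic...</p>

<h2>Water Temperature</h2>
<p>Details about the second subtopic...</p>

<!-- A third-level heading nests a finer point under the subtopic above -->
<h3>Pour-Over vs. French Press</h3>
<p>A more specific point within the second subtopic...</p>
```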
Write a description that would both inform and interest users if they saw your description meta tag as a snippet in a search result. While there's no minimum or maximum length for the text in a description meta tag, we recommend making sure that it's long enough to be fully shown in Search (note that users may see different sized snippets depending on how and where they search), and that it contains all the relevant information users would need to determine whether the page will be useful and relevant to them.
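For illustration only (the business name and copy are invented), the description meta tag lives in the page's head section alongside the title:

```html
<head>
  <!-- The title is often shown as the search result headline -->
  <title>Example Bakery – Fresh Bread and Pastries in Springfield</title>
  <!-- Search engines may show this description text as the result snippet -->
  <meta name="description"
        content="Example Bakery bakes sourdough, baguettes, and pastries daily.
                 Order online for same-day pickup or browse our seasonal menu and opening hours.">
</head>
```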
Expertise and authoritativeness of a site increase its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, providing expert or experienced sources can help users understand an article's expertise. Representing well-established consensus on pages about scientific topics is a good practice, if such consensus exists.
Yelp consists of a comprehensive online index of business profiles. Businesses are searchable by location, similar to Yellow Pages. The website is operational in seven different countries, including the United States and Canada. Business account holders are allowed to create, share, and edit business profiles. They may post information such as the business location, contact information, pictures, and service information. The website further allows individuals to write and post reviews about businesses and rate them on a five-point scale. Messaging and talk features are also available to general members of the website, serving to guide thoughts and opinions.[49]
Search engine optimization consultants expanded their offerings to help businesses learn about and use the advertising opportunities offered by search engines, and new agencies focusing primarily upon marketing and advertising through search engines emerged. The term "search engine marketing" was popularized by Danny Sullivan in 2001[12] to cover the spectrum of activities involved in performing SEO, managing paid listings at the search engines, submitting sites to directories, and developing online marketing strategies for businesses, organizations, and individuals.
In addition to giving you insight into the search volume and competition level of keywords, most keyword research tools will also give you detailed information about the average or current estimated CPC for particular keywords. This is particularly important for businesses with smaller ad budgets, as it allows you to predict whether certain keywords will be truly beneficial to your ad campaigns or whether they will cost too much.
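As a worked example with made-up numbers: if a tool reports an estimated CPC of $2.50 for a keyword and you expect roughly 400 clicks per month, that keyword alone would consume about $1,000 of a monthly budget; at a 2% conversion rate, that is about 8 conversions, or roughly $125 per conversion, which may or may not be sustainable depending on what a conversion is worth to the business.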

Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
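As an illustrative sketch (the directory paths are invented), a robots.txt file like the following would block Googlebot from the CSS, JavaScript, and images a page needs to render, which can prevent the page from being recognized as mobile-friendly; removing or narrowing these Disallow rules keeps those resources crawlable:

```
# Hypothetical robots.txt that blocks the resources Googlebot needs to render pages
User-agent: Googlebot
Disallow: /assets/css/
Disallow: /assets/js/
Disallow: /images/
```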


On Google+ you can upload and share photos, videos, and links, and view all your +1s. You can also take advantage of Google+ circles, which allow you to segment your followers into smaller groups, enabling you to share information with some followers while barring others. For example, you might try creating a "super-fan" circle and share special discounts and exclusive offers only with that group.
Websites such as Delicious, Digg, Slashdot, Diigo, Stumbleupon, and Reddit are popular social bookmarking sites used in social media promotion. Each of these sites is dedicated to the collection, curation, and organization of links to other websites that users deem to be of good quality. This process is "crowdsourced", allowing amateur social media network members to sort and prioritize links by relevance and general category. Due to the large user bases of these websites, any link from one of them to a smaller website may generate a flash crowd, a sudden surge of interest in the target website. In addition to user-generated promotion, these sites also offer advertisements within individual user communities and categories.[62] Because ads can be placed in designated communities with a very specific target audience and demographic, they have far greater potential for traffic generation than ads selected simply through cookie and browser history.[63] Additionally, some of these websites have also implemented measures to make ads more relevant to users by allowing users to vote on which ones will be shown on pages they frequent.[64] The ability to redirect large volumes of web traffic and target specific, relevant audiences makes social bookmarking sites a valuable asset for social media marketers.
Back end tools, including web analytic tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files to more sophisticated tools based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) a log file analyzing tool, WebTrends by NetiQ; (b) a tag-based analytic tool, WebSideStory's Hitbox; and (c) a transaction-based tool, TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator because each one tests, highlights, and reports on slightly different aspects of your website.
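A minimal sketch of the page-tagging approach described above is shown below; the collection endpoint and parameters are invented for illustration and are not the actual API of WebTrends, Hitbox, or TeaLeaf:

```html
<!-- Page-tagging sketch: a small script reports the page view to a
     hypothetical analytics endpoint; an image beacon serves as a fallback. -->
<script>
  (function () {
    // Collect basic page-view data (path and referrer)
    var data = 'page=' + encodeURIComponent(location.pathname) +
               '&ref=' + encodeURIComponent(document.referrer);
    // Fire-and-forget GET request to the (invented) collection endpoint
    new Image().src = 'https://analytics.example.com/collect?' + data;
  })();
</script>
<noscript>
  <!-- Image-only fallback when JavaScript is disabled -->
  <img src="https://analytics.example.com/collect?page=unknown" alt="" width="1" height="1">
</noscript>
```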
In February 2011, Google announced the Panda update, which penalizes websites containing content duplicated from other websites and sources. Historically websites had copied content from one another and benefited in search engine rankings by engaging in this practice; however, Google implemented a new system which punishes sites whose content is not unique.[36] The 2012 Google Penguin update attempted to penalize websites that used manipulative techniques to improve their rankings on the search engine.[37] Although Google Penguin has been presented as an algorithm aimed at fighting web spam, it really focuses on spammy links[38] by gauging the quality of the sites the links are coming from. The 2013 Google Hummingbird update featured an algorithm change designed to improve Google's natural language processing and semantic understanding of web pages. Hummingbird's language processing system falls under the newly recognized term of 'conversational search', in which the system pays more attention to each word in the query in order to match pages to the meaning of the whole query rather than to a few words.[39] With regard to search engine optimization, for content publishers and writers Hummingbird is intended to resolve these issues by getting rid of irrelevant content and spam, allowing Google to surface high-quality content and treat its creators as 'trusted' authors.