Linking SEO and PPC is an integral part of the SEM concept. When separate teams work on SEO and PPC without syncing their efforts, the benefits of aligning their strategies can be lost. Both SEO and PPC aim to maximize visibility in search, so the actions taken to achieve it should be centrally coordinated. Both teams can benefit from setting shared goals and combined metrics, evaluating data together to shape future strategy, and discussing which of the two tools works better to attract traffic for selected keywords in national and local search results. In this way, search visibility can be increased while both conversions and costs are optimized.[21]
Smartphone - In this document, "mobile" or "mobile devices" refers to smartphones, such as devices running Android, iPhone, or Windows Phone. Mobile browsers are similar to desktop browsers in that they can render a broad set of the HTML5 specification, although their screen size is smaller and in almost all cases their default orientation is vertical.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[61] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[62] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[63] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.
Instagram has proven itself a powerful platform for marketers to reach their customers and prospects through sharing pictures and brief messages. According to a study by Simply Measured, 71% of the world's largest brands now use Instagram as a marketing channel.[58] For companies, Instagram can be used as a tool to connect and communicate with current and potential customers. A company can present a more personal picture of its brand, and in doing so convey a better and truer image of itself. Instagram pictures are rooted in the on-the-go, a sense that the event is happening right now, which adds another layer to the personal and accurate picture of the company. In fact, Thomas Rankin, co-founder and CEO of the program Dash Hudson, stated that when he approves a blogger's Instagram post before it is posted on behalf of a brand his company represents, his only negative feedback is if it looks too posed. "It's not an editorial photo," he explained. "We're not trying to be a magazine. We're trying to create a moment."[57] Instagram also offers companies the opportunity to reflect a true picture of the brand from the perspective of their customers, for instance by encouraging user-generated content through hashtags.[59] Beyond the filter and hashtag functions, Instagram's 15-second videos and the recently added ability to send private messages between users have opened new opportunities for brands to connect with customers to a new extent, further promoting effective marketing on Instagram.
SEM is the wider discipline that incorporates SEO. SEM includes both paid search results (using tools like Google AdWords or Bing Ads, formerly known as Microsoft adCenter) and organic search results (SEO). SEM uses paid advertising with AdWords or Bing Ads, pay-per-click (particularly beneficial for local providers, as it enables potential consumers to contact a company directly with one click), article submissions, advertising, and making sure SEO has been done. A keyword analysis is performed for both SEO and SEM, but not necessarily at the same time. SEM and SEO both need to be monitored and updated frequently to reflect evolving best practices.
Description meta tags are important because Google might use them as snippets for your pages. Note that we say "might" because Google may choose to use a relevant section of your page's visible text if it does a good job of matching up with a user's query. Adding description meta tags to each of your pages is always a good practice in case Google cannot find a good selection of text to use in the snippet. The Webmaster Central Blog has informative posts on improving snippets with better description meta tags[18] and better snippets for your users.[19] We also have a handy Help Center article on how to create good titles and snippets.[20]
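As a sketch, a description meta tag is a single HTML element in the page's head; the site name and wording below are hypothetical:

```html
<head>
  <title>Brandon's Baseball Cards - Buy Cards, News</title>
  <!-- Hypothetical example: Google might use this text as the page's snippet -->
  <meta name="description" content="Brandon's Baseball Cards offers a large
    selection of vintage and modern baseball cards, plus card news and prices.">
</head>
```

A good description summarizes the page accurately in a sentence or two; each page should get its own, rather than one description duplicated across the site.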
Blogging website Tumblr first launched ad products on May 29, 2012.[69] Rather than relying on simple banner ads, Tumblr requires advertisers to create a Tumblr blog so the content of those blogs can be featured on the site.[70] Within one year, four native ad formats had been created on web and mobile, and more than 100 brands were advertising on Tumblr, with 500 cumulative sponsored posts.
Back-end tools, including web analytics tools and HTML validators, provide data on a website and its visitors and allow the success of a website to be measured. They range from simple traffic counters to tools that work with log files and to more sophisticated tools that are based on page tagging (putting JavaScript or an image on a page to track actions). These tools can deliver conversion-related information. There are three major tools used by EBSCO: (a) log file analyzing tool: WebTrends by NetiQ; (b) tag-based analytic tool: WebSideStory's Hitbox; and (c) transaction-based tool: TeaLeaf RealiTea. Validators check the invisible parts of websites, highlighting potential problems and many usability issues and ensuring websites meet W3C code standards. Try to use more than one HTML validator or spider simulator, because each one tests, highlights, and reports on slightly different aspects of your website.
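Page tagging, as mentioned above, typically means embedding a small script or a 1x1 tracking image in each page. A minimal sketch of the image-based variant, with a hypothetical analytics endpoint:

```html
<!-- Hypothetical tag-based tracking: the browser requests this tiny image
     from the analytics server, which logs the page view on its side.
     The domain and query parameters are illustrative, not a real service. -->
<img src="https://analytics.example.com/track.gif?page=/products&action=view"
     width="1" height="1" alt="">
```

Because the request is made by the visitor's browser rather than read from server logs, tag-based tools can capture client-side actions that log-file analysis misses.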
Organic search (SEO): When you enter a keyword or phrase into a search engine like Google or Yahoo!, the organic results are displayed in the main body of the page. When your prospects search for information about your products and services, you want to rank highly in search engine results. By “optimizing” your site, you can improve your ranking for important search terms and phrases (“keywords”). You can also improve your rank by getting other important sites to link to yours.
On October 17, 2002, SearchKing filed suit in the United States District Court, Western District of Oklahoma, against the search engine Google. SearchKing's claim was that Google's tactics to prevent spamdexing constituted a tortious interference with contractual relations. On May 27, 2003, the court granted Google's motion to dismiss the complaint because SearchKing "failed to state a claim upon which relief may be granted."[68][69]
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches.[64] In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007.[65] As of 2006, Google had an 85–90% market share in Germany.[66] While there were hundreds of SEO firms in the US at that time, there were only about five in Germany.[66] As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise.[67] Similar market shares are found in a number of other countries.

Social networking sites such as Facebook, Instagram, Twitter, and MySpace have all influenced the buzz of word-of-mouth marketing. In 1999, Misner said that word-of-mouth marketing is "the world's most effective, yet least understood marketing strategy" (Trusov, Bucklin, & Pauwels, 2009, p. 3).[79] The increased online "buzz" of word-of-mouth marketing that a product, service, or company experiences is due to the rise in the use of social media and smartphones, channeled through the influence of opinion leaders. Businesses and marketers have noticed that "a person's behaviour is influenced by many small groups" (Kotler, Burton, Deans, Brown, & Armstrong, 2013, p. 189). These small groups revolve around social networking accounts run by influential people (opinion leaders or "thought leaders") who have groups of followers. The types of groups (followers) are:[80] reference groups (people who know each other face-to-face or who have an indirect influence on a person's attitude or behaviour); membership groups (groups to which a person belongs and which have a direct influence on their attitude or behaviour); and aspirational groups (groups which an individual wishes to belong to).
When would this be useful? If your site has a blog with public commenting turned on, links within those comments could pass your reputation to pages that you may not be comfortable vouching for. Blog comment areas on pages are highly susceptible to comment spam. Nofollowing these user-added links ensures that you're not giving your page's hard-earned reputation to a spammy site.
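A sketch of what nofollowing a user-added comment link looks like in the page's HTML (the commenter's URL is hypothetical):

```html
<!-- The rel="nofollow" attribute tells search engines not to pass this
     page's reputation through the user-submitted link -->
<p>Great post! Check out my site:
  <a href="http://www.example-commenter-site.com/" rel="nofollow">my site</a>
</p>
```

Most blog platforms can add this attribute to comment links automatically, so it does not have to be done by hand for each comment.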
Keep resources crawlable. Blocking page resources can give Google an incomplete picture of your website. This often happens when your robots.txt file is blocking access to some or all of your page resources. If Googlebot doesn't have access to a page's resources, such as CSS, JavaScript, or images, we may not detect that it's built to display and work well on a mobile browser. In other words, we may not detect that the page is "mobile-friendly," and therefore not properly serve it to mobile searchers.
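For illustration, a robots.txt rule like the following would block Googlebot from a site's stylesheets and scripts and could cause exactly this problem; the directory names are hypothetical:

```
# Blocking page resources like this can prevent Googlebot from rendering
# the page and detecting that it is mobile-friendly -- avoid such rules
User-agent: Googlebot
Disallow: /css/
Disallow: /js/
```

Removing the Disallow lines for CSS, JavaScript, and image directories lets Googlebot render the page the way a mobile browser would.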
I would focus on building your personal brand online and building a website right away that you can use as a testing ground for learning digital marketing skills. Most digital marketing programs are too theoretical, and digital marketing is a fast-evolving industry, so you need to be constantly learning through active experimentation to become good enough to consistently make money.
While traditional media, like newspapers and television advertising, are largely overshadowed by the rise of social media marketing, there is still a place for traditional marketing. For example, newspaper readership has declined over the years, but the remaining readership is still fiercely loyal to print-only media: 51% of newspaper readers read the newspaper only in its print form,[91] making well-placed ads valuable.

Yelp consists of a comprehensive online index of business profiles. Businesses are searchable by location, similar to Yellow Pages. The website is operational in seven different countries, including the United States and Canada. Business account holders are allowed to create, share, and edit business profiles. They may post information such as the business location, contact information, pictures, and service information. The website further allows individuals to write and post reviews about businesses and to rate them on a five-point scale. Messaging and talk features are also available to general members of the website, serving to guide thoughts and opinions.[49]

Often the line between pay per click advertising and paid inclusion is debatable. Some have lobbied for any paid listings to be labeled as an advertisement, while defenders insist they are not actually ads, since the webmasters do not control the content of the listing, its ranking, or even whether it is shown to any users. Another advantage of paid inclusion is that it allows site owners to specify particular schedules for crawling pages. In the general case, one has no control over when a page will be crawled or added to a search engine index. Paid inclusion proves to be particularly useful for cases where pages are dynamically generated and frequently modified.

On Google+ you can upload and share photos, videos, links, and view all your +1s. Also take advantage of Google+ circles, which allow you to segment your followers into smaller groups, enabling you to share information with some followers while barring others. For example, you might try creating a “super-fan” circle, and share special discounts and exclusive offers only with that group.


Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
The fee structure is both a filter against superfluous submissions and a revenue generator. Typically, the fee covers an annual subscription for one webpage, which will automatically be catalogued on a regular basis. However, some companies are experimenting with non-subscription based fee structures where purchased listings are displayed permanently. A per-click fee may also apply. Each search engine is different. Some sites allow only paid inclusion, although these have had little success. More frequently, many search engines, like Yahoo!,[18] mix paid inclusion (per-page and per-click fee) with results from web crawling. Others, like Google (and as of 2006, Ask.com[19][20]), do not let webmasters pay to be in their search engine listing (advertisements are shown separately and labeled as such).

Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, webmasters needed only to submit the address of a page, or URL, to the various engines, which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
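The spider/indexer split described above can be sketched in a few lines of Python. This is a toy illustration using only the standard library, not any real engine's implementation: it takes an already-downloaded page, extracts the outgoing links (which a real engine would hand to its scheduler) and builds a word-to-positions index.

```python
from html.parser import HTMLParser
from collections import defaultdict

class PageIndexer(HTMLParser):
    """Toy indexer: collects outgoing links and visible words from one page."""
    def __init__(self):
        super().__init__()
        self.links = []   # URLs found in <a href="..."> tags
        self.words = []   # visible words, in document order

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.lower().split())

def index_page(html):
    """Return (outgoing links, inverted index of word -> positions)."""
    parser = PageIndexer()
    parser.feed(html)
    positions = defaultdict(list)
    for pos, word in enumerate(parser.words):
        positions[word].append(pos)  # word locations give positional weight
    return parser.links, dict(positions)

# Hypothetical fetched page for demonstration
links, index = index_page(
    '<html><body><p>search engines crawl pages</p>'
    '<a href="http://example.com/about">about</a></body></html>'
)
```

Here `links` holds the URLs the scheduler would queue for later crawling, and `index` maps each word to the positions where it occurs, mirroring the "words it contains, where they are located" step in the text.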
