Search engine optimization
SEO redirects here. For other uses, see SEO.
Search engine optimization (SEO) is the process of affecting the visibility of a website or a web page in a web search engine’s unpaid results, often referred to as natural, organic, or earned results. In general, the earlier (and more frequently) a site appears in the search results list, the more visitors it will receive from the search engine’s users, and these visitors can be converted into customers. SEO may target different kinds of search, including image search, local search, video search, academic search, news search, and industry-specific vertical search engines.
As an Internet marketing strategy, SEO considers how search engines work, what people search for, the actual search terms or keywords typed into search engines, and which search engines are preferred.
Webmasters and content providers began optimizing sites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters needed to do was submit the address of a page, or URL, to the various engines, which would send a spider to crawl that page, extract links to other pages from it, and return information found on the page to be indexed. The process involves a search engine spider downloading a page and storing it on the search engine’s own server, where a second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains, which are then placed into a scheduler for crawling at a later date.
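The indexing process described above can be sketched in a few lines of Python. This is a minimal illustration, not any real engine’s implementation: the class name, the word-position index, and the sample page are all assumptions made for the example.

```python
from html.parser import HTMLParser

# Hypothetical minimal indexer: records each word with its position in the
# page text and collects outbound links to hand to a crawl scheduler.
class PageIndexer(HTMLParser):
    def __init__(self):
        super().__init__()
        self.word_positions = {}   # word -> list of positions in the page text
        self.links = []            # outbound links for later crawling
        self._pos = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        for word in data.lower().split():
            self.word_positions.setdefault(word, []).append(self._pos)
            self._pos += 1

# A spider would download this HTML from the web; here it is hardcoded.
page = '<html><body><h1>SEO basics</h1><p>See <a href="/guide">the guide</a></p></body></html>'
indexer = PageIndexer()
indexer.feed(page)

# The extracted links go into a queue (the "scheduler") for a later crawl.
crawl_queue = list(indexer.links)
```

A real indexer would also store per-word weights (for example, boosting words in headings) and persist the index, but the flow of download, extract, and schedule is the same.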
Site owners started to recognize the value of having their sites highly ranked and visible in search engine results, creating an opportunity for both white hat and black hat SEO practitioners. According to industry analyst Danny Sullivan, the phrase "search engine optimization" probably came into use in 1997. Sullivan credits Bruce Clay as being one of the first people to popularize the term. On May 2, 2007, Jason Gambert attempted to trademark the term SEO by convincing the Trademark Office in Arizona that SEO is a process involving manipulation of keywords, and not a marketing service.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag, or index files in engines like ALIWEB. Meta tags provide a guide to each page’s content. Using metadata to index pages was found to be less than reliable, however, because the webmaster’s choice of keywords in the meta tag could potentially be an inaccurate representation of the site’s actual content.
By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engines, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Info
Relationship with Google
In 1998, graduate students at Stanford University, Larry Page and Sergey Brin, developed Backrub, a search engine that relied on a mathematical algorithm to rate the prominence of web pages. The number calculated by the algorithm, PageRank, is a function of the quantity and strength of inbound links. PageRank estimates the likelihood that a given page will be reached by a web user who randomly surfs the web and follows links from one page to another. In effect, this means that some links are stronger than others, as a higher-PageRank page is more likely to be reached by the random surfer.
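The random-surfer model can be sketched as a power iteration over a small link graph. This is a minimal illustration, not Google’s implementation: the damping factor d = 0.85 (the conventional probability that the surfer follows a link rather than jumping to a random page), the iteration count, and the three-page graph are assumptions made for the example.

```python
# Sketch of PageRank via power iteration on a tiny link graph.
def pagerank(graph, d=0.85, iterations=50):
    n = len(graph)
    rank = {page: 1.0 / n for page in graph}          # start uniform
    for _ in range(iterations):
        new_rank = {page: (1 - d) / n for page in graph}  # random-jump share
        for page, outlinks in graph.items():
            if outlinks:
                share = d * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share          # pass rank along links
            else:
                for target in graph:                   # dangling page: spread
                    new_rank[target] += d * rank[page] / n
        rank = new_rank
    return rank

# A and B both link to C, so C accumulates the most rank; A gains from
# C's single outbound link; B receives no links at all.
graph = {"A": ["C"], "B": ["C"], "C": ["A"]}
ranks = pagerank(graph)
```

Running this yields a higher score for C than for A, and a higher score for A than for B, matching the intuition that a page linked from a strong page is itself strong.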
Page and Brin founded Google in 1998. Google attracted a loyal following among the growing number of Internet users, who liked its simple design. Off-page factors were considered as well as on-page factors such as keyword frequency, meta tags, headings, links, and site structure to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. Two major directories, the Yahoo! Directory and DMOZ, both require manual submission and human editorial review. Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links, in addition to its URL submission console. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; this was discontinued in 2009.
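An XML Sitemap of the kind submitted through Search Console is a plain XML file following the sitemaps.org protocol. A minimal sketch might look like the following; the URLs, date, and change frequency are hypothetical values for illustration.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <url>
    <loc>https://www.example.com/products/widget</loc>
  </url>
</urlset>
```

Only `<loc>` is required per URL; `<lastmod>` and `<changefreq>` are optional hints that crawlers may use when scheduling revisits.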
Search engine crawlers may look at a number of different factors when crawling a site.
As a marketing strategy
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator’s goals. A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site’s conversion rate. In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public, which revealed a shift in its focus toward usefulness and mobile search.
SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors. Search engines can change their algorithms, impacting a website’s placement, possibly resulting in a serious loss of traffic. According to Google’s CEO, Eric Schmidt, in 2010 Google made over 500 algorithm changes – almost 1.5 per day. It is considered wise business practice for website operators to liberate themselves from dependence on search engine traffic.