
Website optimization - Historical development of SEO


Search engine optimization of websites is a term that emerged sometime in the mid-1990s. Webmasters began optimizing websites in parallel with the first search engines, which were cataloguing the early web. In the beginning, all a webmaster needed to do was submit the address of a page, its URL, to the various search engines, which would then send a "spider" to "crawl" the page, extract the links pointing to other pages, and collect whatever information it found on the page for indexing. In this process the spider downloads the page and stores it on the search engine's own server, while another program, known as the indexer, extracts information about the page, such as the words it contains, their locations, and any weight attached to particular words. All of the links found on the page are also extracted and scheduled for a later crawl.
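A minimal sketch of that crawl-and-index loop is shown below. The seed URL, the word-splitting rule and the storage structures are illustrative assumptions, not a description of any real engine:

```python
import re
import urllib.request
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags while a page is parsed."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_url, max_pages=10):
    """Toy spider: fetch a page, index its words and positions, queue its links."""
    index = {}                     # word -> list of (url, position) pairs
    frontier = deque([seed_url])   # links planned for a later crawl
    seen = set()

    while frontier and len(seen) < max_pages:
        url = frontier.popleft()
        if url in seen:
            continue
        seen.add(url)
        try:
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue

        # Indexer step: record each word and where it occurs on the page.
        words = re.findall(r"[a-z0-9]+", html.lower())
        for position, word in enumerate(words):
            index.setdefault(word, []).append((url, position))

        # Extract the page's links and schedule them for a later crawl.
        parser = LinkExtractor()
        parser.feed(html)
        frontier.extend(urljoin(url, link) for link in parser.links)

    return index

if __name__ == "__main__":
    idx = crawl("https://example.com/")
    print(len(idx), "distinct words indexed")
```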

Owners of websites began to recognize the value of having a presence that is visible and highly positioned in search engine result lists, which created work for practitioners of both so-called "white hat" and "black hat" SEO methods. According to one of the leading SEO analysts, the person behind the "Search Engine Land" project, the term "search engine optimization" probably came into use around 1997. The first documented use of the term is attributed to John Audette and his company Multimedia Marketing Group, as recorded on the MMG website in August 1997. The first website registered with the American agency for the protection of rights that included the term "search engine optimization" is Bruce Clay's site, dating from March 1997.

Early versions of search algorithms relied heavily on information supplied by webmasters, such as keywords in meta tags, or on index files in systems like ALIWEB. Meta tags provide data about the content of each page. Indexing pages on the basis of meta data soon proved very unreliable, because the webmaster's choice of keywords in the meta tags could be manipulated, and the tags stored on a page do not necessarily represent its actual content. Wrong, incomplete and inconsistent data in meta tags allowed pages to rank for searches that had nothing to do with their content. Webmasters also manipulated a number of attributes in the HTML source of a page solely in order to rank better in search engines.

Early search engines were abused and manipulated because they depended too heavily on factors such as keyword density, which was controlled exclusively by webmasters. To deliver results that are as accurate as possible, a search engine has to show only the pages most relevant to the query, not pages stuffed with large numbers of keywords that have nothing to do with the search. Since the success and popularity of a search engine depends on its ability to display accurate results for any query, showing wrong or irrelevant results would drive visitors away to other sources of information. Search engines responded to this challenge by creating more complex ranking algorithms that included additional factors which webmasters could not easily manipulate.
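Keyword density is simply the share of a page's words taken up by a given keyword, which is exactly why it was so easy for webmasters to inflate. A small illustration, with a made-up sample of keyword-stuffed text:

```python
import re

def keyword_density(text, keyword):
    """Return occurrences of `keyword` as a fraction of all words in `text`."""
    words = re.findall(r"[a-z0-9]+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for word in words if word == keyword.lower())
    return hits / len(words)

page = "cheap flights cheap hotels cheap flights book cheap flights today"
print(f"{keyword_density(page, 'cheap'):.0%}")   # 40% - a density no natural page would have
```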

Stanford University graduate students Larry Page and Sergey Brin developed Backrub, a search engine that relied on a mathematical algorithm to rate web pages. The number computed by the algorithm, PageRank, is a function of the quantity and strength of incoming links. PageRank estimates the probability that a given page will be reached by a user who browses the internet at random, following links from one page to another. In effect, some links are stronger than others, because a page with a higher PageRank has a greater chance of being visited by such a random user.
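In its simplest form, PageRank can be computed by repeatedly spreading each page's score across its outgoing links. The sketch below uses a tiny made-up four-page link graph and the commonly cited damping factor of 0.85; both are illustrative assumptions:

```python
def pagerank(graph, damping=0.85, iterations=50):
    """Power-iteration PageRank over a dict of page -> list of outgoing links."""
    pages = list(graph)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}

    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in graph.items():
            if not outlinks:                 # dangling page: share its rank with everyone
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical four-page link graph.
graph = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Page C ends up with the highest score because the most rank flows into it, which is the intuition behind "some links are stronger than others."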

Page and Brin founded Google in 1998. Google quickly gained a loyal following among the growing internet population, thanks in part to the simplicity of its design. Besides on-page factors (such as keyword density, meta tags, headings, titles, links and site structure), Google also weighed off-page factors (such as PageRank and hyperlink analysis) in order to avoid the kind of manipulation that other search engines suffered from. Although PageRank was harder to game, webmasters had already developed link-building tools and schemes to influence the Inktomi search engine, and these proved just as effective at manipulating PageRank. Some of these schemes, such as link farms, involved the creation of thousands of websites serving solely for link spam.

By 2004, search engines had introduced a wide array of undisclosed factors into their ranking algorithms in order to reduce the impact of link manipulation. Google says it uses over 200 signals when ranking a page or a website. The other leading search engines, Bing and Yahoo, likewise do not reveal their algorithms. Since then, some of the most prominent SEO specialists have studied different approaches to optimization and published their findings on forums and blogs. The term "Google optimization" has become a synonym for SEO as a whole.

In 2005, Google began personalizing search for each user, showing logged-in users results shaped by their earlier searches. In 2008, Bruce Clay declared that "ranking is dead" because of personalized search: a website's rank becomes meaningless when, in the end, the positioning is different for every user and every query.

In 2007, Google announced a campaign against paid links that pass PageRank. On 15 June 2009, Google announced that it had taken measures to soften the effects of PageRank manipulation through the use of the "nofollow" attribute on links. Matt Cutts, a well-known software engineer at Google, announced that Googlebot would no longer treat nofollowed links in the same way, in order to prevent SEO specialists from using nofollow for PageRank sculpting. As a result of this change, the PageRank that would otherwise flow through nofollowed links simply evaporates. To get around this, SEO engineers developed methods that replace nofollow tags with masked (so-called obfuscated) JavaScript code, enabling them to keep influencing PageRank.
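The nofollow mechanism itself is just a rel attribute on a link; a crawler that respects it leaves such links out of the link graph that feeds PageRank. A minimal sketch of that filtering step, using a made-up HTML sample:

```python
from html.parser import HTMLParser

class FollowedLinkExtractor(HTMLParser):
    """Collects hrefs from <a> tags, skipping any link marked rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.followed = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        attrs = dict(attrs)
        rel = (attrs.get("rel") or "").lower().split()
        if "nofollow" not in rel and attrs.get("href"):
            self.followed.append(attrs["href"])

html = """
<a href="/about">About us</a>
<a href="https://example.com/ad" rel="nofollow sponsored">Advertiser</a>
"""
parser = FollowedLinkExtractor()
parser.feed(html)
print(parser.followed)   # only ['/about'] would pass PageRank in this model
```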

In December 2009, Google announced that it would begin using the search history of all users to shape search result lists.
Real-time search was introduced at the end of 2009 in an attempt to make search results more relevant and timely. Although website administrators had spent months, sometimes even years, optimizing a site for better ranking, the enormous growth in the popularity of social networks pushed search engines to change their algorithms so that fresh content is surfaced as quickly as possible.

