Sunday, August 26, 2018

A Brief History of Search Engine Optimisation

Search engine optimisation began in the mid-1990s, as the first search engines started to catalogue the websites then available. Initially, all a webmaster needed to do was submit a site to the various search engines; the engine would crawl the site and rank it based on keywords supplied by the webmaster. This had an obvious weak spot: webmasters began to abuse the system, stuffing their meta tags with keywords in order to rank highly for search terms that had no relevance to the site at all.
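To make that concrete, here is a minimal sketch, in Python, of how an early engine might have read a page's meta keywords tag. The stuffed example page and the parsing approach are illustrative assumptions for this post, not any particular engine's real code.

```python
# A sketch of reading <meta name="keywords"> the way an early crawler
# might have, using only Python's standard library. The stuffed example
# below is hypothetical.
from html.parser import HTMLParser

class MetaKeywordsParser(HTMLParser):
    """Collect the content of any <meta name="keywords"> tags."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            self.keywords += [k.strip() for k in attrs.get("content", "").split(",")]

# A page about, say, gardening, stuffed with unrelated high-traffic terms:
page = '<meta name="keywords" content="gardening, mp3, free mp3, games, cheap flights">'
parser = MetaKeywordsParser()
parser.feed(page)
print(parser.keywords)  # ['gardening', 'mp3', 'free mp3', 'games', 'cheap flights']
```

Because the engine trusted whatever the tag contained, nothing stopped a gardening site from claiming every popular search term of the day.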

As the abuse of meta tags worsened, search engines abandoned them and instead developed more complex ranking algorithms, taking into account a more diverse set of factors, including:

* Text within the title tag
* Domain name
* URL directories and file names
* HTML tags: headings, bold and emphasised text
* Keyword density (see the sketch below)
* Keyword proximity
* Alt attributes for images
* Text within NOFRAMES tags
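Keyword density is the simplest of these to pin down: the number of times a term appears on a page divided by the total number of words. Here is a minimal sketch in Python; the tokenisation is an illustrative assumption, not any engine's actual formula.

```python
# A minimal sketch of computing keyword density: occurrences of a term
# divided by the total word count of the page.
import re

def keyword_density(page_text: str, keyword: str) -> float:
    """Return the fraction of words in page_text that equal keyword."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    if not words:
        return 0.0
    matches = sum(1 for word in words if word == keyword.lower())
    return matches / len(words)

sample = "Cheap flights! Cheap flights to anywhere. Book cheap flights today."
print(f"Density of 'cheap': {keyword_density(sample, 'cheap'):.0%}")  # 30%
```

A measure this crude rewarded repetition, which is exactly why densely repeated keywords became the next wave of abuse.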

In 1996, if you submitted your site to Yahoo and it was not completely awful, it could be crawled and listed within 72 hours. By then, however, people in the community had started trying to crack the basic search engine algorithms in order to take the top spots for whatever keywords they wished. Several SEOs managed to decode the 35 parameters in the Excite algorithm, and could therefore rank number one for absolutely any search term they chose.

By mid-1997, some of the main search engines had started to use Yahoo as a quality assurance check, so getting into Yahoo became paramount to a website's success. Unfortunately, due to the sheer volume of submissions, only around 5% of sites were processed.

Later that year, Infoseek introduced a daily refresh system and began a revolution for search engine optimisation: people could submit their site, be indexed, and be getting results by the end of the day. A webmaster could see the results of their optimisation work in something close to real time. Unfortunately, this also created a huge spam problem. HotBot and AltaVista were rendered nearly useless by the amount of spam in their results. Worse still, people copying top-ranked pages became a major problem, and many search engines were completely powerless against duplicate content. This was also the year the first cloaked pages began to appear on the net, causing the search engines even more frustration.
For people earning money from affiliate and referral programs this was a profitable time; it was not unheard of to make 5,000 referrals a day, and dedicated webmasters were starting to earn serious money.

In 1998 the search engines began to up the ante: they needed serious ways to reduce spam and to evolve their algorithms. Papers were written detailing new off-site ranking factors, including link popularity and directory listings. Search engines also started using multiple algorithms to generate results for different pages, meaning that someone who decoded the algorithm behind page 2 of the results still would not know the algorithm behind page 1. Search engine optimisation companies tried to counter these new measures by employing dedicated programmers to crack the algorithms.
