Search engine optimization is a relatively new term. It would have to be. Search engines have only been around for a little over twenty years. And the Internet that those first search engines were mapping was a much smaller and less complicated place. An individual could, and regularly did, have an impact on the Internet as a whole because there were so few people out there using it.
Search engine optimization as a concept didn’t exist at first because search engines themselves barely existed, and those that did used techniques so different from, and so much simpler than, modern-day search engines that the very idea of search engine optimization wouldn’t have made much sense. So today, let’s take a quick look at how we got from there to here and try to learn something about how modern search engine optimization evolved.
Who needs search bots when you have crawlers?
Early search engines didn’t use bots like your modern-day Google. Well, actually, they did. They were just a lot simpler, and they weren’t called search bots. They were called crawlers. These automated programs “crawled” the web and indexed it, and early search engines like WebCrawler.com, Excite.com, and Lycos.com were powered by them.
Yahoo! set itself apart by guaranteeing that its results were powered mainly by direct, human evaluation of web sites. Yahoo! results prioritized sites that had been visited by a person and verified as having good, relevant content. Back when search engine crawlers were simple enough to be fooled by basic HTML tricks like keyword-stuffed META tags, this was a crucial advantage. Yahoo! supplemented its personally reviewed search results with crawled sites when better data wasn’t available, but the majority of what came up in a Yahoo! search was verified, solid content. Yahoo! was able to deliver a more useful search of the Internet because it paid attention to what users wanted.
But in 1998, Google changed the game and the way that the entire Internet functioned by introducing a new way to evaluate the web…
Why PageRank rules the world
Google had a new idea. They knew that the Internet was expanding far too rapidly and would soon be far too large to be searched and indexed by real people. They also knew that crawlers were woefully inadequate when it came to determining which sites had real value and which were just stuffed full of keywords to get attention. So they came up with a new system of ranking sites called PageRank. Google’s ranking has evolved over the years, but it has always come down to two main things: content and backlinks, with PageRank measuring the backlink side of that equation. The theory is simple: if a site has quality content, then a lot of people are going to link to it. The best sites with the best content will be linked to by the most other sites, and links from highly ranked sites count for more than links from obscure ones.
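The core of that idea can be sketched as a simple iterative calculation over a link graph: each page repeatedly passes a share of its score to the pages it links to, so pages that attract links from well-linked pages rise to the top. The function, toy graph, and damping value below are illustrative assumptions for this sketch, not Google’s actual implementation.

```python
# A minimal PageRank-style sketch. Each page splits its score evenly
# among the pages it links to; a damping factor models a surfer who
# occasionally jumps to a random page instead of following links.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {page: 1.0 / n for page in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {page: (1 - damping) / n for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # A page with no outgoing links spreads its score evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Toy web of three pages: "a" is linked to by both of the others,
# so it ends up with the highest score.
web = {"a": ["b"], "b": ["a"], "c": ["a"]}
scores = pagerank(web)
print(max(scores, key=scores.get))
```

In this toy graph, "a" wins not because of anything on the page itself but purely because more of the web points at it, which is exactly the insight the article describes.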
This basic idea has evolved constantly over the past 17 years, but the core remains the same. Google has developed sophisticated bots that can judge the quality of content from signals like keyword placement and density almost as well as a human can, and many would argue that these automated bots already function better than the human-reviewed web of Yahoo! No one knows exactly how Google applies PageRank, but you can be sure that the key to success is consistently adding value to the web. Modern search engine optimization is based on making sure that your site meets the theoretical requirements for a high PageRank.
But not everyone has the time or the expertise to frequently and effectively add value to the web. Many business owners, small and large, are too busy with real-world concerns to pay attention to their digital identity on a regular basis. Luckily, there are plenty of trained professionals ready to make sure that your search engine optimization strategy stays in line with current trends and thinking.