Sunday, December 14, 2008

How Search Engines Work.

There are two types of search engines online: crawler based and human edited. Both serve the same basic purpose, but the way your site is indexed or accepted differs. This can get pretty technical very quickly, so I am going to spare you the details and give you the information that you need to get by.

Crawler Based
This is how the majority of search engines work nowadays. They have what is called a spider or bot that they send all over the web to find new and existing web pages so they can keep their records up to date. This process is automated to save time and get the most information back from each crawl. When a spider is crawling your website, it is looking for changes that you have made to keep your content fresh; this is sometimes referred to as the 'freshness factor'. Some of the important elements the spiders look for on your site that help determine where you are placed in the rankings are the title, content and ease of use. Every search engine runs a little differently, and Google is probably the most complex when it comes to crawling your website.
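To make the idea concrete, here is a minimal sketch of what a spider does when it visits a page: download it, note the title and a fingerprint of the content (so a later visit can tell whether anything changed), and collect the links it will follow next. The example URL and the fingerprint approach are illustrative assumptions, not how any particular search engine actually works.

```python
# Minimal sketch of a single crawl step: fetch a page, read its title,
# fingerprint the content to detect later changes, and gather links.
import hashlib
import urllib.request
from html.parser import HTMLParser

class PageParser(HTMLParser):
    """Collects the page title and any outgoing links."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.links = []
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def crawl(url):
    # Fetch the page and record a fingerprint of its content; on a later
    # visit, a changed fingerprint suggests the page has been updated
    # (a stand-in for the 'freshness factor' described above).
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    fingerprint = hashlib.sha256(html.encode()).hexdigest()
    parser = PageParser()
    parser.feed(html)
    return parser.title.strip(), fingerprint, parser.links

if __name__ == "__main__":
    title, fingerprint, links = crawl("https://example.com/")  # example URL only
    print("Title:", title)
    print("Content fingerprint:", fingerprint)
    print("Outgoing links found:", len(links))
```

A real spider repeats this step millions of times, feeding the links it finds back into a queue of pages to visit, but the basic loop of fetch, read, and follow is the same.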

Human Edited
A human edited directory is just what it sounds like. You submit your site to them along with other bits of information, including a title, description and a category; they manually review it and decide whether your site is worthy of being in their index. This process is time consuming, so getting listed in a human edited engine takes a lot longer and carries a higher likelihood of being denied inclusion. There are hundreds, possibly thousands of human edited search engines out there; one of the more popular, highly relevant ones is dmoz.org, the Open Directory Project.

Both crawler based and human edited search engines have one thing in common: they are on a quest to give their visitors the best, most relevant results for the keyword or phrase they are searching for. The way they crawl and the criteria they look for are ever changing.
