What should the robots.txt file say?
That depends on what you want it to do.
Most people want robots to visit everything on their website. If that is the case for you, and you want robots to index all parts of your site, there are three ways to let them know they are welcome.
1) Do not have a robots.txt file
If your website does not have a robots.txt file, then this is what happens -
A robot comes to visit. It looks for the robots.txt file. It does not find it because it isn't there. The robot then feels free to visit all your web pages and content, because this is what it is programmed to do in this situation.
2) Make an empty file and call it robots.txt
If your website has a robots.txt file that has nothing in it, then this is what happens -
A robot comes to visit. It looks for the robots.txt file. It finds the file and reads it. There is nothing to read, so the robot feels free to visit all your web pages and content, because this is what it is programmed to do in this situation.
3) Make a file called robots.txt and write the following two lines in it... (these are "instructions" for the robot to follow)
User-agent: *
Disallow:
If your website has a robots.txt file with these instructions in it, then this is what happens -
A robot comes to visit. It looks for the robots.txt file. It finds the file and reads it. It reads the first line, then it reads the second line. The robot then feels free to visit all your web pages and content, because this is what you told it to do.
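If you are curious how a well-behaved robot actually interprets these files, Python's standard library includes a robots.txt parser. Here is a minimal sketch using urllib.robotparser to confirm that the two-line "welcome" file from option 3 permits everything (the robot name "MyBot" and the paths are made-up examples):

from urllib.robotparser import RobotFileParser

# The "allow everything" file from option 3 above.
rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(rules)

# An empty Disallow blocks nothing, so every path is fetchable.
print(parser.can_fetch("MyBot", "/"))               # True
print(parser.can_fetch("MyBot", "/any/page.html"))  # True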
What do the robot instructions mean?
Here is an explanation of what the different words mean in a robots.txt file.
User-agent:
The "User-agent" part is there to specify directions to a specific robot if needed. There are two ways to use this in your file.
If you want to tell all robots the same thing, you put a "*" after "User-agent:". It would look like this...
User-agent: * (This line is saying "these directions apply to all robots")
If you want to tell a specific robot something (in this example, Googlebot), it would look like this...
User-agent: Googlebot (this line is saying "these directions apply to just Googlebot")
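To see how these two kinds of "User-agent" lines work together, here is a short sketch (again using Python's urllib.robotparser; the "/private" folder is an invented example) where Googlebot gets its own directions and every other robot falls back to the "*" section:

from urllib.robotparser import RobotFileParser

# One section just for Googlebot, one for everyone else.
rules = [
    "User-agent: Googlebot",
    "Disallow: /private",
    "",
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "/private/page.html"))  # False
print(parser.can_fetch("OtherBot", "/private/page.html"))   # True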
Disallow:
The "Disallow" part is there to tell the robots what folders they should not look at.
This means that if, for example, you do not want search engines to index the photos on your site, you can place those photos into one folder and exclude it.
Let's say that you have put all these photos into a folder called "photos". Now you want to tell search engines not to index that folder.
Here is what your robots.txt file should look like:
User-agent: *
Disallow: /photos
The above two lines of text in your robots.txt file would keep robots from visiting your photos folder. The "User-agent: *" part is saying "this applies to all robots". The "Disallow: /photos" part is saying "don't visit or index my photos folder".
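You can check this behavior with the same standard-library parser. A minimal sketch (the file names are invented for illustration):

from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /photos",
]

parser = RobotFileParser()
parser.parse(rules)

# "Disallow: /photos" is a prefix match, so everything under /photos is blocked.
print(parser.can_fetch("MyBot", "/photos/mycar.jpg"))  # False
print(parser.can_fetch("MyBot", "/index.html"))        # True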
Googlebot specific instructions
The robot that Google uses to build its search index is called Googlebot. It understands a few more instructions than other robots. The instructions it follows are well defined in the Google help pages (see resources below).
In addition to "User-agent" and "Disallow", Googlebot also uses the...
Allow:
The "Allow:" instructions lets you tell a robot that it is okay to see a file in a folder that has been "Disallowed" by other instructions.
To illustrate this, let's take the above example of telling the robot not to visit or index your photos. We put all the photos into one folder called "photos" and we made a robots.txt file that looked like this...
User-agent: *
Disallow: /photos
Now let's say there is a photo called mycar.jpg in that folder that you want Googlebot to index. With the Allow: instruction, we can tell Googlebot to do so. It would look like this...
User-agent: *
Disallow: /photos
Allow: /photos/mycar.jpg
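If you want to test an "Allow:" rule locally, one caveat: Python's urllib.robotparser applies rules in the order they appear (the first matching rule wins), while Googlebot picks the most specific matching rule regardless of order. So in this sketch the Allow line comes first; the file names are the same invented examples as above:

from urllib.robotparser import RobotFileParser

# Allow is listed before Disallow because urllib.robotparser uses
# first-match semantics; Googlebot honors the more specific rule
# in either order.
rules = [
    "User-agent: *",
    "Allow: /photos/mycar.jpg",
    "Disallow: /photos",
]

parser = RobotFileParser()
parser.parse(rules)

print(parser.can_fetch("Googlebot", "/photos/mycar.jpg"))  # True
print(parser.can_fetch("Googlebot", "/photos/other.jpg"))  # False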