Saturday, September 13, 2008
Create A Robots.txt File And Increase Your Search Engine Rankings
The robots.txt file is a simple text file used to tell search engine bots which pages on your web site should be crawled and indexed. Neil Patel wrote a post on the Link Building Blog explaining that he created a robots.txt file on his personal blog so he could keep junk pages and duplicate content out of the search engine index.
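As a rough illustration, a minimal robots.txt placed at the root of your site (for example, http://www.example.com/robots.txt) might look like the sketch below. The directory names here are made up for the example; you would replace them with the sections of your own site you want bots to skip:

    # Apply these rules to all crawlers
    User-agent: *
    # Block a hypothetical folder of junk pages
    Disallow: /junk/
    # Block a hypothetical folder of duplicate content
    Disallow: /duplicate/
    # Everything else remains open to crawling

Each "Disallow" line tells compliant search engine bots not to crawl URLs that start with that path, while leaving the rest of the site available for indexing.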