A few tips I put together while re-creating the robots.txt file on my Linux web server. The robots.txt file provides crawling instructions to web robots using the Robots Exclusion Protocol. When a web robot visits your site, it checks robots.txt to discover which directories or pages you want excluded from the robot's search engine listing.
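As a minimal sketch, here is what such a file might look like, assuming hypothetical /admin/ and /tmp/ directories you want to keep out of search results:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/

The "User-agent: *" line addresses all robots, and each "Disallow" line names a path they should not crawl. The file must sit at the root of the site, e.g. http://example.com/robots.txt, or robots will not find it.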
Read more » Robots.txt Tips For Dealing With Bots
http://beginlinux.com