Why might Googlebot get errors when trying to access my robots.txt file?
Google Webmaster Tools reports that Googlebot is unable to fetch my robots.txt about 50% of the time, even though I can fetch it with a 100% success rate from various other hosts. (The site runs on a plain nginx server on an mit.edu host.)
Yang, Palo Alto, CA
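Intermittent failures like this are usually easier to pin down with a repeated probe than with a single fetch. Here is a minimal sketch (the URL and attempt count are placeholders, not from the question) that fetches robots.txt many times using Googlebot's user-agent string and tallies the outcomes:

```python
# Probe robots.txt repeatedly with Googlebot's user-agent string and
# tally successes vs. failures. URL and ATTEMPTS are placeholders.
import urllib.request
import urllib.error
from collections import Counter

URL = "http://example.com/robots.txt"  # replace with your own site
ATTEMPTS = 50
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

results = Counter()
for _ in range(ATTEMPTS):
    req = urllib.request.Request(URL, headers={"User-Agent": UA})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            results[resp.status] += 1
    except urllib.error.HTTPError as e:
        results[e.code] += 1            # server answered, but with an error status
    except (urllib.error.URLError, OSError) as e:
        results[type(e).__name__] += 1  # timeout, DNS failure, connection reset, etc.

for outcome, count in results.most_common():
    print(f"{outcome}: {count}/{ATTEMPTS}")
```

Note that this only tests the network path from wherever you run it; Googlebot fetches from Google's own network, so a clean result here combined with failures in Webmaster Tools suggests something path- or rate-dependent (firewall rules, rate limiting, a flaky upstream hop).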
Fetch as Google:
http://support.google.com/webmasters/bin/answer.py?answer=158587
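Alongside Fetch as Google, it helps to check the server's own record of Googlebot's attempts. A small sketch, assuming nginx's default "combined" log format and a log path of /var/log/nginx/access.log (both assumptions), that summarizes the status codes returned for Googlebot's robots.txt requests:

```python
# Summarize status codes nginx returned for Googlebot's robots.txt fetches.
# Assumes the default "combined" log format; the log path is a placeholder.
import re
from collections import Counter

LOG_PATH = "/var/log/nginx/access.log"
# combined format: ip - user [time] "request" status bytes "referer" "user-agent"
LINE_RE = re.compile(r'"(?:GET|HEAD) /robots\.txt[^"]*" (\d{3}) .*Googlebot')

statuses = Counter()
with open(LOG_PATH) as log:
    for line in log:
        match = LINE_RE.search(line)
        if match:
            statuses[match.group(1)] += 1

for status, count in statuses.most_common():
    print(f"HTTP {status}: {count}")
```

If fetches that Webmaster Tools reports as failures never show up in the access log (or error log) at all, the requests are likely being dropped before nginx ever sees them, for example by a firewall or another device upstream of the host.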