Why might Googlebot get errors when trying to access my robots.txt file?

Video Link: https://www.youtube.com/watch?v=XMyz7-D96vQ

I'm getting errors in Google Webmaster Tools reporting that the Googlebot crawler is unable to fetch my robots.txt file 50% of the time, even though I can fetch it with a 100% success rate from various other hosts. (This is on a plain nginx server and an mit.edu host.)
Yang, Palo Alto, CA
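
One way to reproduce the kind of intermittent failure described in the question is to poll the robots.txt file repeatedly while sending Googlebot's published user-agent string, and compare the success rate against fetches with a default user agent. The sketch below is illustrative only: the example.com URL is a placeholder for your own domain, and a script like this runs from your network rather than Google's, so it won't catch failures specific to Google's crawl infrastructure.

```python
import urllib.request
import urllib.error

# Googlebot's documented user-agent string (see http://www.google.com/bot.html)
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def fetch_once(url, user_agent=GOOGLEBOT_UA, timeout=10):
    """Return True if a single fetch of `url` returns HTTP 200."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.getcode() == 200
    except (urllib.error.URLError, OSError):
        return False

def success_rate(results):
    """Fraction of successful fetches, e.g. [True, False] -> 0.5."""
    return sum(results) / len(results) if results else 0.0

if __name__ == "__main__":
    # example.com is a placeholder; substitute the affected domain.
    results = [fetch_once("https://example.com/robots.txt")
               for _ in range(20)]
    print(f"robots.txt fetch success rate: {success_rate(results):.0%}")
```

If the script shows failures only with the Googlebot user agent, that points at user-agent-based filtering (firewall, rate limiter, or bot-blocking rules) rather than general server flakiness; if it never fails, the problem is more likely intermittent (load spikes, DNS, or network-path issues between Google and the server), which is where the Fetch as Google tool linked below helps.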

Fetch as Google:
http://support.google.com/webmasters/bin/answer.py?answer=158587

Have a question? Ask it in our Webmaster Help Forum: http://groups.google.com/a/googleproductforums.com/forum/#!forum/webmasters

Want your question to be answered on a video like this? Follow us on Twitter and look for an announcement when we take new questions: http://twitter.com/googlewmc

More videos: http://www.youtube.com/GoogleWebmasterHelp
Webmaster Central Blog: http://googlewebmastercentral.blogspot.com/
Webmaster Central: http://www.google.com/webmasters/

Tags:
google
seo
webmaster tools
robots.txt
googlebot
fetch as google