Why might Googlebot get errors when trying to access my robots.txt file?

Published 26 November 2012, 14:14
I'm getting errors in Google Webmaster Tools saying the Googlebot crawler is unable to fetch my robots.txt about 50% of the time, even though I can fetch it with a 100% success rate from various other hosts. (The site runs on a plain nginx server and an mit.edu host.)
Yang, Palo Alto, CA
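The answer is given in the video, but as a rough illustration of the kind of check the questioner describes (fetching robots.txt repeatedly from a client and comparing success rates), here is a minimal sketch. The URL is a placeholder, and the Googlebot-style User-Agent string is used only for comparison; real Googlebot requests come from Google's own IP ranges, so this can only reveal differences tied to the user agent or intermittent server errors, not IP-based blocking.

```python
# Minimal sketch (not from the video): fetch robots.txt repeatedly with a
# generic User-Agent and a Googlebot-style one, and compare success rates.
import urllib.request
import urllib.error

URL = "https://example.com/robots.txt"  # hypothetical; replace with your site
USER_AGENTS = {
    "generic": "Mozilla/5.0 (compatible; fetch-test/1.0)",
    "googlebot-like": "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
}
ATTEMPTS = 20

for label, ua in USER_AGENTS.items():
    ok = 0
    for _ in range(ATTEMPTS):
        req = urllib.request.Request(URL, headers={"User-Agent": ua})
        try:
            with urllib.request.urlopen(req, timeout=10) as resp:
                if resp.status == 200:
                    ok += 1
        except (urllib.error.URLError, OSError):
            pass  # count timeouts and connection errors as failed fetches
    print(f"{label}: {ok}/{ATTEMPTS} successful fetches")
```

If both user agents succeed consistently from your own machine while Webmaster Tools still reports failures, the problem is more likely intermittent (DNS, firewall, or load-related) on the path Googlebot takes, which is what the "Fetch as Google" tool linked below helps diagnose.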

Fetch as Google:
support.google.com/webmasters/...

Have a question? Ask it in our Webmaster Help Forum: groups.google.com/a/googleprod...

Want your question to be answered on a video like this? Follow us on Twitter and look for an announcement when we take new questions: twitter.com/googlewmc

More videos: youtube.com/GoogleWebmasterHel...
Webmaster Central Blog: googlewebmastercentral.blogspo...
Webmaster Central: google.com/webmasters