UNSOLVED Unreachable: robots.txt
I'm receiving the following error when trying to index my URL:
Failed: Robots.txt unreachable
Before we crawled the pages of your site, we tried to check your robots.txt file to ensure we didn't crawl any pages that you had roboted out. However, your server returned a 5xx (unreachable) error when we tried to retrieve your robots.txt file. To make sure we didn't crawl any pages listed in that file, we postponed our crawl.
Your hosting provider may be blocking Googlebot, or there may be a problem with the configuration of their firewall.
Note: If the contents of your robots.txt file are different in your browser than what Google sees, work with your hosting company to remove any server rules that might serve different robots.txt content to different user agents.
My robots.txt file is fine, and I hadn't had any issues until now.
Any idea on why this is happening?
@s-santiago could you please share the url of your site so I can check that for you?
@joseph-benguira Still having the same issue as of today (Nov. 8th). Any chance Googlebot is being blocked in some way on the hosting side, or that there's some kind of firewall configuration problem?
The problem is that robots.txt is accessible, just not by Googlebot for some reason.
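One way to narrow this down is to fetch robots.txt twice, once with a normal browser user agent and once with a Googlebot user-agent string, and compare the responses. This is only a sketch, with `example.com` standing in for your actual domain. Note that real Googlebot is also identified by its IP ranges, so a spoofed request that succeeds doesn't rule out IP-based firewall blocking; but if the Googlebot-UA request returns a 5xx while the browser one doesn't, a user-agent rule on the server is the likely culprit:

```shell
# Fetch robots.txt as a regular browser would (prints the HTTP status code)
curl -A "Mozilla/5.0" -sS -o browser.txt -w "%{http_code}\n" \
     https://example.com/robots.txt

# Fetch the same file with a Googlebot user-agent string;
# a 5xx here but not above suggests a server rule keyed on the user agent
curl -A "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)" \
     -sS -o googlebot.txt -w "%{http_code}\n" https://example.com/robots.txt

# Compare the two responses byte for byte
diff browser.txt googlebot.txt && echo "identical"
```

If the bodies differ or the status codes diverge, that's exactly the "different content to different user agents" situation Google's note warns about, and something to raise with the hosting provider.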
@s-santiago Really strange, I've never heard of a similar issue before.
Googlebot is not blocked by AppDrag, of course.