As titled.
I tried submitting some new pages for indexing a few days ago and they failed for some reason. After some digging, it looks like Google is having a hard time fetching robots.txt (I get intermittent errors). Also, when I try to submit pages to be fetched, they fail sometimes and go through other times.
This all seems to have started a few days ago, since that's when Google first reported the robots.txt fetch errors, but I can't think of anything I changed around then that could have caused it.
The robots.txt file is dead simple and has never been edited. The site itself is, as you can see, a super basic WordPress setup.
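If it helps anyone reproduce this, a quick way to check whether the intermittent errors are on the server side is to probe the robots.txt URL repeatedly and look at the status codes. This is just a minimal sketch in Python (the domain is a placeholder, and the status interpretations follow Google's documented handling of robots.txt responses, roughly: 2xx is fine, 4xx is treated as "no robots.txt", 5xx or a network failure can make Google back off crawling):

```python
# Probe a robots.txt URL a few times to catch intermittent failures.
# The domain below is a placeholder; substitute your own site.
import urllib.request
import urllib.error

def classify(status):
    """Roughly how Googlebot treats a robots.txt fetch with this HTTP status."""
    if 200 <= status < 300:
        return "ok"
    if 400 <= status < 500:
        return "no robots.txt (crawl allowed)"
    return "server error (Google may postpone crawling)"

def probe(url, attempts=5, timeout=10):
    """Fetch the URL several times; return the status codes (None = network failure)."""
    results = []
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                results.append(resp.status)
        except urllib.error.HTTPError as e:
            results.append(e.code)  # non-2xx response
        except urllib.error.URLError:
            results.append(None)    # DNS failure, timeout, connection reset, etc.
    return results

# Example usage (replace with your own domain):
# print(probe("https://example.com/robots.txt"))
```

If the probe shows occasional 5xx codes or `None` entries, the problem is likely flaky hosting or a firewall/CDN rate-limiting Googlebot, rather than anything in the robots.txt file itself.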
Anyone ever see this before or have any ideas of where to look next?
Screenshots of errors in console: https://imgur.com/a/UP8veiV
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/9rojuw/google_fails_to_fetch_robotstxt_or_other_content/