I fancy myself an SEO expert... or at least I play one at my job. That said, I am having the WORST time trying to figure out this problem:
My site won't index, and Google REFUSES to read my robots.txt. I've tried every variation. The standard WordPress default:
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
The extreme Neil Patel example:
User-agent: *
Disallow:
My own "fuck it, I just want to go to bed" version:
User-agent: *
Allow: /
No help.
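For what it's worth, all three versions really are permissive. You can sanity-check any of them offline with Python's stdlib robots.txt parser (example.com and the sample paths here are placeholders, not my actual site):

```python
from urllib.robotparser import RobotFileParser

# The "standard" WordPress robots.txt from above, fed in as lines
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /wp-admin/",
    "Allow: /wp-admin/admin-ajax.php",
])

# Ordinary content is crawlable...
print(rp.can_fetch("Googlebot", "https://example.com/some-post/"))            # True
# ...and only /wp-admin/ is blocked.
print(rp.can_fetch("Googlebot", "https://example.com/wp-admin/options.php"))  # False

# Caveat: Python applies rules in file order, while Google uses
# longest-match, so the Allow override for admin-ajax.php may not
# evaluate the same way here as it does for Googlebot.
```

If even the `Allow: /` version tests clean like this, the file itself isn't the problem.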
Before you tell me the obvious solutions, here's what I've already done (Lodestar theme, WordPress CMS):
- Yes, I checked Settings > Reading ("Discourage search engines from indexing this site" is unchecked)
- All of those robots.txt files pass the Search Console tester
- That same Search Console reports all my pages as noindex
- I have the See Robots Chrome extension and it's green everywhere
- A Screaming Frog crawl won't get past the first page
- On the advice of a co-worker, I tried the SEO Minion Chrome extension, and it reports 200 responses everywhere
- Bing Webmaster Tools can crawl the sitemap, and Bing actually shows me as the No. 1 search result for my field... except the SERP says robots.txt is blocking my shit
- I even wondered for a minute whether Jetpack and Yoast were conflicting, but I turned off Jetpack's SEO features, so it should have no impact on robots.txt, right?
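One thing I keep reminding myself: Search Console saying "noindex" points at a meta robots tag or an X-Robots-Tag HTTP header, neither of which a robots.txt tester will ever flag. Here's a rough sketch of the check I've been running (the `find_noindex` helper and the sample inputs are my own illustration, not from any plugin):

```python
import re

def find_noindex(headers: dict, html: str) -> list:
    """Report every place a page can carry noindex that a
    robots.txt tester never looks at."""
    findings = []
    # 1. HTTP header, often set by a plugin, the server, or a CDN
    if "noindex" in headers.get("X-Robots-Tag", "").lower():
        findings.append("X-Robots-Tag header")
    # 2. Meta robots tag in the HTML, e.g. what WordPress emits when
    #    "Discourage search engines" is on, or what an SEO plugin adds
    if re.search(r'<meta[^>]+name=["\']robots["\'][^>]+noindex', html, re.I):
        findings.append("meta robots tag")
    return findings

print(find_noindex({"X-Robots-Tag": "noindex"}, ""))
print(find_noindex({}, '<meta name="robots" content="noindex,follow">'))
print(find_noindex({}, "<html></html>"))  # clean page -> []
```

If either check fires on a live page, that would explain clean robots.txt tests alongside a noindex verdict.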
At this point, I'm kind of lost and questioning my entire knowledge base as an SEO professional. Granted, I'm not a developer, but this shouldn't be that hard...
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/fqex4t/a_baffled_seo_professional/