My colleagues sent me a very interesting situation:
- Google started crawling a very large volume of blocked pages and has already found 16 million. The URLs are blocked in robots.txt and with a meta noindex, and an X-Robots-Tag: noindex header was then added on top of that
- server load has increased almost 1000-fold
- there are no 503 errors; the server can handle the load
- the XML sitemap is clean
- the site is 1 year old
- clicks and impressions are growing in Google Search Console, at least for now
- very strange: Google normally does not crawl this deep, yet these URLs have 27 slashes (/.../.../.../.../... etc.) and sit roughly 25 clicks from the main page
- the log file really does show a lot of Googlebot traffic, about 500 thousand requests per day
- the backlinks are also clean, no spam
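The log-file figure above can be sanity-checked with a minimal Python sketch (assuming a common/combined-format access log; the `access.log` path and the function name are my own, not from the post):

```python
import re
from collections import Counter

def googlebot_hits_per_day(lines):
    """Count requests per day whose user-agent string mentions Googlebot.

    Assumes the common/combined log format, where the timestamp
    looks like [27/Apr/2020:10:00:00 +0000].
    """
    per_day = Counter()
    for line in lines:
        if "googlebot" not in line.lower():
            continue
        m = re.search(r"\[(\d{2}/\w{3}/\d{4}):", line)
        if m:
            per_day[m.group(1)] += 1
    return per_day

# Example usage (the log path is hypothetical):
# with open("access.log") as f:
#     for day, hits in sorted(googlebot_hits_per_day(f).items()):
#         print(day, hits)
```

Keep in mind the user-agent string can be spoofed; Google's documentation recommends confirming genuine Googlebot traffic with a reverse DNS lookup that resolves to googlebot.com or google.com.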
P.S. Screenshots in comments
What do you think?
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/ga5k4z/google_is_trying_to_crawl_excluded_and_blocked/