Wednesday, April 29, 2020

Google is trying to crawl excluded and blocked pages: more than 16 million of them

My colleagues sent me a very interesting situation:

  1. Google started crawling a very large volume of blocked pages and has already found 16 million of them. The URLs are blocked in robots.txt and carry a meta noindex; we later added an X-Robots-Tag: noindex header as well
  2. Server load has increased, almost 1000x
  3. There are no 503 errors; the server can handle the load
  4. The XML sitemap is clean
  5. The site is 1 year old
  6. Clicks and impressions are growing in Google Search Console, at least for now
  7. Strangest of all, Google does not usually crawl this deep: these URLs have up to 27 slashes in the path (/.../.../.../.../... etc.), roughly 25 clicks away from the main page
  8. The log files really do show a lot of Googlebot traffic: about 500 thousand requests per day
  9. The backlink profile is also clean, no spam
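One way to sanity-check points 7 and 8 is to parse the access logs and tally Googlebot requests per day along with the URL depth (slash count) of each crawled path. Below is a minimal sketch; the sample log lines, IPs, and paths are invented for illustration, and in a real audit you would read your actual access log and also verify the hits via reverse DNS, since the Googlebot user agent can be spoofed.

```python
import re
from collections import Counter

# Hypothetical sample lines in Combined Log Format; in practice, read your
# real access log (e.g. /var/log/nginx/access.log) instead.
SAMPLE_LOG = """\
66.249.66.1 - - [28/Apr/2020:10:12:01 +0000] "GET /a/b/c/ HTTP/1.1" 200 512 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
66.249.66.2 - - [28/Apr/2020:10:12:05 +0000] "GET /a/b/c/d/e/f/ HTTP/1.1" 200 734 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.7 - - [28/Apr/2020:10:13:00 +0000] "GET /page HTTP/1.1" 200 100 "-" "Mozilla/5.0"
"""

# IP, timestamp, request path, and user agent from a Combined Log Format line.
LOG_RE = re.compile(
    r'^(\S+) \S+ \S+ \[([^\]]+)\] "(?:GET|POST|HEAD) (\S+)[^"]*" \d+ \S+ "[^"]*" "([^"]*)"'
)

def googlebot_stats(log_text):
    """Count Googlebot requests per day and bucket crawled URLs by depth."""
    per_day = Counter()
    depths = Counter()
    for line in log_text.splitlines():
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, ts, path, ua = m.groups()
        if "Googlebot" not in ua:
            continue  # NOTE: UA match only; confirm real Googlebot via reverse DNS
        day = ts.split(":", 1)[0]      # e.g. "28/Apr/2020"
        per_day[day] += 1
        depths[path.count("/")] += 1   # depth roughly = number of slashes
    return per_day, depths

per_day, depths = googlebot_stats(SAMPLE_LOG)
print(per_day)   # requests per day attributed to Googlebot
print(depths)    # how deep (by slash count) the crawled URLs go
```

If the depth histogram shows a spike at very high slash counts (like the 27-slash URLs described above), that usually points to a crawl trap, such as relative links or faceted navigation generating infinitely nested paths.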

P.S. Screenshots in comments

What do you think?

submitted by /u/Andrew-Chornyy

from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/ga5k4z/google_is_trying_to_crawl_excluded_and_blocked/
