Hi there,
To check indexation I normally crawl the site, then check Search Console and the sitemap: compare the status codes, VLOOKUP the URL lists against each other to make sure they match, etc.
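For context, here's a rough sketch of that cross-check in Python. The file names and column headers ("Address", "URL") are just placeholders for whatever your crawler and Search Console exports actually use:

```python
# Minimal sketch: compare the URL sets from a crawl export, the XML sitemap,
# and a Search Console coverage export. File names/columns are assumptions.
import csv
import xml.etree.ElementTree as ET

def urls_from_csv(path, column):
    """Read a URL column from a CSV export (crawler or GSC coverage report)."""
    with open(path, newline="", encoding="utf-8") as f:
        return {row[column].strip() for row in csv.DictReader(f) if row.get(column)}

def urls_from_sitemap(path):
    """Read the <loc> entries from a local copy of the XML sitemap."""
    ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
    return {loc.text.strip() for loc in ET.parse(path).findall(".//sm:loc", ns)}

crawl = urls_from_csv("crawl_export.csv", column="Address")
gsc = urls_from_csv("gsc_coverage_indexed.csv", column="URL")
sitemap = urls_from_sitemap("sitemap.xml")

print("in crawl but not in sitemap:", len(crawl - sitemap))
print("in sitemap but not in crawl:", len(sitemap - crawl))
print("in sitemap but not indexed per GSC:", len(sitemap - gsc))
print("indexed per GSC but not in sitemap:", len(gsc - sitemap))
```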
I'm used to working with lead-gen websites and have just started a new job on a large ecommerce site:
- The sitemap, site crawl and Search Console coverage report all show approx. 1.8k URLs indexed.
- A site: search shows 320 URLs indexed (it says "About 2,030 results (0.44 seconds)", but when I scrape the results there are only 320).
Is there a specific reason why Google is not showing all of the URLs on a site: search?
Thanks v much