I found a bunch of completely duplicate pages on one of my clients' sites in GSC Coverage, showing up as
"Duplicate, Google chose different canonical than user".
Since I can't export this data from GSC, there are thousands of pages I'd need to check manually.
Are there any tools that can do this in bulk? I tried Siteliner, but the free version picked up nothing, so I'm a bit hesitant to upgrade to premium if it won't find anything either.
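In the absence of a ready-made tool, a minimal sketch of the bulk approach in Python (hypothetical, not from the original post): strip each page's HTML down to its visible text, hash that text, and group URLs that share a fingerprint. The example URLs and the crude tag-stripping regex are illustrative only; a real crawl would fetch the client's pages and use a proper HTML parser.

```python
import hashlib
import re
from collections import defaultdict

def content_fingerprint(html: str) -> str:
    """Strip tags and collapse whitespace, then hash the visible text."""
    text = re.sub(r"<[^>]+>", " ", html)           # drop tags (crude, for illustration)
    text = re.sub(r"\s+", " ", text).strip().lower()
    return hashlib.sha256(text.encode()).hexdigest()

def group_duplicates(pages: dict) -> list:
    """Map {url: html} to groups of URLs whose visible text is identical."""
    buckets = defaultdict(list)
    for url, html in pages.items():
        buckets[content_fingerprint(html)].append(url)
    return [urls for urls in buckets.values() if len(urls) > 1]

# Hypothetical example pages standing in for a real crawl.
pages = {
    "https://example.com/a": "<html><body><p>Same  text</p></body></html>",
    "https://example.com/b": "<html><body><p>Same text</p></body></html>",
    "https://example.com/c": "<html><body><p>Different text</p></body></html>",
}
print(group_duplicates(pages))  # → [['https://example.com/a', 'https://example.com/b']]
```

Exact-hash matching only catches byte-for-byte identical text; near-duplicates would need a similarity measure (e.g. shingling), which is what paid tools like Siteliner add.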
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/nut4sf/tools_to_find_duplicate_domains/