Hi All,
I'm trying to block URLs with parameters from being crawled, to avoid wasting crawl budget and to avoid duplication. It's becoming a bigger task than I expected, with a ton of strange parameters. Does anyone have experience with how best to deal with them? I'm thinking of placing canonicals on the parameter pages pointing to the main page, then blocking them in the robots.txt file (rough sketch of what I mean below). I also recently discovered the URL Parameters tool in Search Console and am curious about using that to block parameter URLs, since it seems like the easiest solution. Any info on which approach is better?
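For reference, here's roughly the robots.txt pattern and canonical setup I have in mind. The path and parameter name below are just placeholders, not my real ones:

    # robots.txt - placeholder patterns, just to illustrate the idea
    User-agent: *
    # block any URL with a query string under /category/
    Disallow: /category/*?
    # block a specific parameter wherever it appears in the query string
    Disallow: /*sessionid=

and on each parameter page, a canonical pointing back to the clean URL:

    <link rel="canonical" href="https://www.example.com/category/widgets/" />

Part of what I'm unsure about is that if the parameter URLs are blocked in robots.txt, Google presumably can't crawl them to see the canonical in the first place.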
Also, is there a way to find the specific URLs Google crawls every day? I know there is the "Crawl Stats" feature in Search Console as well, but I'd like to get an idea of how many parameter URLs are being crawled versus other, more important pages.
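Right now the only way I can think of to get that breakdown is to parse the server access logs myself. Here's a rough sketch of what I mean, assuming a combined-format nginx/Apache log; the log path is a placeholder and it only matches Googlebot by user-agent string (no IP verification):

    # rough sketch: count Googlebot hits on parameter URLs vs clean URLs
    # in a combined-format access log (log path below is a placeholder)
    import re
    from collections import Counter

    LOG_PATH = "/var/log/nginx/access.log"
    # combined log format: request line is the quoted "GET /path HTTP/1.1" field,
    # user-agent is the last quoted field on the line
    LINE_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) HTTP/[^"]*".*"([^"]*)"\s*$')

    param_urls = Counter()
    clean_urls = Counter()

    with open(LOG_PATH, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LINE_RE.search(line)
            if not m:
                continue
            url, user_agent = m.groups()
            if "Googlebot" not in user_agent:
                continue
            (param_urls if "?" in url else clean_urls)[url] += 1

    print("Googlebot hits on parameter URLs:", sum(param_urls.values()))
    print("Googlebot hits on clean URLs:", sum(clean_urls.values()))
    print("Most-crawled parameter URLs:")
    for url, hits in param_urls.most_common(10):
        print(f"{hits:6d}  {url}")

If Search Console can give me this kind of breakdown directly, that would obviously be a lot simpler.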
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/eiaipa/url_parameters_question/