Thursday, January 28, 2021

How do I best prevent duplicate content/SEO dilution for replica websites with different top-level domains?

Hey guys, I have read a bunch on this and am still having a tough time figuring out the best way to solve my problem. If anyone could help I would appreciate it a ton!!

My problem is that I have a .com website, but for various reasons we have to make a .us website as well. However, it's just a formality; the .us domain doesn't need to rank on Google. So I think I have two options:

  1. If I want to keep ALL of the SEO credit (or as much as possible) with the .com website and I don't care about the .us domain being searchable on Google, is it better to use a) robots.txt to prevent crawling or b) a noindex meta tag to prevent indexing? (See the first sketch after this list.)

  2. If I still want my .us domain to be searchable on Google, but give as much SEO credit back to the .com domain as possible, is it better to use a) canonical tags or b) 301 redirects? (See the second sketch below.)
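
For context, here is roughly how the two option-1 mechanisms differ. This is a minimal sketch assuming the .us site were served by a hypothetical Python/Flask app (the real site's stack isn't specified anywhere above), just to make "prevent crawling" vs. "prevent indexing" concrete:

```python
# Sketch 1: option 1(a) vs. 1(b), on a hypothetical Flask-served .us site.
from flask import Flask, Response

app = Flask(__name__)

# 1a) robots.txt blocks CRAWLING. Caveat: a crawl-blocked URL can still
# end up indexed (shown URL-only) if other sites link to it.
@app.route("/robots.txt")
def robots():
    return Response("User-agent: *\nDisallow: /\n", mimetype="text/plain")

# 1b) noindex blocks INDEXING: Google may crawl the page but is told not
# to list it. The X-Robots-Tag response header is equivalent to putting
# <meta name="robots" content="noindex"> in the page <head>.
@app.route("/")
def home():
    resp = Response("<html>...duplicate .us content...</html>")
    resp.headers["X-Robots-Tag"] = "noindex"
    return resp

# Note: don't combine 1a and 1b. If robots.txt blocks crawling, Google
# never fetches the page, so it never sees the noindex directive.
```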

Ideally I would want the .us domain to be searchable on Google while passing ALL of the SEO credit to the .com domain, but if that's not possible, and it hurts the .com's rankings to a noticeable extent, then I don't mind it being unsearchable.
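
And here is the option-2 pair in the same hypothetical Flask setup (COM_BASE and the routes are placeholders I made up, not part of the actual sites):

```python
# Sketch 2: option 2(a) vs. 2(b). In practice you'd pick ONE approach
# site-wide; both handlers are shown here only for contrast.
from flask import Flask, Response, redirect

app = Flask(__name__)
COM_BASE = "https://www.example.com"  # placeholder for the real .com host

# 2a) cross-domain canonical: the .us page still renders and stays
# reachable, but its <head> points Google at the .com equivalent.
@app.route("/<path:page>")
def us_page(page):
    html = (
        "<html><head>"
        f'<link rel="canonical" href="{COM_BASE}/{page}">'
        "</head><body>...same content as the .com page...</body></html>"
    )
    return Response(html)

# 2b) 301 redirect: users and crawlers are both sent straight to .com,
# so the .us URLs consolidate their signals but drop out of results.
@app.route("/r/<path:page>")  # placeholder route, for contrast only
def us_redirect(page):
    return redirect(f"{COM_BASE}/{page}", code=301)
```

As I understand it, that's the trade-off: with 2(a) the .us pages remain visible and crawlable (the canonical is a hint, and Google usually shows the .com version in results), while with 2(b) the .us URLs leave the index entirely because everyone gets bounced to .com.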

I hope I'm not too far off base in my thinking, but I have a very small team and am solely responsible for this, so any help would be awesome. Thanks!

from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/l76cv4/how_do_i_best_prevent_duplicate_contentseo/
