I am working on a client's site, and there are some pages they don't want indexed in Google. However, this wasn't a requirement when the site went live, so those pages are now showing in the search results. I have researched this issue and found two ways of doing it:
1) The first method is to add a noindex tag in the head of each page, which tells Google not to index it (see the first example below).
2) The second is to disallow the pages in robots.txt and remove them from the sitemap (see the second example below).
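For reference, the noindex directive is just a meta tag in the head of the page, along these lines:

    <head>
      <!-- tells search engine crawlers not to index this page -->
      <meta name="robots" content="noindex">
    </head>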
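And the robots.txt rule would look something like this (the /private/ paths here are just placeholders, not the client's actual URLs):

    User-agent: *
    Disallow: /private/page-one.html
    Disallow: /private/page-two.html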
Now some experts are saying that you can't do both at once: if you want Google to deindex a page, it has to crawl that page and find the noindex tag there, so adding noindex and disallowing the page in robots.txt at the same time won't work.
However, if you only disallow it in robots.txt, the crawler will not crawl the page, but Google may still keep it indexed.
The client wants those pages gone from the search results. What should be done? Any advice is appreciated.
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/ai7hpu/about_no_index_and_robotstxt/