Sunday, August 4, 2019

Use a list of Search Bot user-agent-strings to customise page display for bots? Is this a good idea?

Hi

We're building a site where content is filtered by location.

We'd like all content to be indexed by search bots. Our sitemap xml has links to all content pages.

Should we also keep a list of search-bot user-agent strings and customise the display of content to remove all location-based restrictions (just for the bots)? Or is it enough to just have all the content linked via the sitemap?
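For context, the user-agent matching described above would look something like this. This is only an illustrative sketch: the `BOT_TOKENS` list and the `is_search_bot` helper are hypothetical names, the token list is not exhaustive, and real crawlers should be verified (e.g. via reverse DNS), since user-agent strings are trivially spoofed.

```python
# Illustrative sketch of user-agent matching for search crawlers.
# Token list is an assumption and deliberately incomplete.
BOT_TOKENS = ("Googlebot", "Bingbot", "DuckDuckBot", "YandexBot", "Baiduspider")

def is_search_bot(user_agent: str) -> bool:
    """Return True if the User-Agent string contains a known crawler token."""
    ua = (user_agent or "").lower()
    return any(token.lower() in ua for token in BOT_TOKENS)

# A Googlebot UA matches; a desktop browser UA does not.
print(is_search_bot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # True
print(is_search_bot("Mozilla/5.0 (Windows NT 10.0) AppleWebKit/537.36 Chrome/76.0"))  # False
```

Note that serving bots different content than users is exactly what search engines define as cloaking, which is the risk the question raises.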

Does Google, etc., condone or penalise this practice (customising the content just for bots)?

Does this difference in discoverability (sitemap vs. crawling) have a negative impact on ranking or search performance?

Thank you r/SEO!

submitted by /u/ElectroSpork9000

from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/cm4985/use_a_list_of_search_bot_useragentstrings_to/
