Hi all,
I'm working my way through the issues raised in a SEMrush site audit, and one of them is a batch of warnings about "blocked internal resources in robots.txt". It only flags a single script, the Autoptimize lazyload file, as affecting multiple pages:
/wp-content/plugins/autoptimize/classes/external/js/lazysizes.min.js?ao_version=2.6.1
To test it, I amended robots.txt to remove all of the disallows and ran another crawl that wasn't restricted by robots.txt, but the warning still comes up, so it doesn't seem to be robots.txt that's blocking it.
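If it did turn out to be a robots.txt problem, my understanding (and please correct me if I'm wrong) is that the usual fix would be an explicit Allow rule for the plugin path rather than stripping every disallow, something along these lines:

User-agent: *
Allow: /wp-content/plugins/autoptimize/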
I don't have much SEO knowledge, but is this something I should be looking to fix? If so, any suggestions?
Thanks.
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/etyqql/semrush_site_audit_blocked_internal_resources_re/