My webshop runs on React, so I'm running Rendertron. Normal visitors receive the React website, while the Googlebot and Lighthouse user agents receive a pre-rendered, static page. I'm having a hard time wrapping my head around this concept.
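For anyone unfamiliar with the setup, here's a minimal sketch of that user-agent routing (TypeScript/Express, Node 18+ for the global fetch). The Rendertron URL, bot regex, and port are placeholders for my own deployment, not a prescribed configuration:

```ts
import express from "express";

const app = express();

// Assumption: your own Rendertron deployment; replace with its real URL.
const RENDERTRON_URL = "https://my-rendertron-instance.example.com/render";

// User agents that should get the pre-rendered snapshot.
const BOT_UA = /googlebot|chrome-lighthouse/i;

app.use(async (req, res, next) => {
  const ua = req.headers["user-agent"] ?? "";
  if (!BOT_UA.test(ua)) return next(); // real users fall through to the React app

  try {
    // Rendertron exposes GET /render/<encoded page URL>.
    const target = `${req.protocol}://${req.get("host")}${req.originalUrl}`;
    const rendered = await fetch(`${RENDERTRON_URL}/${encodeURIComponent(target)}`);
    res.status(rendered.status).send(await rendered.text());
  } catch {
    next(); // if Rendertron is unreachable, serve the normal app instead
  }
});

// Everyone else gets the client-rendered React build.
app.use(express.static("build"));

app.listen(3000);
```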
While the page content is the same, there is of course a performance difference between what a real user receives and what Google receives. And since Google's focus on performance (FCP, LCP, TBT, and CLS) is increasing, I was wondering how they deal with this.
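For context, this is roughly how I can measure those metrics for real visitors on the React side, as a sketch using Google's web-vitals package (the v3+ `onX` API); the `/analytics` endpoint is a placeholder for wherever the data gets reported:

```ts
import { onCLS, onFCP, onLCP } from "web-vitals";

function sendToAnalytics(metric: { name: string; value: number }) {
  // sendBeacon survives page unloads, which is when CLS/LCP often finalize.
  const body = JSON.stringify({ name: metric.name, value: metric.value });
  navigator.sendBeacon?.("/analytics", body);
}

// Report each metric once its value is final for this page view.
onCLS(sendToAnalytics);
onFCP(sendToAnalytics);
onLCP(sendToAnalytics);
```

(TBT is a lab-only metric, so it isn't exposed by web-vitals; only the field metrics are.)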
Will Google somehow visit my website with a 'fake' user agent to see the performance real users get? Or does only what I specifically serve Google count, meaning I should optimize only the pre-rendered page for Core Web Vitals and not the real website? (I know Core Web Vitals improve user experience, so the real website should be optimized as well; I'm purely looking at this from a ranking point of view right now.)
from Search Engine Optimization: The Latest SEO News https://www.reddit.com/r/SEO/comments/o68tbx/core_web_vitals_dynamic_rendering/