Optimizing page performance with edge workers but not sure whether it impacts SEO? Read the answer here.
There are cases when we need to render content while the page is still loading. With edge workers, this becomes an easy task that also improves TTFB, FCP, and CLS. While that is good, it is important to consider the SEO impact of using edge workers to render dynamic content on a page.
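Real edge platforms (for example, Cloudflare Workers with its streaming HTMLRewriter, or Akamai EdgeWorkers) do this rewriting on the response stream. As a minimal, platform-agnostic sketch in plain JavaScript, assume the origin HTML contains a placeholder element (the id `dynamic-slot` is a hypothetical convention) that the edge fills in before the response reaches the client:

```javascript
// Minimal, platform-agnostic sketch of edge-side content injection.
// Real edge runtimes stream the response; here we simulate the
// rewrite on a plain HTML string for illustration.

function injectAtEdge(originHtml, dynamicHtml) {
  // Fill the placeholder so bots and users both receive
  // the same fully rendered markup.
  return originHtml.replace(
    '<div id="dynamic-slot"></div>',
    `<div id="dynamic-slot">${dynamicHtml}</div>`
  );
}

// Example: the origin serves a shell; the edge fills it in.
const shell = '<main><h1>Products</h1><div id="dynamic-slot"></div></main>';
const page = injectAtEdge(shell, '<ul><li>Item A</li></ul>');
console.log(page);
// → <main><h1>Products</h1><div id="dynamic-slot"><ul><li>Item A</li></ul></div></main>
```

Because the injection happens before the HTML reaches any client, GoogleBot and users see identical markup, which is exactly the parity discussed below.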
SEO ranking may suffer or remain unchanged depending on the context and the output the edge worker produces. If the dynamically rendered content differs from the indexed content that was served to GoogleBot, it will hurt SEO. If the content or DOM elements rendered dynamically via edge workers produce a similar view for users and for GoogleBot, SEO is not harmed.
Changing most of the content for users while showing different content to GoogleBot is considered cloaking. Cloaking is a bad SEO practice and attracts a penalty. However, if you make the changes the right way, it is not an issue.
We addressed a similar scenario for the impact of captchas on SEO too.
If the payload delivered via the edge worker is huge, it increases cumulative layout shift (CLS) and largest contentful paint (LCP). This leads to a failing Core Web Vitals assessment and, in turn, lower rankings in SERPs.
Using an edge worker to populate or rearrange content while the page renders does not get a website penalized in search results. However, if the rendered output is visually different for a search bot (i.e., GoogleBot or BingBot) than for a user, it is considered cloaking, which is a severely bad SEO practice.
Just make sure the search bot and users see the same or very similar content, with the exception of initial captchas or authorization blocks.
An SO community member asked a question along these lines. Below is the answer I gave.
Okay. So from what I understand, you want to do this:
- Besides the page title and description in the head, you add a temporary element with some static content so it gets indexed by Google for search results.
- The content gets indexed when GoogleBot crawls the page; while the page loads, that temporary element is removed and replaced with the SAME content, now JS-driven, enabling the user to interact with it.
Answer: This is perfectly fine as long as the rendered content remains the SAME or very similar. Since the view for GoogleBot and the user is the same, this is not cloaking, and hence there is no penalty. GoogleBot renders JS and sees the output just fine. Populating, rearranging, or settling content while the page loads, as long as the complete rendered output remains similar, is perfectly okay. There is no negative SEO impact for THIS reason.
However, if you change most of the output or visual content for GoogleBot vs. users (before or while the page loads, via JS or dynamically), it is considered cloaking and attracts a penalty.
Finally, if you don’t care about ‘rankings’ and only need ‘indexing’, you can set up temporary elements (for indexing) and remove them via JS.
Now for the other points related to the question’s context.
- Since the payload is huge, the page takes longer to settle its layout (cumulative layout shift) and reach largest contentful paint, so it will fail the Core Web Vitals assessment. For this reason, it will definitely hurt rankings in SERPs, but the page still remains indexed without a penalty.
- Edge workers definitely improve performance; it is better to use them instead of doing the work server-side. This can reduce TTFB, First Contentful Paint, and Time to Interactive. Again, it depends on how much is improved.
Answer by Aqsa J. on Webmasters | Stack Overflow