How Dynamic Content Rendering via Edge Workers Impacts SEO
Optimizing page performance with edge workers but not sure whether it impacts SEO? Read the answer here.
There are cases where we need to render content while the page is still loading. With edge workers, this becomes an easy task while also improving TTFB, FCP, and CLS. Before doing so, though, it is important to consider the SEO impact of using edge workers to render dynamic content on a page.
SEO rankings may be hurt or remain unaffected depending on the context and the output the edge worker produces. If the dynamically rendered content differs from the content that was indexed or served to GoogleBot, it will hurt SEO. If the content or DOM elements rendered dynamically via edge workers result in a similar view for users and for GoogleBot, SEO is not harmed.
Changing most of the content for users while showing different content to GoogleBot is considered cloaking. Cloaking is a bad SEO practice and attracts a penalty. However, if you make the changes the right way, it is not an issue.
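Doing it "the right way" here means the worker injects the same markup for every visitor, bot or human. Below is a minimal, framework-free sketch of that idea; real edge platforms (e.g., Cloudflare Workers with HTMLRewriter) expose streaming rewriters instead, and the placeholder marker used here is purely hypothetical.

```javascript
// Sketch: edge-side content injection as a plain string transform.
// The origin ships a light shell containing a placeholder comment;
// the edge worker fills it in before the response reaches anyone.
// Because bots and users receive the same output, there is no cloaking.
const PLACEHOLDER = "<!-- edge:book -->";

function renderAtEdge(originHtml, dynamicFragment) {
  // If the shell has no placeholder, pass it through untouched.
  if (!originHtml.includes(PLACEHOLDER)) return originHtml;
  return originHtml.replace(PLACEHOLDER, dynamicFragment);
}

// Example: the origin serves a shell; the edge injects the content.
const shell = "<html><body><h1>My Book</h1><!-- edge:book --></body></html>";
const out = renderAtEdge(shell, '<section id="book">Chapter 1</section>');
```

The key property, for SEO purposes, is that `renderAtEdge` runs before the response is sent, so GoogleBot and the user see identical HTML.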
We clarified a similar scenario when covering the impact of captchas on SEO.
Another thing: edge rendering may have an indirect SEO impact due to other factors.
If the payload delivered via the edge worker is huge, it increases cumulative layout shift (CLS) and delays the largest contentful paint (LCP). That leads to failing the Core Web Vitals assessment and, hence, lower rankings in SERPs.
Using an edge worker to populate or rearrange content while the page renders does not get a website penalized in search results. However, if the rendered output is visually different for a search bot (i.e., GoogleBot/BingBot) than for the user, it is considered cloaking, which is a severely bad SEO practice.
Just make sure the search bot and users see the same or very similar content, with the exception of initial captchas or authorization blocks.
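One way to sanity-check the "same or very similar" rule during development is to compare the visible text of the bot-served and user-served versions. The sketch below is a crude vocabulary-overlap heuristic of my own, not how Google detects cloaking; all names and the 50% threshold are assumptions for illustration.

```javascript
// Strip scripts and tags, collapse whitespace, lowercase: a rough
// approximation of the "visible text" a crawler would extract.
function visibleText(html) {
  return html
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, " ")
    .replace(/\s+/g, " ")
    .trim()
    .toLowerCase();
}

// Flag the pair when the bot view shares little vocabulary with the
// user view -- a hypothetical dev-time check, not Google's algorithm.
function looksLikeCloaking(botHtml, userHtml) {
  const bot = new Set(visibleText(botHtml).split(" "));
  const user = new Set(visibleText(userHtml).split(" "));
  let shared = 0;
  for (const w of bot) if (user.has(w)) shared++;
  return bot.size > 0 && shared / bot.size < 0.5;
}

// Same text in different markup: fine. Entirely different text: flagged.
const botView = "<p>Chapter 1: A long journey begins here.</p>";
const userView = "<div><p>Chapter 1: A long journey begins here.</p></div>";
```

Markup differences (wrappers, styling, interactivity) pass such a check; only a wholesale swap of the visible content trips it, which matches the cloaking rule described above.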
An example of content rendered via an edge worker, and an analysis of its impact on SEO.
A Stack Overflow community member asked this question. He wanted to load a book on the webpage via a JS edge worker. His purpose was to improve page performance, since he had been doing server-side work to load the book before the page loaded, which increased TTFB and, hence, the Speed Index. So he asked what impact loading the book with the edge worker technique would have on SEO.
Below is my answer to him.
Okay. So from what I understand, you want to do this:
- Besides the page title and description in the head, you add a temporary element with some static content so it gets indexed by Google for search results.
- The content gets indexed as GoogleBot crawls the page; then, while the page loads, that temporary element is removed and replaced with the SAME content, now JS-driven, enabling the user to interact with it.
Answer: This is perfectly okay to do as long as the rendered content remains the SAME or very similar. Since the view for GoogleBot and the user is the same, this is not cloaking; hence, no penalty. GoogleBot renders JS and sees the output just fine. Populating, rearranging, or settling content while the page loads, as long as the complete rendered output is still similar, is perfectly okay. There is no bad SEO impact for THIS reason.
However, if you change most of the output/visual content for GoogleBot vs. users (before or while the page loads, via JS or dynamically), this is considered cloaking and attracts a penalty.
Finally, if you don’t care about ‘rankings’ and only need ‘indexing’, you can set up temporary elements (for indexing) and remove them via JS.
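The temporary-element approach from the answer can be sketched as a swap that preserves the visible text. The sketch below is DOM-free so it can run anywhere; in a browser you would instead call something like `document.getElementById("book-static").replaceWith(appEl)`. The element ids are hypothetical.

```javascript
// Sketch: replace an indexable static placeholder with the JS-driven
// interactive version, keeping the visible content identical.
function swapPlaceholder(staticHtml) {
  // Extract the text GoogleBot already indexed from the placeholder...
  const text = staticHtml.replace(/<[^>]+>/g, "").trim();
  // ...and rebuild it as the interactive version. Same visible content
  // before and after the swap means this is not cloaking.
  return `<div id="book-app" data-interactive="true">${text}</div>`;
}

const staticEl = '<div id="book-static">Chapter 1: A long journey.</div>';
const interactive = swapPlaceholder(staticEl);
```

The markup and behavior change, but the text the user reads is the text the bot indexed, which is exactly the condition the answer relies on.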
Coming to other points related to the question’s context.
Answer by Aqsa J. on Webmasters | Stack Overflow
- If the payload is huge and the page takes time to complete its cumulative layout shifts and largest contentful paint, the page will fail the Core Web Vitals assessment. For that reason, it will definitely impact rankings in SERPs, but the page still remains indexed, without a penalty.
- Edge workers definitely improve performance, so it is better to use them instead of server-side work. This will likely reduce TTFB, First Contentful Paint, and Time to Interactive. Again, it depends on how much is improved.