SEO

Dynamic Rendering

Dynamic rendering is a technique where a site serves two versions of each page: a pre-rendered, static HTML snapshot to search engine crawlers, and a full JavaScript single-page app experience to human users. Google publicly endorsed it as a workaround for heavy-JS sites in 2018, then quietly downgraded it to a "legacy workaround" by 2024 as Googlebot's JS rendering matured.

Why It Matters

JavaScript-heavy sites used to ship empty HTML shells, trusting the crawler to run JS and see the real content. Googlebot eventually got there, but slowly (often days after the initial HTML fetch), and other crawlers (Bing, AI bots, Facebook's link unfurler) were worse. Dynamic rendering let SEO teams guarantee that crawlers saw fully rendered HTML immediately, without rewriting the app. For e-commerce catalogs, JS dashboards, and SPAs with critical SEO needs, it was a pragmatic bridge. Understanding it still matters because many sites are running on dynamic rendering today and need to migrate.

How It Works

1. Request arrives: The edge or server inspects the user-agent.

2. Crawler detection: If the user-agent matches a known search bot (Googlebot, Bingbot, Twitterbot, LinkedInBot, AI crawlers), the request routes to a pre-render service. Otherwise, it hits the normal SPA.

3. Pre-render service: A headless browser (often driven by prerender.io, Rendertron, Puppeteer, or Playwright) fetches the page, waits for JS to finish, captures the DOM, and returns static HTML.

4. Cache: The rendered HTML is cached so subsequent crawler hits don't re-render.

5. Bot sees HTML; user sees SPA: Same URL, two experiences.
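The five steps above can be sketched as a single request handler. This is a minimal illustration, not a production router: the bot list, the TTL, and the prerender stub are all placeholder assumptions, and a real step 3 would drive headless Chrome via Puppeteer or Playwright.

```python
import re
import time

# Hypothetical user-agent fragments; real deployments maintain a longer,
# regularly updated list (Bing, Yandex, Baidu, AI crawlers, link unfurlers).
BOT_UA_PATTERN = re.compile(
    r"googlebot|bingbot|yandex|baiduspider|twitterbot|linkedinbot"
    r"|facebookexternalhit|gptbot|slurp",
    re.IGNORECASE,
)

CACHE_TTL_SECONDS = 3600   # assumed TTL: re-render hourly to limit stale snapshots
_render_cache = {}         # url -> (rendered_html, timestamp)

def is_crawler(user_agent: str) -> bool:
    """Step 2: decide whether this request comes from a known bot."""
    return bool(BOT_UA_PATTERN.search(user_agent or ""))

def prerender(url: str) -> str:
    """Step 3 stand-in: a real service runs headless Chrome, waits for JS
    to settle, and captures the resulting DOM as static HTML."""
    return f"<html><body>rendered snapshot of {url}</body></html>"

def handle_request(url: str, user_agent: str, spa_shell: str) -> str:
    """Steps 1-5: route bots to cached pre-rendered HTML, humans to the SPA."""
    if not is_crawler(user_agent):
        return spa_shell  # human user: normal SPA experience
    html, ts = _render_cache.get(url, (None, 0.0))
    if html is None or time.time() - ts > CACHE_TTL_SECONDS:
        html = prerender(url)                    # render on cache miss
        _render_cache[url] = (html, time.time()) # step 4: cache the snapshot
    return html
```

Note the TTL on the cache: it is what keeps step 4 from becoming the stale-content divergence described below.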

Why It's Now a "Legacy Workaround"

Google's 2024 guidance moved dynamic rendering from "recommended" to "a workaround and not a long-term solution." Reasons:

Googlebot renders JS well now: Google runs an up-to-date headless Chrome and processes JS on most pages quickly. The original reason for dynamic rendering is mostly gone for Google.

Maintenance burden: Running a second render pipeline in parallel is ops complexity that usually outgrows the benefit.

Divergence risk: When the pre-render service breaks or gets stale, bots see old content and users see new content — classic cloaking signal.

Not a cloaking penalty, but close: Google explicitly says dynamic rendering is not cloaking, but divergence between bot and user views can trigger manual review.

Modern SSR / SSG are better: Next.js, Nuxt, SvelteKit, Astro, and Remix ship server-rendered or statically generated pages by default. No dynamic rendering needed.

When (If Ever) It Still Makes Sense

Legacy SPAs you can't rewrite: A mature, JS-heavy app with no budget for SSR migration. Dynamic rendering is still viable as a holding pattern.

Non-Google crawlers: AI crawlers, Bing, Baidu, and niche bots still render JS worse than Google. If those matter to your traffic, dynamic rendering can help.

Widgets and embeds: Content that loads via JS after initial HTML — sometimes the only way to expose it to crawlers.

Edge render workarounds: A thin dynamic render at the CDN edge that transforms the SPA to HTML on-the-fly, without a separate pre-render service.

How to Migrate Off Dynamic Rendering

1. Audit what's actually failing in JS: Use Search Console URL Inspection and rendering comparisons. Many "JS SEO problems" are just missing SSR on critical pages.

2. Move critical pages to SSR or SSG: Home, landing, product, article pages first. Keep dashboards and logged-in areas as SPA.

3. Use a modern framework: Next.js / Nuxt / SvelteKit handle hybrid rendering out of the box.

4. Verify parity: After migration, crawl the new site with Screaming Frog and confirm HTML matches what the old pre-render produced.

5. Retire the pre-render service: Only after several weeks of clean Search Console coverage reports.
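Step 4's parity check can be partly automated. The sketch below compares the tags that matter most for SEO parity (title, meta description, canonical) between the old pre-render output and the new SSR output; it uses only the standard library, and the specific tag set is an assumption, not an exhaustive audit.

```python
from html.parser import HTMLParser

class SEOTagExtractor(HTMLParser):
    """Collect <title>, the meta description, and rel=canonical from HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.canonical = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name") == "description":
            self.meta_description = a.get("content", "")
        elif tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

def seo_parity(old_html: str, new_html: str) -> dict:
    """Return the critical tags that differ between pre-render and SSR output.

    An empty dict means the two versions agree on these tags."""
    def extract(html):
        p = SEOTagExtractor()
        p.feed(html)
        return {"title": p.title.strip(),
                "description": p.meta_description,
                "canonical": p.canonical}
    old, new = extract(old_html), extract(new_html)
    return {k: (old[k], new[k]) for k in old if old[k] != new[k]}
```

Run this over the URLs from a Screaming Frog crawl of both versions; any non-empty result is a page to investigate before retiring the pre-render service.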

Common Mistakes

Serving different content intentionally: Dynamic rendering is "same content, different form." Different content is cloaking and gets penalized.

Not updating the pre-render cache: Stale pre-renders feed bots old content, so you rank for yesterday's products.

Relying on pre-render for everything: Putting a pre-render in front of a site without an SSR strategy means you're always one outage from crawler blindness.

Ignoring non-Googlebot UA lists: Bing, Yandex, Baidu, and AI crawlers all have different UAs. Forgetting one means that bot gets the raw SPA shell and never sees your rendered content.

Using dynamic rendering for new builds: Don't. Use SSR/SSG in a modern framework instead.
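The UA-list mistake above lends itself to a lint-style check: given the bots you actually care about, verify your routing pattern matches every one. The narrow pattern and the bot list here are illustrative assumptions.

```python
import re

# Hypothetical edge config that only covers Google -- the mistake in question.
NARROW_PATTERN = re.compile(r"googlebot", re.IGNORECASE)

# Bots that also matter to this (hypothetical) site; any that fail to match
# the routing pattern get the raw SPA shell instead of pre-rendered HTML.
BOTS_TO_COVER = ["Googlebot/2.1", "bingbot/2.0", "YandexBot/3.0",
                 "Baiduspider/2.0", "GPTBot/1.0"]

def uncovered_bots(pattern, bots):
    """Return the user agents the routing pattern fails to match."""
    return [ua for ua in bots if not pattern.search(ua)]
```

Running `uncovered_bots(NARROW_PATTERN, BOTS_TO_COVER)` flags every bot except Googlebot, which is exactly the gap a Google-only UA list leaves open.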
