The Technical SEO Checklist for JavaScript Framework Sites (React, Next.js, Nuxt, SvelteKit) | AuditMySite
JavaScript Frameworks and Google: The 2026 Reality
Google can render JavaScript. That's been true since 2019. But "can render" and "will render efficiently, consistently, and in a timely manner" are very different statements. The reality in 2026 is nuanced: Googlebot's rendering queue still introduces a delay of hours to days between crawling and rendering JavaScript-heavy pages. During that delay, your content is effectively invisible.
A study by Onely (updated January 2026) found that 12% of JavaScript-rendered content is never indexed due to rendering errors, timeouts, or resource limits. For sites with thousands of pages, that's hundreds of pages Google simply never sees.
This checklist covers every technical SEO consideration for sites built on React, Next.js, Nuxt, SvelteKit, Angular, and similar frameworks.
Rendering Strategy: The Single Most Important Decision
Your Options (Ranked by SEO Safety)
- Static Site Generation (SSG): Pages pre-rendered at build time. Perfect SEO — Google gets plain HTML. Best for: content sites, blogs, documentation. Downside: build times grow with page count.
- Incremental Static Regeneration (ISR): Pages statically generated but revalidated at intervals. Nearly perfect SEO with fresher content. Best for: e-commerce, sites with frequent updates.
- Server-Side Rendering (SSR): Pages rendered on each request. Good SEO — Google gets complete HTML. Best for: personalized pages, real-time data. Downside: server cost, TTFB impact.
- Client-Side Rendering (CSR): Pages rendered entirely in the browser. Risky for SEO. Google must use the rendering queue. Best for: authenticated dashboards, apps behind login. Never use for public-facing content pages.
Framework-Specific Recommendations
- Next.js: Use App Router with React Server Components. Default to SSG/ISR. Use `generateStaticParams` for dynamic routes.
- Nuxt 3: Enable hybrid rendering. Use `routeRules` to set SSR/SSG per route pattern.
- SvelteKit: Use `prerender = true` for content pages. Use `+page.server.ts` for SSR when needed.
- Angular: Use Angular Universal for SSR. Consider Angular's static prerendering with `ng build --prerender`.
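As a minimal sketch of the Next.js recommendation, a dynamic route can declare every page to pre-render at build time with `generateStaticParams`, plus an ISR revalidation interval. The `posts` array here is an illustrative stand-in for your real CMS or database query:

```typescript
// app/blog/[slug]/page.tsx — sketch, not a drop-in file.
// `posts` is a placeholder for your actual content source.
type Post = { slug: string; title: string };

const posts: Post[] = [
  { slug: "js-seo-checklist", title: "JS SEO Checklist" },
  { slug: "rendering-strategies", title: "Rendering Strategies" },
];

// ISR: re-generate each page at most once per hour.
export const revalidate = 3600;

// Next.js calls this at build time to learn which slugs to pre-render.
export async function generateStaticParams() {
  return posts.map((post) => ({ slug: post.slug }));
}
```

Every slug returned here ships as plain HTML, so Google never needs the rendering queue for these pages.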
Crawl Budget Optimization
JavaScript sites waste crawl budget in unique ways:
Problem: Infinite Scroll Creating Infinite URLs
If your infinite scroll appends query parameters or hash fragments, Google may try to crawl thousands of URL variations. Fix: use rel=canonical on paginated views, implement proper pagination with /page/2 URLs, and add pagination to your sitemap.
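One way to sketch that fix: compute a self-referencing canonical plus prev/next URLs for each paginated view. The `/page/N` route shape and `baseUrl` parameter are illustrative assumptions:

```typescript
// Sketch: self-canonical pagination URLs for crawlable /page/N routes.
type PaginationLinks = { canonical: string; prev: string | null; next: string | null };

function paginationLinks(baseUrl: string, page: number, totalPages: number): PaginationLinks {
  // Page 1 lives at the base URL itself, not /page/1.
  const urlFor = (p: number) => (p === 1 ? baseUrl : `${baseUrl}/page/${p}`);
  return {
    canonical: urlFor(page), // each paginated view canonicalizes to itself
    prev: page > 1 ? urlFor(page - 1) : null,
    next: page < totalPages ? urlFor(page + 1) : null,
  };
}
```

Emit the `canonical` value in a `<link rel="canonical">` tag server-side, and keep the infinite scroll purely as a client-side enhancement on top of these real URLs.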
Problem: Client-Side Routing Creating Uncrawlable Paths
Single-page apps using hash-based routing (example.com/#/about) are not crawlable. Even history API-based routing (example.com/about) requires server-side handling to return proper HTML for each route. Test by disabling JavaScript and loading every key URL — if you get a blank page, Google likely does too.
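That disable-JavaScript test is easy to automate. A small sketch, assuming Node 18+ for the global `fetch`: request each route without executing any JavaScript — roughly what a crawler's pre-render pass sees — and fail if expected content is missing from the raw HTML:

```typescript
// Pure check, split out so it can be unit-tested without network access.
function looksServerRendered(status: number, html: string, mustContain: string): boolean {
  return status === 200 && html.includes(mustContain);
}

// Fetches raw HTML (no JS execution) and fails loudly if the content
// the page should ship server-side isn't there.
async function assertServerRendered(url: string, mustContain: string): Promise<void> {
  const res = await fetch(url); // global fetch, Node 18+
  const html = await res.text();
  if (!looksServerRendered(res.status, html, mustContain)) {
    throw new Error(`${url} does not server-render "${mustContain}"`);
  }
}
```

Run it against every key route in CI; pick `mustContain` strings that are unique visible copy for each page, not shared boilerplate.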
Problem: API Calls Failing During Render
If your page makes 15 API calls to render, and one fails, Google might index a partially-rendered page — or nothing at all. Solutions:
- Implement graceful error boundaries that still render meaningful content
- Set reasonable timeouts (Google's renderer has a ~5-second budget for API calls)
- Pre-fetch and inline critical data during SSR
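The timeout-and-fallback idea can be sketched framework-agnostically: wrap each render-critical call so a slow or failing API resolves to safe fallback content instead of blanking the page. The 4-second default below is an assumption, chosen to stay under the ~5-second budget mentioned above:

```typescript
// Sketch: resolve render-critical data with a timeout and a safe fallback,
// so one slow or failing API never blanks the whole page during SSR.
async function fetchWithFallback<T>(
  loader: () => Promise<T>,
  fallback: T,
  timeoutMs = 4000, // assumption: stay under Googlebot's ~5s budget
): Promise<T> {
  const timeout = new Promise<T>((resolve) =>
    setTimeout(() => resolve(fallback), timeoutMs),
  );
  try {
    // Whichever settles first wins; a hung API loses to the timeout.
    return await Promise.race([loader(), timeout]);
  } catch {
    return fallback; // render meaningful content even if the call throws
  }
}
```

Pair this with error boundaries in the component tree so the fallback data still renders a complete, indexable page.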
Structured Data in JavaScript Apps
Schema.org structured data must be present in the initial HTML response for maximum reliability. While Google says it processes structured data from JavaScript-rendered content, our testing shows 27% lower rich result appearance rates for client-rendered schema vs. server-rendered.
Best practices:
- Inject JSON-LD in the `<head>` during SSR
- Use framework-specific solutions: `next-seo` for Next.js, `useSeoMeta` for Nuxt
- Test with Google's Rich Results Test (which renders JavaScript) AND by viewing source (which doesn't)
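A sketch of the server-side approach: build the JSON-LD string in plain TypeScript so it can be emitted into the `<head>` during SSR. The `Article` fields here are illustrative:

```typescript
// Sketch: generate an Article JSON-LD payload server-side so it ships
// in the initial HTML response, not via client-side injection.
type ArticleSchema = { headline: string; datePublished: string; author: string };

function articleJsonLd(article: ArticleSchema): string {
  return JSON.stringify({
    "@context": "https://schema.org",
    "@type": "Article",
    headline: article.headline,
    datePublished: article.datePublished,
    author: { "@type": "Person", name: article.author },
  });
}
```

In Next.js, render the result inside a `<script type="application/ld+json">` tag during SSR (e.g. via `dangerouslySetInnerHTML`); Nuxt's `useHead` can emit an equivalent script entry.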
The Complete Technical SEO Checklist
Meta Tags and Head Management
- ☐ Unique title tag per page (server-rendered in HTML)
- ☐ Unique meta description per page (server-rendered)
- ☐ Canonical URL on every page (absolute URL, server-rendered)
- ☐ Open Graph and Twitter Card tags (server-rendered)
- ☐ Hreflang tags for multi-language sites (server-rendered)
- ☐ Robots meta tag where needed (noindex for auth pages, filters, etc.)
Performance and Core Web Vitals
- ☐ JavaScript bundle < 200KB gzipped for initial load
- ☐ Code splitting by route (each page loads only its JS)
- ☐ Tree shaking enabled and verified (check bundle analyzer)
- ☐ Images optimized with next/image or equivalent (WebP/AVIF, lazy loading)
- ☐ Font loading optimized (preload, font-display: swap or optional)
- ☐ LCP element server-rendered and visible without JS
Crawlability
- ☐ All public URLs return full HTML from server (disable JS test)
- ☐ XML sitemap includes all indexable URLs
- ☐ Internal links use standard `<a href>` tags (not onClick navigation)
- ☐ No JavaScript-dependent redirects for critical paths
- ☐ Robots.txt allows crawling of JS, CSS, and image resources
- ☐ No accidental noindex on important pages
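The `<a href>` item above can be smoke-tested by scanning server-rendered HTML for real anchors. A naive regex sketch — fine for smoke tests, not a substitute for a full HTML parser:

```typescript
// Sketch: collect <a href> targets from raw server HTML, to confirm
// internal links are real anchors a crawler can follow.
function extractLinks(html: string): string[] {
  const links: string[] = [];
  const anchor = /<a\s[^>]*href=["']([^"']+)["']/gi;
  let match: RegExpExecArray | null;
  while ((match = anchor.exec(html)) !== null) {
    links.push(match[1]); // the captured href value
  }
  return links;
}
```

If a page's navigation produces an empty list here, its links are probably onClick handlers that Google will never follow.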
Indexing Verification
- ☐ `site:` search shows expected page count
- ☐ Google Search Console Index Coverage shows no unexpected exclusions
- ☐ URL Inspection tool shows rendered page matches expected content
- ☐ No soft 404s (pages that return 200 but show error/empty content)
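Soft 404s can be caught with a crude heuristic over crawl results: a 200 status paired with an error message or near-empty body. The length threshold and marker phrases below are assumptions to tune against your own app's error copy:

```typescript
// Sketch heuristic: flag pages that return 200 but render an error or
// empty shell. Threshold and markers are assumptions — tune per site.
function isSoftNotFound(status: number, html: string): boolean {
  if (status !== 200) return false; // a real 4xx/5xx is already correct
  const text = html.replace(/<[^>]+>/g, " ").trim(); // strip tags, keep copy
  const markers = ["not found", "no results", "something went wrong"];
  const lower = text.toLowerCase();
  return text.length < 80 || markers.some((m) => lower.includes(m));
}
```

Run it over a full crawl export; anything it flags should either return a real 404/410 or render genuine content.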
If you're building a brand alongside your technical implementation, ensure your brand identity translates cleanly into the framework's component system — inconsistent brand implementation across dynamically-rendered pages is a common issue we flag in audits.
Testing Tools and Workflow
Set up automated SEO testing in your CI/CD pipeline:
- Pre-deploy: Lighthouse CI (performance + SEO audits on every PR)
- Post-deploy: Screaming Frog crawl comparing pre/post page counts and status codes
- Continuous: Google Search Console API monitoring for index coverage drops
- Monthly: Full site crawl with JavaScript rendering enabled (Screaming Frog or Sitebulb)
For local businesses like Sacramento contractors, the framework choice itself matters — a WordPress site with good hosting will outperform a poorly-configured Next.js site for local SEO every time. Choose complexity only when it solves a real problem.
The Bottom Line
JavaScript frameworks are not inherently bad for SEO. But they require more technical SEO discipline than traditional server-rendered sites. The rule is simple: if Google can't see your content without executing JavaScript, you have a problem. Test, measure, and verify — assumptions about "Google can render JavaScript" are the most expensive assumptions in modern SEO.
Ready to audit your site?
Run a free SEO scan and get actionable recommendations in seconds.
Start Free Scan →