JavaScript Is Costing You Social Traffic

By ShareScan.io · 3 min read · Category: Technical SEO

You shipped the campaign. The tags are perfect. The traffic is ready. Then your product page gets shared — and the preview shows your homepage. The problem isn’t marketing. It’s rendering.


You’ve set up all your tags, carefully curated to match your SEO strategy.
You’ve aligned titles and descriptions with paid campaigns.
You’ve launched the ads.
You’re waiting for traffic to pour in.

Then someone shares your product page.

And the preview shows the homepage title.
Or no image.
Or a generic fallback description.

Nothing breaks in the browser.
But the share preview is wrong.

This is not a marketing problem.
It’s a rendering problem.

And in most cases, the root cause is simple:

Your Open Graph metadata depends on client-side JavaScript.


The Hidden Assumption Behind Client Rendering

Modern frontend stacks assume that everything runs in a browser.

But social platforms don’t operate like browsers.

When a link is shared on Facebook, LinkedIn, Slack, or X, the platform:

  1. Fetches your URL.
  2. Parses the raw HTML.
  3. Extracts Open Graph tags from the <head>.
  4. Stops.

There is usually:

  • No hydration.
  • No React lifecycle.
  • No Vue reactivity.
  • No waiting for async data.
  • Often no JavaScript execution at all.

If your og:title, og:description, or og:image only appear after client rendering, they are invisible to the crawler.
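You can approximate the crawler's view with a plain string scan of the raw HTML response — no JavaScript execution, no DOM. A minimal sketch (the `extractOgTags` helper is hypothetical, for illustration; real crawlers use full HTML parsers, and this regex assumes the common `property`-before-`content` attribute order):

```typescript
// Approximate what a social crawler sees: only og:* tags present in raw HTML.
function extractOgTags(html: string): Record<string, string> {
  const tags: Record<string, string> = {};
  const re = /<meta\s+property="(og:[^"]+)"\s+content="([^"]*)"\s*\/?>/g;
  let m: RegExpExecArray | null;
  while ((m = re.exec(html)) !== null) {
    tags[m[1]] = m[2];
  }
  return tags;
}

// A CSR shell: no og:* tags exist in the raw response.
const csrShell =
  '<html><head><title>My App</title></head><body><div id="root"></div></body></html>';
console.log(extractOgTags(csrShell)); // {}

// An SSR response: the tag is present in the first byte.
const ssrPage =
  '<head><meta property="og:title" content="Red Running Shoes" /></head>';
console.log(extractOgTags(ssrPage)["og:title"]); // Red Running Shoes
```

Run this against your own pages' raw responses (e.g., via `curl`) and you see exactly what the crawler sees.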

What Happens in a Client-Rendered App

A typical SPA response looks like this:

```html
<html>
  <head>
    <title>My App</title>
    <script src="/bundle.js"></script>
  </head>
  <body>
    <div id="root"></div>
  </body>
</html>
```

The actual metadata is injected later:

  • Router resolves route
  • Data is fetched
  • State updates
  • Head manager runs
  • Meta tags are injected

By the time your <meta property="og:title"> exists, the crawler has already left.

This leads to:

  • Homepage metadata appearing on product pages
  • Missing og:image
  • Blank descriptions
  • Inconsistent previews across platforms

And because platforms cache aggressively, the wrong preview may persist long after you “fix” the page.

Why SSR Fixes This Structurally

With server-side rendering (SSR), the HTML response already contains the final metadata:

```html
<head>
  <title>Red Running Shoes</title>
  <meta property="og:title" content="Red Running Shoes" />
  <meta property="og:description" content="Ultra-lightweight performance shoes." />
  <meta property="og:image" content="https://example.com/images/red-shoe.jpg" />
</head>
```

No waiting.
No JavaScript dependency.
No race conditions.

The crawler receives the same metadata a user would see after full render.

That makes SSR deterministic.
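In practice, "deterministic" means the server builds the `<head>` from data it already holds before responding. A framework-agnostic sketch — the `Product` type and `escapeHtml` helper are assumptions for illustration, not any specific framework's API:

```typescript
interface Product {
  title: string;
  description: string;
  imageUrl: string;
}

// Escape text before interpolating it into HTML attributes.
function escapeHtml(s: string): string {
  return s
    .replace(/&/g, "&amp;")
    .replace(/"/g, "&quot;")
    .replace(/</g, "&lt;")
    .replace(/>/g, "&gt;");
}

// The server renders metadata from data it already has -- no client JS involved.
function renderHead(p: Product): string {
  return [
    `<title>${escapeHtml(p.title)}</title>`,
    `<meta property="og:title" content="${escapeHtml(p.title)}" />`,
    `<meta property="og:description" content="${escapeHtml(p.description)}" />`,
    `<meta property="og:image" content="${escapeHtml(p.imageUrl)}" />`,
  ].join("\n");
}

console.log(renderHead({
  title: "Red Running Shoes",
  description: "Ultra-lightweight performance shoes.",
  imageUrl: "https://example.com/images/red-shoe.jpg",
}));
```

Whatever the crawler's capabilities, the tags are in the response body before any script could run.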

The Race Condition Problem in CSR

Even in environments where crawlers attempt limited JavaScript execution, you still face:

  • Slow API responses
  • Conditional rendering logic
  • Client-side route transitions
  • Lazy-loaded head managers
  • Timeout constraints

If metadata updates occur after the crawler times out, the preview fails.

This is not theoretical — it is extremely common in ecommerce, content-heavy SPAs, and campaign landing pages.
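The timing failure can be simulated directly. Below, a hypothetical crawler snapshots the page when its time budget expires, while the SPA injects its meta tag only after a slow data fetch — the millisecond values are illustrative, not any platform's real limits:

```typescript
type Head = { metas: string[] };

const sleep = (ms: number) => new Promise<void>((r) => setTimeout(r, ms));

// SPA: the meta tag appears only after data arrives (simulated 300 ms API call).
async function clientRender(head: Head): Promise<void> {
  await sleep(300); // slow API response
  head.metas.push('<meta property="og:title" content="Red Running Shoes" />');
}

// Crawler: snapshots whatever exists when its budget expires.
async function crawl(head: Head, budgetMs: number): Promise<string[]> {
  await sleep(budgetMs);
  return [...head.metas];
}

async function main() {
  const head: Head = { metas: [] };
  void clientRender(head); // fire and forget, like a hydrating SPA
  const snapshot = await crawl(head, 100);
  console.log(snapshot); // [] -- the crawler left before the tag existed
}
main();
```

The page eventually renders correctly for users; the crawler simply wasn't there to see it.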

Caching Makes Failures Sticky

Platforms cache previews.

If the first crawl fails to extract correct metadata, that broken snapshot may persist.

For example:

  • Facebook caches link previews aggressively.
  • LinkedIn does the same.
  • Slack may require manual refresh.

SSR reduces the probability of first-fetch failure dramatically.

You Don’t Need Full SSR — Just Deterministic Metadata

A common objection:

“Full SSR is expensive.”

You don’t necessarily need to SSR the entire page.

You need the <head> metadata to be correct in the first byte response.

This can be achieved via:

  • Full SSR (e.g., TanStack Start, Next.js, Nuxt, Remix)
  • Static generation
  • Edge rendering
  • Metadata-only server rendering
  • Bot-aware rendering layer

The key principle:

Open Graph is an HTML contract.

If the metadata is not present in the server response, you are relying on undefined crawler behavior.
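The lightest option above — a bot-aware rendering layer — can be as small as a user-agent check that routes crawlers to pre-rendered HTML. A sketch under stated assumptions: the crawler list is partial and illustrative, and `handleRequest` stands in for whatever server or edge handler you actually use:

```typescript
// Partial list of social crawler user-agent substrings (illustrative, not exhaustive).
const BOT_SIGNATURES = [
  "facebookexternalhit",
  "linkedinbot",
  "slackbot",
  "twitterbot",
];

function isSocialCrawler(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return BOT_SIGNATURES.some((sig) => ua.includes(sig));
}

// Hypothetical handler: serve full metadata to bots, the SPA shell to everyone else.
function handleRequest(userAgent: string): string {
  return isSocialCrawler(userAgent)
    ? '<head><meta property="og:title" content="Red Running Shoes" /></head>'
    : '<div id="root"></div>'; // CSR shell; browsers hydrate as usual
}

console.log(isSocialCrawler("facebookexternalhit/1.1")); // true
console.log(isSocialCrawler("Mozilla/5.0 (Macintosh)")); // false
```

Serving different HTML by user agent has cloaking risks if the content diverges, so keep the bot response's metadata identical to what users eventually see.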

Reliability Comparison

Rendering Model    OG Reliability    Typical Failure Mode
Pure CSR           Low               JS not executed, async timing
Hybrid             Medium            Route-level inconsistencies
Full SSR           High              Misconfiguration only
Static             Very High         Rare caching issues

The Bottom Line

Client-rendered metadata works accidentally.
Server-rendered metadata works predictably.

If social traffic matters — especially for ecommerce and campaign-driven pages — deterministic HTML is not optional.

It’s infrastructure.

TRY SHARESCAN

Run a free 10-URL scan on your pages

Paste a few URLs (or a domain/sitemap) and run the same metadata checks we use for social preview QA and regression monitoring.

See a sample report

No signup for your first scan. Open the report, review issues, then connect Slack if you want alerts.


Up to 10 URLs. We will dedupe and validate automatically.