[pull] main from facebook:main #485

Merged: pull[bot] merged 1 commit into code:main from facebook:main, Feb 19, 2026

…yload (#35776)

## Summary

Follow-up to vercel/next.js#89823 with the
actual changes to React.

Replaces the `JSON.parse` reviver callback in `initializeModelChunk`
with a two-step approach: a plain `JSON.parse()` followed by a recursive
`reviveModel()` post-process (same as in the Flight Reply Server). This
cuts RSC chunk deserialization time by **~75%** (roughly 4x faster).

| Payload | Original (ms) | Walk (ms) | Time reduction |
|---------|---------------|-----------|----------------|
| Small (2 elements, 142B) | 0.0024 | 0.0007 | **72%** |
| Medium (~12 elements, 914B) | 0.0116 | 0.0031 | **73%** |
| Large (~90 elements, 16.7KB) | 0.1836 | 0.0451 | **75%** |
| XL (~200 elements, 25.7KB) | 0.3742 | 0.0913 | **76%** |
| Table (1000 rows, 110KB) | 3.0862 | 0.6887 | **78%** |

## Problem

`createFromJSONCallback` returns a reviver function passed as the second
argument to `JSON.parse()`. This reviver is called for **every key-value
pair** in the parsed JSON. While the logic inside the reviver is
lightweight, the dominant cost is the **C++ → JavaScript boundary
crossing** — V8's `JSON.parse` is implemented in C++, and calling back
into JavaScript for every node incurs significant overhead.
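
To make the per-node cost concrete, here is a small illustration (plain
JavaScript, not React code) of how many times a reviver fires even for a
tiny document:

```
// JSON.parse calls the reviver once for every key/value pair,
// including array indices and the root value.
let calls = 0;
const parsed = JSON.parse('{"a": 1, "b": [2, 3]}', (key, value) => {
  calls++;
  return value;
});
console.log(calls); // 5 calls: "a", "0", "1", "b", and the root ""
```

For an RSC payload with hundreds or thousands of nodes, each of those
calls is another C++ → JavaScript transition.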

Even a trivial no-op reviver `(k, v) => v` makes `JSON.parse` nearly **5x
slower** than bare `JSON.parse` without a reviver:

```
108 KB payload:
  Bare JSON.parse:    0.60 ms
  Trivial reviver:    2.95 ms  (+391%)
```
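
That overhead is straightforward to reproduce with a rough micro-benchmark
along these lines (a sketch, not the harness behind the numbers above;
absolute timings vary by machine and payload shape):

```
// Node.js sketch comparing bare JSON.parse against a trivial reviver.
const {performance} = require('node:perf_hooks');

// Synthetic payload of roughly 100 KB of nested objects and arrays.
const payload = JSON.stringify(
  Array.from({length: 2000}, (_, i) => ({
    id: i,
    name: 'row ' + i,
    tags: ['a', 'b', 'c'],
  }))
);

function bench(label, fn, iterations = 200) {
  fn(); // warm-up
  const start = performance.now();
  for (let i = 0; i < iterations; i++) fn();
  const avg = (performance.now() - start) / iterations;
  console.log(label, avg.toFixed(3), 'ms/iter');
}

bench('bare JSON.parse', () => JSON.parse(payload));
bench('trivial reviver', () => JSON.parse(payload, (k, v) => v));
```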

## Change

Replace the reviver with a two-step process:

1. `JSON.parse(resolvedModel)` — parse the entire payload in C++ with no
callbacks
2. `reviveModel` — recursively walk the resulting object in pure
JavaScript to apply RSC transformations

The `reviveModel` function includes additional optimizations over the
original reviver (a simplified sketch follows the list):
- **Short-circuits plain strings**: only calls `parseModelString` when
the string starts with `$`, skipping the vast majority of strings (class
names, text content, etc.)
- **Stays entirely in JavaScript** — no C++ boundary crossings during
the walk
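
Putting the two steps together, a simplified sketch of the approach looks
like this (`parseModel` is a hypothetical wrapper name for illustration;
the actual implementation in React's Flight Client and Flight Reply Server
additionally handles chunk references, cycles, and other details this
sketch omits):

```
// Step 1: bare JSON.parse with no reviver, so parsing stays in C++.
// Step 2: revive the parsed tree recursively in pure JavaScript.
function parseModel(response, resolvedModel) {
  const json = JSON.parse(resolvedModel);
  return reviveModel(response, {'': json}, '', json);
}

function reviveModel(response, parentObj, parentKey, value) {
  if (typeof value === 'string') {
    // Short-circuit plain strings: only "$"-prefixed strings encode RSC
    // constructs and need parseModelString; everything else passes through.
    return value[0] === '$'
      ? parseModelString(response, parentObj, parentKey, value)
      : value;
  }
  if (typeof value === 'object' && value !== null) {
    // Walk arrays and plain objects in place; for..in covers array
    // indices as well as object keys.
    for (const key in value) {
      value[key] = reviveModel(response, value, key, value[key]);
    }
  }
  return value;
}
```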

## Results

You can find the related applications in the Next.js PR
(vercel/next.js#89823). I've been testing this on Next.js applications.

### Table as Server Component with 1000 items

Before:

```
    "min": 13.782875000000786,
    "max": 22.23400000000038,
    "avg": 17.116868530000083,
    "p50": 17.10766700000022,
    "p75": 18.50787499999933,
    "p95": 20.426249999998618,
    "p99": 21.814125000000786
```

After:

```
    "min": 10.963916999999128,
    "max": 18.096083000000363,
    "avg": 13.543286884999988,
    "p50": 13.58350000000064,
    "p75": 14.871791999999914,
    "p95": 16.08429099999921,
    "p99": 17.591458000000785
```

### Table as Client Component with 1000 items

Before:

```
    "min": 3.888875000000553,
    "max": 9.044959000000745,
    "avg": 4.651271475000067,
    "p50": 4.555749999999534,
    "p75": 4.966624999999112,
    "p95": 5.47754200000054,
    "p99": 6.109499999998661
```

After:

```
    "min": 3.5986250000005384,
    "max": 5.374291000000085,
    "avg": 4.142990245000046,
    "p50": 4.10570799999914,
    "p75": 4.392041999999492,
    "p95": 4.740084000000934,
    "p99": 5.1652500000000146
```

### Nested Suspense

Before:

```
  Requests:  200
  Min:       73ms
  Max:       106ms
  Avg:       78ms
  P50:       77ms
  P75:       80ms
  P95:       85ms
  P99:       94ms
```

After:

```
  Requests:  200
  Min:       56ms
  Max:       67ms
  Avg:       59ms
  P50:       58ms
  P75:       60ms
  P95:       65ms
  P99:       66ms
```

### Even more nested Suspense (double-level Suspense)

Before:

```
  Requests:  200
  Min:       159ms
  Max:       208ms
  Avg:       169ms
  P50:       167ms
  P75:       173ms
  P95:       183ms
  P99:       188ms
```

After:

```
  Requests:  200
  Min:       125ms
  Max:       170ms
  Avg:       134ms
  P50:       132ms
  P75:       138ms
  P95:       148ms
  P99:       160ms
```

## How did you test this change?

Ran it across many Next.js benchmark applications.

The entire Next.js test suite passes with this change.

---------

Co-authored-by: Hendrik Liebau <mail@hendrik-liebau.de>

pull[bot] merged commit f247eba into code:main on Feb 19, 2026.