Next.js: Stream rendering to reduce TTFB and CPU load

Created on 19 Feb 2017  ·  36 Comments  ·  Source: vercel/next.js

I suggest stream-rendering pages larger than 50KB (a hypothetical threshold to amortize stream overhead) to reduce TTFB and CPU load.

It would supersede https://github.com/zeit/next.js/pull/767. Preact does not support streaming (https://github.com/developit/preact/issues/23#issuecomment-226954130).

All 36 comments

One thing to mention: stream rendering usually won't improve CPU usage much, since the amount of work to be done is the same. But it will reduce the response time.

It's a pretty good idea to provide a way to customize the SSR rendering system, but I think for now we'll stick with React's renderToString() method by default.

This is something we could do after 2.0.

Stream rendering usually won't improve CPU usage much, since the amount of work to be done is the same.

_from aickin/react-dom-stream_

One call to ReactDOM.renderToString can dominate the CPU and starve out other requests. This is particularly troublesome on servers that serve a mix of small and large pages.

Wouldn't streaming appreciably reduce memory allocation and CPU usage for large pages, since it is both asynchronous and partial?
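The starvation argument can be shown in a toy form (not React code — `syncWork`/`chunkedWork` are stand-ins for a renderer): one long synchronous render blocks the event loop, while the same work split into chunks lets other requests run in between.

```javascript
// Synchronous "render": the event loop gets no chance to run anything else
// until the whole loop finishes.
function syncWork(units) {
  const done = [];
  for (let i = 0; i < units; i++) done.push(i); // never yields
  return done;
}

// Chunked "render": after each unit of work, yield to the event loop with
// setImmediate so pending requests can be served in between.
function chunkedWork(units, onDone) {
  const done = [];
  let i = 0;
  (function step() {
    done.push(i++);
    if (i < units) setImmediate(step);
    else onDone(done);
  })();
}

// A "small request" competing for the event loop.
let smallRequestServed = false;
setImmediate(() => { smallRequestServed = true; });

chunkedWork(5, () => {
  // The small request was served *between* chunks, not after the render.
  console.log("small request served:", smallRequestServed);
});
```

With `syncWork(5)` instead, the small request would only run after the entire render completed — exactly the "large page starves small pages" problem quoted above.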

Thought this was along the same lines. Has anyone tried https://github.com/FormidableLabs/rapscallion ?

It provides a streaming interface so that you can start sending content to the client immediately.

Other features from the docs:

  • Rendering is asynchronous and non-blocking.
  • Rapscallion is roughly 50% faster than renderToString.
  • It provides a streaming interface so that you can start sending content to the client immediately.
  • It provides a templating feature, so that you can wrap your component's HTML in boilerplate without giving up benefits of streaming.
  • It provides a component caching API to further speed-up your rendering.
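The caching point is worth illustrating. The sketch below is not rapscallion's actual API — just a toy memoized renderer keyed by a hypothetical component name plus its props — to show why caching repeated fragments speeds up SSR:

```javascript
// Hypothetical illustration of component-level render caching: memoize the
// rendered HTML of a "component" by a key derived from its props, so
// repeated renders across requests skip the expensive work entirely.
const renderCache = new Map();
let renderCount = 0; // tracks how often we actually do the work

function renderComponent(name, props) {
  renderCount++; // stand-in for an expensive render
  return `<div class="${name}">${props.text}</div>`;
}

function cachedRender(name, props) {
  const key = name + ":" + JSON.stringify(props);
  if (!renderCache.has(key)) {
    renderCache.set(key, renderComponent(name, props));
  }
  return renderCache.get(key);
}
```

A shared header or footer rendered on every request hits the cache after the first render, which is why component caching can matter more than raw renderer speed.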

Added an example of Rapscallion in #2279... can confirm that Rapscallion + Next is insane. Streamed/promise-based render is awesome, but Component-level caching is a game-changer for us... :godmode:

Now that React 16 has its own renderToNodeStream, it would be a huge advantage for next.js to add an option to use it instead of renderToString. What do you think, @timneutkens?

It's on our list of things to add already 👍

Any news about this?

Any news?

Next.js needs to expose a custom render hook (with renderToString as the default renderer) so that users can plug in their own async renderer, I think.
The lack of this feature forced me to use Razzle to get an async renderer :( (its DX is nowhere near Next.js's, but I had to accept that to move on).

I love everything about Next.js except that it lacks two things:

  • A custom async renderer.
  • Custom babel config for both server and client.

Is there any roadmap or plan for streaming rendering support? I'd really like to have this in Next.js.

This is pending the React team implementing React Fizz / their plan for it.

@timneutkens Is there an issue or PR to track here?

From Facebook's blog post, published on August 8th 2019
https://reactjs.org/blog/2019/08/08/react-v16.9.0.html

An Update on Server Rendering
We have started the work on the new Suspense-capable server renderer, but we don’t expect it to be ready for the initial release of Concurrent Mode. This release will, however, provide a temporary solution that lets the existing server renderer emit HTML for Suspense fallbacks immediately, and then render their real content on the client. This is the solution we are currently using at Facebook ourselves until the streaming renderer is ready.

For anyone still waiting on server streaming support :)

Is there any update or any other method to implement renderToNodeStream in next.js ?

Is there any update?

<3

Any update?

@StarpTech I'd looked a bit into this (curious about this feature as well!) and it looks like the React team is working on something called react-flight, which will probably be the base for the streaming solution we are waiting for here :)

react-flight:

This is an experimental package for consuming custom React streaming models.

The relevant PRs that shine some light on the inner workings, as interpreted by me (not an expert in any of this 🙈):

#17285: Basic API for Flight; the server should be able to stream everything as a string, but leave placeholders for the chunks that are asynchronously resolved on the server. An incomplete, yet interesting, syntax for how React would know from a stream what data type it actually represents is over here.

#17398: A more recent PR that adds an API for Chunks, so (if you're feeling lucky) you could try that part out yourself. Not sure how everything would come together, but nevertheless I'm happy to see all this work being done :)

_This might be slightly off-topic, but hopefully interesting for people subscribing to this issue :)_

@pepf thanks for the info!

Hm. Thank you all, interesting info. I'm just wondering why Next.js should wait for React to support SSR for Suspense and related features, rather than just using streamAsString now?

@arunoda I think it will reduce memory consumption, very important for low memory lambda functions or Cloudflare Workers.

Any news about this?

Any news?

Any news guys? :P

Given that react suspense feels like it may never come out, is there any way we can revisit this? I'm seeing initial render times of 800-1000ms on fairly generic, low-content pages (that are rendered from a serverless function on vercel). Streaming the HTML in theory could bump up that initial point of contact and lead to much much faster perceived first-load ux.


Is that a limitation of Vercel rather than NextJS? Does Vercel fall victim to the same sort of cold-startup that Lambda does? My team runs a bunch of content heavy sites and our timings are under 50ms for the whole request. I don't think streaming is going to be a silver bullet here.

I don't expect it to be a silver bullet, but it certainly would be a welcome improvement.

50ms sounds minuscule and relatively insignificant to optimize compared to the mentioned cold-start times, but it is not when rendering at the edge with something like Cloudflare Workers: 50ms is as much as the ping to the nearest Cloudflare edge location, or at least half of it. Cloudflare Workers respond very quickly from a cold start, typically in under 5 milliseconds; in contrast, both Lambda and Lambda@Edge functions can take over a second.
I know this doesn't make a stronger case for the Vercel-employed Next.js developers (Vercel builds its own CDN) to prioritize this.

Is there any documentation or repositories where people have successfully deployed next.js to cloudflare workers? I did some googling and couldn’t find much on it. That would be amazing.


50ms sounds minuscule and relatively insignificant to optimize compared to the mentioned cold-start times

To clarify, I wasn't complaining about my 50ms response times. I was just pointing out that Next.js's SSR is relatively performant even north of 3k requests/minute for content-heavy pages, so Switz's problem might lie elsewhere.

Is there any documentation or repositories where people have successfully deployed next.js to cloudflare workers

I would also be interested in that. We're currently running in Fargate but pushing our app to the edge would be the next logical step.

I've made every possible improvement to my HTML and the server response time is still extremely high! :(

@huggler You can combine this example with cacheable-response. You can use Redis (or an in-memory cache, for example) to store the HTML. It improves server response time.
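The idea behind that suggestion can be sketched in a few lines. This is not cacheable-response's actual API — just a minimal in-memory TTL cache keyed by URL (a Redis client could replace the Map), so repeat requests skip SSR entirely:

```javascript
// Keep rendered HTML in an in-memory Map with a time-to-live, so only the
// first request (and requests after expiry) pay the SSR cost.
const htmlCache = new Map();

function getCachedHtml(url, renderFn, ttlMs, now = Date.now()) {
  const hit = htmlCache.get(url);
  if (hit && now - hit.at < ttlMs) return hit.html; // fresh hit: skip render
  const html = renderFn(url); // expensive SSR happens only on a miss
  htmlCache.set(url, { html, at: now });
  return html;
}
```

In a handler, `renderFn` would wrap the actual Next.js render call; `now` is injectable here only to make the expiry behavior easy to exercise.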

Is there any documentation or repositories where people have successfully deployed next.js to cloudflare workers? I did some googling and couldn’t find much on it. That would be amazing.

@switz @tills13 Did you check out https://fab.dev/ ? We played with it in early 2020, when it was in a pre-release state, but they seem to have come pretty far since. One of the limitations back then was Cloudflare itself, but things might have changed by now.

Haven't looked in a while. Would have to reevaluate. Last time I looked, there were some pretty serious tradeoffs.

I am also keeping my eye on https://github.com/flareact/flareact.

Any update on this?
