Serverless servers and the challenge of new React architecture
#Reactjs #web

Recently, Vercel tweeted an announcement for a new Vercel Functions feature:
Serverless servers: Node.js with in-function concurrency
- Node.js optimized concurrent execution model for Vercel Functions
- Private beta customer @madeonverse reduced their compute cost by 50%
- Available today as opt-in public beta
Intended or not, this tweet garnered a lot of attention and led to countless meme responses thanks to a seemingly nonsensical phrase, “serverless server”. On one hand, that’s just how social media works—people like to have fun and farm engagement. On the other hand, many have strong opinions on Vercel and its products, so any Vercel news is bound to generate some chatter. Regardless, it’s a bit unfortunate that a lot of people seem to have stopped at poking fun at the phrase and skipped the linked blog post explaining the new feature.
Here’s a summary of what the in-function concurrency solves: Each Vercel Function invocation spins up a new function instance, which is a Node.js process. The original setup was not able to take full advantage of concurrency in Node.js due to the isolated nature of each instance. This led to inefficiency when the job involves async calls—each function instance would simply sit idle while waiting for the promise to settle. The new in-function concurrency solves this by forwarding incoming function invocation requests to existing instances so that each instance can take on multiple requests concurrently.
This is clearly a big efficiency gain and many Vercel Function users are already seeing a significant reduction in compute cost. But to me, this new feature is also a telltale sign of how React’s new full-stack architecture vision needs more than a framework, and how a product like Vercel’s is almost inevitable.
The cornerstone of the new React architecture is that a React app consists of both a server part and a client part. React app developers are encouraged to keep data management to the server part and keep the client part lean. This change has at least two consequences: more traffic to the server (to render components on server) and more async calls on the server (from components on server). Think of a scenario where the app is mutating some server data and updating a view based on the latest data. Previously, this could be handled mostly on the client side by directly calling external API endpoints and handling view update. In the new React architecture, this would generally go through the server part of React tasked to both fetch data and render components using the data.
The contrast is more stark in the case of Next.js. App Router, the de facto standard implementation of the full-stack React architecture, has added support for not only server-side React but also nested routes, which encourages developers to build apps with more granular routes that mirror their data requirements. Combined with App Router’s every-route-starts-from-server approach, we likely get more server traffic, where every data fetch inside every component on the server needs to be resolved to reconstruct the whole server tree. More traffic to the server (to render components on server) and more async calls on the server (from components on server).
Next.js mitigates the impact by its sophisticated caching mechanism for data fetching as well as component rendering. But that can only go so far for dynamic apps that frequently update their data and views; to be truly scalable, it needs support from an infrastructure for distributed deployment. This is where Vercel enters.
So how does this story relate to serverless servers and in-function concurrency? Deploying Next.js on Vercel means using Vercel Functions to serve traffic:
On Vercel, you can server-render Next.js applications in either the Node.js runtime (default) with Serverless Functions or the Edge runtime with Edge Functions.
Before the App Router days, I suspect that the cost of the one-instance-per-invocation setup wasn’t so conspicuous, but the increased server needs of App Router brought attention to this limitation. Given how many apps deployed on Vercel are Next.js apps, and the growing popularity of App Router, I suspect that the dramatic reduction in cost from in-function concurrency comes from serving large Next.js apps on App Router.
The bigger story, of course, is the increasingly strong vertical alignment between React, Next.js and Vercel. No nefarious conspiracy is required for this alignment. Rather, it illustrates the challenge of React’s new architecture that seeks to embrace the server-client model in a particular way. When React apps were client-side only, the cost of its component orientation was largely hidden. There was a performance cost, to be sure, but users bore the brunt of it in UX, and its impact on developers was often indirect at best. As React moves to the server side, this cost starts to show up on bills. Covering both client and server sides is a framework’s job, but making it efficient seems to require support from the infrastructure. Can this be changed—perhaps by a new approach to full-stack React?
Update (2024-10-08): Note on Theo's feedback
During his coverage of the Serverless Server news on the latest livestream, Theo (@t3dotgg) kindly featured this post and offered his feedback. (Thank you!) Theo pointed out that the real gain from in-function concurrency lies in its ability to handle multiple slow async requests concurrently, which has little to do with rendering components on the server, a task that is synchronous in nature. This is entirely correct! In-function concurrency has no business optimizing synchronous tasks running on Vercel Function instances, such as pure rendering.
With that in mind, I believe there’s still more to the story: the full-stack React recommends moving data fetching from client-side React to server-side React, where we often `await` data fetching requests when rendering async components on the server. Here’s the first code example from the Next.js docs page on data fetching, verbatim:

```jsx
async function getData() {
  const res = await fetch("https://api.example.com/...");
  // The return value is *not* serialized
  // You can return Date, Map, Set, etc.

  if (!res.ok) {
    // This will activate the closest `error.js` Error Boundary
    throw new Error("Failed to fetch data");
  }

  return res.json();
}

export default async function Page() {
  const data = await getData();

  return <main></main>;
}
```
Rendering such components is no longer pure calculation. And with in-function concurrency, each `await` in the component body is an opportunity for Vercel to serve more requests using the same function instance. The longer each data fetching call takes, the greater the savings.

That said, I failed to acknowledge that Next.js has been capable of running user code on the server via API routes, `getServerSideProps`, and so on—for years! So if we’re migrating a Next.js app deployed on Vercel from the Pages Router (fetching data through API routes, etc.) to the App Router (fetching data from components on the server), then there wouldn’t be much difference in overall compute cost regardless of in-function concurrency. This likely invalidates the following statement I made: “I suspect that the dramatic reduction in cost from in-function concurrency comes from serving large Next.js apps on App Router.”

All in all, however, I believe that the basic point of this post still stands—that the full-stack React architecture and its data fetching pattern have meaningful implications for server cost at scale, and effectively addressing them needs more than a framework. It needs a dedicated deployment infrastructure like Vercel’s.