Introducing Efficient Valkey-Based Caching for Next.js


Next.js is a popular React framework that enables you to build both static and server-side rendered applications. It’s an excellent choice for creating websites that need to be fast and SEO-friendly.
However, like all frontend frameworks, it faces significant performance challenges, especially when dealing with server-side rendered (SSR) applications.
SSR is inherently slow on Node.js because it’s a CPU-intensive operation: during rendering, the event loop gets blocked, preventing the runtime from accepting and processing other incoming requests.
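You can see this effect with plain Node.js, no Next.js required. In the sketch below, a synchronous busy loop stands in for a CPU-bound render, and a timer that should fire after 10 ms represents another incoming request:

```javascript
// A busy loop standing in for a CPU-bound SSR render.
function renderSync(ms) {
  const end = Date.now() + ms;
  while (Date.now() < end) {} // blocks the event loop
}

const start = Date.now();

// This timer represents another incoming request.
setTimeout(() => {
  const delay = Date.now() - start;
  // Scheduled for 10 ms, but it cannot fire until the
  // synchronous "render" releases the event loop.
  console.log(`timer fired after ~${delay} ms`);
}, 10);

renderSync(200); // the "render" holds the thread for 200 ms
```

The timer fires only after the full 200 ms block, even though it was scheduled for 10 ms: while one request renders, every other request waits.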
At Platformatic, we are deeply committed to helping users leverage Node.js in the most efficient way possible.
Last month, we introduced multiple workers for Watt, resulting in a substantial performance boost. Just yesterday, we introduced client-side caching with seamless invalidation, out-of-the-box.
Today, we're taking it a step further by adding efficient caching to your Next.js application.
Introducing Valkey Caching for Next.js
The best way to mitigate the blocking nature of SSR is to cache as much data as possible.
A highly effective caching solution is Valkey, an in-memory data store that delivers excellent performance. It's the perfect companion for a Node.js application. The Node.js driver supports a technique called autopipelining, which automatically batches all operations during an iteration of the event loop into a single network request—without requiring any changes to your code.
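To make the idea concrete, here is a toy model of autopipelining — not the real driver, just an illustration of the batching it performs for you. Commands issued during one iteration of the event loop are queued and flushed as a single batch (the real client does this over the network when the option is enabled; the wire format here is made up):

```javascript
// Toy model of autopipelining: commands issued in the same tick are
// queued and flushed together as one "network request".
class AutoPipeliningClient {
  constructor() {
    this.queue = [];
    this.flushes = 0;
  }

  // Queue a command; schedule a single flush per event-loop iteration.
  send(command) {
    return new Promise((resolve) => {
      if (this.queue.length === 0) {
        queueMicrotask(() => this.flush());
      }
      this.queue.push({ command, resolve });
    });
  }

  // One batch carrying every queued command.
  flush() {
    this.flushes++;
    const batch = this.queue.splice(0);
    for (const { command, resolve } of batch) {
      resolve(`OK:${command}`); // fake reply
    }
  }
}

const client = new AutoPipeliningClient();

// Three commands issued in the same tick...
Promise.all([
  client.send('GET page:/'),
  client.send('GET page:/about'),
  client.send('GET page:/blog'),
]).then((replies) => {
  // ...travel in a single batch.
  console.log(replies.length, 'replies,', client.flushes, 'flush');
});
```

Three round trips collapse into one, which is exactly why the technique pairs so well with a cache that is hit on every request.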
Next.js natively supports custom cache handlers for applications. By implementing that interface, you can use any storage system you prefer to manage your server's cache.
Usually, you would need to handle this complexity yourself, implementing the cache handler, managing key uniqueness, and ensuring data is shared across instances.
However, when using Watt and the @platformatic/next service, all of this is handled for you. All you need to do is update your watt.json file to include the new cache settings:
{
  // ...
  "cache": {
    "adapter": "valkey",
    "url": "valkey://redis.example.com:6379"
  }
}
That’s it!
Watt will take care of all the complexities of caching, employing best practices for Valkey to ensure maximum performance.
The Problem with Shared Caching
In-memory or file-system-based caching solutions work well—until you start scaling your services horizontally.
Even if your instances are replicas of the same codebase, their runtime states are independent. Each instance maintains its own local cache, meaning users may see different results depending on which instance handles their request.
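A toy model of the failure (a hypothetical `render` function that produces a different value each time, e.g. a timestamped page; each `Map` stands for one replica's private cache):

```javascript
// Each replica keeps its own private cache, so the same URL can be
// cached with different rendered values on different instances.
let counter = 0;
const render = () => `rendered #${++counter}`; // e.g. a timestamped page

function makeReplica() {
  const localCache = new Map(); // per-instance state
  return function handle(url) {
    if (!localCache.has(url)) {
      localCache.set(url, render()); // each replica renders independently
    }
    return localCache.get(url);
  };
}

const replicaA = makeReplica();
const replicaB = makeReplica();

// Round-robin between replicas, as a load balancer would:
console.log(replicaA('/')); // "rendered #1"
console.log(replicaB('/')); // "rendered #2"  <- different result, same URL
```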
You can easily demonstrate this issue by creating a Next.js application inside Watt and then building a Docker image:
FROM node:22-alpine
ENV APP_HOME=/home/app/node/
WORKDIR $APP_HOME
COPY ./ ./
RUN npm install && npm run build
EXPOSE 3042
CMD [ "npm", "start" ]
Then, create a custom nginx.conf file like the following one:
upstream watt {
  server service-1:3042;
  server service-2:3042;
}

server {
  listen 3042;

  location = / {
    proxy_pass http://watt;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $remote_addr;
  }

  location / {
    proxy_pass http://service-3:3042;
    proxy_set_header Host $host;
    proxy_set_header X-Forwarded-For $remote_addr;
  }
}
This configuration will use three replicas of the Watt application. The first two are served in a round-robin fashion to render the main page. The third one is used to render assets (so that the main page round-robin policy is not affected).
Finally, create a custom nginx image:
FROM nginx
COPY nginx.conf /etc/nginx/conf.d/default.conf
You can then include your application in a docker-compose.yml file structured like this:
version: "3.3"
services:
  nginx:
    image: "custom-nginx"
    ports:
      - "3042:3042"
    depends_on:
      - service-1
      - service-2
      - service-3
  service-1:
    image: "custom-next"
    ports:
      - "3043:3042"
  service-2:
    image: "custom-next"
    ports:
      - "3044:3042"
  service-3:
    image: "custom-next"
    ports:
      - "3045:3042"
When you start Docker Compose and open the page in your browser, you'll notice that clicking the button shows two alternating results: each replica independently generates and caches its own value.
The Solution: Shared Caching
The only way to resolve this issue is to implement a shared cache. While a file-system-based cache is a simple solution, it only works for local replicas and is unsuitable for distributed systems.
A more robust solution is to use our new Valkey-based Next.js cache.
Valkey works seamlessly with both local and remote replicas, ensuring your cache is consistently shared across all instances.
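A toy sketch of why this works (a single shared store standing in for Valkey, and a hypothetical `render` function that would otherwise produce a different value per call):

```javascript
// One shared store instead of per-replica caches: whichever replica
// renders first, every other replica reuses the result.
let counter = 0;
const render = () => `rendered #${++counter}`;

const sharedCache = new Map(); // stand-in for the Valkey instance

function makeReplica(cache) {
  return function handle(url) {
    if (!cache.has(url)) {
      cache.set(url, render()); // only the first replica renders
    }
    return cache.get(url);
  };
}

const replicaA = makeReplica(sharedCache);
const replicaB = makeReplica(sharedCache);

console.log(replicaA('/')); // "rendered #1"
console.log(replicaB('/')); // "rendered #1"  <- same result on every replica
```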
To implement this solution, simply add the following lines to the watt.json file of the Next.js application.
Yes, it’s that simple!
"cache": {
"adapter": "valkey",
"url": "valkey://valkey:6379"
}
Also, add a new service to the Docker compose file:
  valkey:
    image: "valkey/valkey"
    ports:
      - "3041:6379"
Now, re-run the example.
You'll notice that only a single result is displayed, as the data is now stored in a globally shared cache.
Wrapping Up
With the introduction of Valkey-based caching for Next.js, we are pushing application performance to new limits.
The combination of Node.js and Valkey, with the power of autopipelining, results in remarkable performance improvements, significantly reducing the impact of SSR on Next.js application performance.
In Watt, achieving this requires only a simple configuration file change.
Give it a try!
PS. Check out our client-side caching solution that we recently launched.
Written by Paolo Insogna
Node.js TSC Member, Principal Engineer at Platformatic, Polyglot Developer. RPG and LARP addict and nerd about lots more. Surrounded by lovely chubby cats.