Bringing HTTP Caching to Node.js

flakey5

When it comes to building performant software that communicates effectively with other services, caching is essential. While HTTP caching is a common practice, Node.js has historically lacked a built-in, standards-compliant solution for client-side HTTP caching—until now.

With Undici v7.0.0, we’re excited to introduce client-side HTTP caching, designed to align with the RFC-9111 specification.

This feature empowers developers to seamlessly integrate caching into their HTTP workflows, making your applications faster and more efficient out of the box.

What is Undici?

Undici is an HTTP/1.1 client built into Node.js that powers its fetch() implementation. It’s written purely in JavaScript, specifically for the Node.js runtime.

What is caching?

Many responses to HTTP requests can be cached for some period of time. This means the response can be stored for future reuse by requests to the same resource.

Caching makes software more efficient. Serving a stored response is usually faster than making a fresh request to the origin, saving time, energy, and resources that can be spent on other things.

Caching can exist at multiple layers throughout the HTTP stack. It is most commonly present in browsers and CDN providers such as Cloudflare or Fastly. There are different types of caches as well, with a shared cache allowing for response reuse by multiple different parties (i.e. CDNs) and a private cache being meant to only serve an individual (like in your browser).

Caching can also distribute the load for the origin server. If a response is cached then later reused, the origin server only handles the original request while that response is still fresh.

How do I use it?

As of the time of this blog's publication, Undici's caching system is opt-in. You can enable it by using its interceptor:

import { getGlobalDispatcher, setGlobalDispatcher, interceptors, request } from 'undici'

setGlobalDispatcher(getGlobalDispatcher().compose(
  interceptors.cache(/* optional object for configuring */))
)

await request('http://localhost:3000')

// Native fetch() support
await fetch('http://localhost:3000')

Options

The cache interceptor can also be configured with the following options:

  • store - optional, the underlying cache store.

  • methods - optional, the safe HTTP methods to cache

  • cacheByDefault - optional, the amount of time in seconds to store responses without explicit expirations

  • type - optional, the type of cache
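As a sketch, an options object passed to the interceptor might look like the following. The values here are illustrative, not defaults, and the specific values shown for type are an assumption based on standard shared/private cache semantics:

```javascript
// Illustrative options object for the cache interceptor; the values
// below are examples, not Undici's defaults.
const cacheOptions = {
  // methods: the safe HTTP methods whose responses may be cached
  methods: ['GET', 'HEAD'],
  // cacheByDefault: store responses that lack an explicit expiration
  // for this many seconds
  cacheByDefault: 60,
  // type: shared-cache vs. private-cache semantics (assumed values)
  type: 'private'
}
```

This object would be passed as `interceptors.cache(cacheOptions)` in the snippet above.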

How does it work?

Undici fundamentally revolves around the concept of Dispatchers. A Dispatcher is what sends a request to a server and returns the response back to the client.

Undici allows for a plugin-esque design to hook into a dispatcher with an interceptor. When an interceptor is attached to a dispatcher, it acts essentially like middleware for each request made with that dispatcher. An interceptor can also attach a handler in order to get the response of a request before it is returned to the user.

For example, an interceptor could simply log that a request was made and then pass it along to the next interceptor to be dispatched further. Alternatively, instead of passing the request on, an interceptor could respond then and there. That’s what Undici’s cache implementation does.
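The logging example can be sketched like this. The function-wrapping shape matches how Undici composes interceptors (a function that receives the next dispatch function and returns a replacement); the stub dispatch stands in for a real network call so the example is self-contained:

```javascript
// A minimal logging interceptor sketch: wrap the next dispatch
// function and log each request before passing it along.
function loggingInterceptor (dispatch) {
  return function loggedDispatch (opts, handler) {
    console.log(`dispatching ${opts.method} ${opts.origin}${opts.path}`)
    return dispatch(opts, handler)
  }
}

// Demonstrated with a stub dispatch instead of a real network call
const seen = []
const stubDispatch = (opts, handler) => { seen.push(opts.path); return true }
const wrapped = loggingInterceptor(stubDispatch)
wrapped({ method: 'GET', origin: 'http://localhost:3000', path: '/foo' }, {})
```

A cache interceptor works the same way, except that on a hit it never calls `dispatch` at all and instead feeds the stored response to the handler.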

Undici has a cache interceptor that hooks into every request that is made to its dispatcher. When a request is made, the interceptor reaches out to its cache store to see if that request is cached or not. If it is, it will reuse that response. If it isn’t, it dispatches the request further and then tries to cache the response with a handler.

When determining whether a response can be cached, Undici’s implementation by default requires a directive stating when the response expires. This can be the s-maxage or max-age directive in the response’s Cache-Control header, or the Expires header. However, there is an option to cache responses by default regardless of whether there’s an explicit expiration time (see Options). Undici also requires that the origin property be defined on each request.
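A hypothetical sketch of that default check might look like this; the function name and header-parsing details are illustrative, not Undici’s internals:

```javascript
// Hypothetical sketch: find a response's explicit expiration in
// seconds, checking s-maxage, then max-age, then Expires.
function explicitExpiry (headers) {
  const cc = headers['cache-control'] || ''
  const sMaxAge = cc.match(/s-maxage=(\d+)/)
  if (sMaxAge) return Number(sMaxAge[1])
  const maxAge = cc.match(/max-age=(\d+)/)
  if (maxAge) return Number(maxAge[1])
  if (headers['expires']) {
    return (new Date(headers['expires']).getTime() - Date.now()) / 1000
  }
  // No explicit expiration: by default, not cacheable
  return undefined
}
```

With the default settings, a response for which this returns `undefined` would simply not be stored.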

Undici determines whether a request can be satisfied by a cached response based on a few factors. The first is the cache key for the response, which consists of the request’s origin, path, and method. For instance, the cache key for a GET request to https://example.com/foo would have example.com as the origin, /foo as the path, and GET as the method. Undici also takes the response’s Vary header into account if it is present.
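The example above can be sketched as a small helper; this is a hypothetical illustration of the key’s shape, not Undici’s actual implementation:

```javascript
// Hypothetical sketch of deriving a cache key from request options
function makeCacheKey (opts) {
  const { host } = new URL(opts.origin)
  return { origin: host, path: opts.path, method: opts.method }
}

const key = makeCacheKey({
  origin: 'https://example.com',
  path: '/foo',
  method: 'GET'
})
```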

Undici’s cache implementation also understands numerous caching directives on both the request and response side, from basic ones such as only-if-cached to more complex ones such as stale-while-revalidate and stale-if-error.
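As one concrete example, the stale-if-error rule says a response that has gone stale may still be served when the origin errors, as long as the stale-if-error window hasn’t elapsed. A hypothetical sketch of that check (the function and its parameters are illustrative, not Undici’s internals):

```javascript
// Hypothetical sketch of the stale-if-error rule: a stale response
// remains usable after an origin error until staleAtMs plus the
// stale-if-error window (given in seconds) has passed.
function usableOnError (cacheControl, staleAtMs, nowMs) {
  const m = (cacheControl || '').match(/stale-if-error=(\d+)/)
  if (!m) return false
  return nowMs <= staleAtMs + Number(m[1]) * 1000
}
```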

Cache Stores

A cache store is the underlying storage interface for the responses. It is responsible for storing, retrieving, and ultimately deciding what response (if any) is used as the cached response. A cache store is expected to comply with RFC-9111.
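To illustrate those responsibilities (store, retrieve, and decide what can be reused), here is a toy, Map-based store. This does not implement Undici’s actual cache-store interface, which is documented separately; it only shows the idea:

```javascript
// Toy cache store: keeps responses in a Map alongside an expiration
// time, and refuses to return entries that have gone stale.
class ToyStore {
  constructor () {
    this.entries = new Map()
  }

  set (key, response, ttlSeconds) {
    this.entries.set(key, {
      response,
      expiresAt: Date.now() + ttlSeconds * 1000
    })
  }

  get (key) {
    const entry = this.entries.get(key)
    if (!entry) return undefined
    if (Date.now() > entry.expiresAt) {
      // Expired: evict and report a miss
      this.entries.delete(key)
      return undefined
    }
    return entry.response
  }
}
```

A real store must additionally honor RFC 9111 semantics (Vary matching, revalidation, and so on), which is exactly what Undici’s built-in stores provide.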

Undici has two built-in cache stores: in-memory and SQLite, with memory being the default. They are both publicly exposed and accessible under the cacheStores property exposed from Undici:

import { request, setGlobalDispatcher, getGlobalDispatcher, interceptors, cacheStores } from 'undici'

// you will need to run this file with --experimental-sqlite or use the memory store
setGlobalDispatcher(getGlobalDispatcher().compose(
  interceptors.cache({
    // or new cacheStores.MemoryCacheStore()
    store: new cacheStores.SqliteCacheStore({
      location: './cache.db'
    })
  })
))

const res = await request('http://localhost:3000/')
console.log(res.statusCode)
console.log(await res.body.text())

This is a basic Node.js server to test this behavior:

import { createServer } from 'node:http'

let count = 0
const server = createServer((req, res) => {
  console.log('request', req.url)
  res.setHeader('Cache-Control', 'public, max-age=60')
  res.end('hello world ' + count++)
})

server.listen(3000)

The in-memory cache store is the default and works just as expected: it’s a simple interface that stores responses in memory. Its documentation can be found here.

The SQLite cache store makes use of Node.js’ experimental SQLite API (enabled with the --experimental-sqlite flag). It stores responses in a SQLite database that can be either in-memory or file-based, meaning it can be used to share cached responses with other Node.js processes. Its documentation can be found here.

Wrapping up: a note from Platformatic

This feature represents a milestone in bringing standards-compliant HTTP caching to the Node.js ecosystem, made possible through contributions from the community and Undici’s maintainers. Node.js developers now have a powerful, standards-aligned tool to build faster, more resilient applications. Whether you’re optimizing API performance or sharing cached data across processes, this feature sets a new benchmark for client-side caching in Node.js.

“We thought client-side HTTP caching made no sense without a Node.js-compatible solution. That’s why we sponsored Aaron Snell to lead this effort; it’s thanks to his hard work that this is now up and running,” said Matteo Collina, co-founder & CTO at Platformatic.

We’d also like to extend a special thanks to Robert Nagy and nxtedition for reviewing this release and for contributing to the initial implementation, as well as the many other maintainers of Undici that helped with their inputs and reviews.

Start exploring this release in more detail here.

To learn more about Platformatic’s commitment to open source initiatives, check out our commitment to the OSS Pledge.
