Undici v7 is here


Undici, the modern HTTP client library for Node.js, was originally released in 2018 and has seen remarkable growth in 2024. Downloads skyrocketed from over 189 million in 2023 to a staggering >437 million (January–November 2024), highlighting its critical role in the Node.js ecosystem.
Today, the Node.js Undici Working Group is pleased to announce the release of Undici v7.
This release introduces stricter compliance with the fetch() specification, WebSocketStream, a groundbreaking caching implementation, and customizable interceptors to supercharge your HTTP workflows.
The team has also made several optimizations and improvements to align with upcoming Node.js releases.
In this blog, we will explore what makes this version a must-have upgrade.
Stricter fetch() spec compliance
Back when we first considered adding fetch() to Node.js, we needed to provide a migration path for users of the various polyfills that were on the npm registry. Therefore, we added support for third-party Blob, FormData, and AbortController implementations. This support was tested but never really documented. However, it caused quite a few maintenance headaches, as those polyfills do not follow the spec.
In Undici v7, we decided to remove that support entirely from our fetch() implementation. Therefore, only the provided classes will work with it.
As an example, the following will not work anymore:
import { fetch } from 'undici'
import FormData from 'form-data'

const body = new FormData()
body.append('file', new Blob(['hello world'], { type: 'text/plain' }), 'hello.txt')

await fetch('http://localhost:3000/upload', {
  method: 'POST',
  body
})
Instead, do the following:
import { fetch, FormData } from 'undici'

const body = new FormData()
body.append('file', new Blob(['hello world'], { type: 'text/plain' }), 'hello.txt')

await fetch('http://localhost:3000/upload', {
  method: 'POST',
  body
})
The same logic also applies to Request, Response, Headers, etc. This will apply to Node.js from v24 onwards.
WebSocketStream
After shipping a standard-compliant WebSocket implementation in Node.js v22, @KhafraDev has added WebSocketStream in https://github.com/nodejs/undici/pull/3560.
Here is an example:
import { WebSocketStream } from 'undici'

const ws = new WebSocketStream('ws://localhost:3000/')
const { readable, writable } = await ws.opened
const writer = writable.getWriter()

writer.write('hello world')

for await (const value of readable) {
  const parsed = new TextDecoder().decode(value)
  console.log('received', parsed)
  if (parsed === 'end') {
    writer.close()
  }
}
Note that this implementation is experimental.
Composing Interceptors
Starting from Undici v6.20, we have added a new API to Undici to completely customize the lifecycle of requests, enabling maximum flexibility: compose(). It allows you to chain multiple interceptors together, making it incredibly easy to customize request handling.
Here is an example of how to use the interceptors:
import {
  Agent,
  interceptors,
  setGlobalDispatcher,
  getGlobalDispatcher
} from 'undici'

setGlobalDispatcher(getGlobalDispatcher().compose(
  interceptors.redirect({ maxRedirections: 3, throwOnMaxRedirects: true }),
  interceptors.retry({
    maxRetries: 3,
    minTimeout: 1000,
    maxTimeout: 10000,
    timeoutFactor: 2,
    retryAfter: true,
  })
))
// Works with native fetch without modifications.
await fetch(...)
Here are the pre-built interceptors:
redirect, to automatically follow HTTP 3xx responses and their Location headers.
retry, to automatically retry failed requests
dump, to dump the body of all requests
dns, to cache the DNS lookup for an origin
cache, to implement RFC-9111 compatible cache (see next section)
You can find the documentation of all the interceptors here.
Caching
Undici implements caching as described in RFC-9111 and as a shared cache.
The cache interceptor hooks into every request made through the client it is applied to. It is essentially middleware between the call to request() or fetch() and the code that sends the request to the service.
For each request, the interceptor checks its cache store to see if there is a cached response it can reply with. If there is one, the interceptor responds with it.
If there isn't, the interceptor sends the request further down the chain of interceptors, where it will eventually be sent out to the server. The interceptor attaches a handler to the response and determines whether it can be cached. The minimum criterion for a response to be cached is that the cache-control header exists and contains the public directive.
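In practice, this means the origin server must opt in. A minimal sketch of a server whose responses qualify for the shared cache might look like this (port and max-age are illustrative):

```javascript
import { createServer } from 'node:http'

const server = createServer((req, res) => {
  // The public directive is the minimum needed for the cache interceptor
  // to consider this response cacheable; max-age bounds its freshness.
  res.setHeader('cache-control', 'public, max-age=60')
  res.end('hello world')
})

server.listen(3000)
```

Responses that do not carry cache-control: public simply pass through the interceptor uncached.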
import { setGlobalDispatcher, getGlobalDispatcher, interceptors, request } from 'undici'

setGlobalDispatcher(getGlobalDispatcher().compose(
  interceptors.cache(/* optional object for configuring */)
))

await request('http://localhost:3000')

// Native fetch() support
await fetch('http://localhost:3000')
We are also adding a SQLite cache store, available when Node.js' experimental SQLite API is present. It stores responses in a SQLite database that can be either in-memory or file-based, meaning it can be used to share cached responses with other Node.js processes.
import { request, setGlobalDispatcher, getGlobalDispatcher, interceptors, cacheStores } from 'undici'

// you will need to run this file with --experimental-sqlite
setGlobalDispatcher(getGlobalDispatcher().compose(
  interceptors.cache({
    store: new cacheStores.SqliteCacheStore({
      location: './cache.db'
    })
  })
))

const res = await request('http://localhost:3000/')
console.log(res.statusCode)
console.log(await res.body.text())
When composing, do not mix and match Undici versions
Node.js core ships with its own version of Undici, which is older and belongs to the previous release lines (v6 or v5, depending on the Node.js version).
If you want to use the compose() feature, you must install a fresh undici.Agent; otherwise, the newer Undici v7 interceptors will crash.
'use strict'

// Do not initialize fetch() before requiring/importing undici
// Uncommenting the following line will break your code
// fetch('https://example.com').catch(console.error)

const undici = require('undici')

undici.setGlobalDispatcher(undici.getGlobalDispatcher().compose((dispatch) => {
  return (opts, handler) => {
    if (!handler.onRequestStart) {
      throw new Error('Handler must implement onRequestStart')
    }
    return dispatch(opts, handler)
  }
}))

undici.request('https://example.com').then((res) => {
  console.log(res.statusCode)
  return res.body.dump()
}).catch(console.error)
(This example uses CommonJS because it is not possible to replicate this using ESM.)
Long Term Support
Given that Undici is embedded in Node.js for fetch(), we closely follow the Node.js release schedule.
Currently, Node.js v18 ships with Undici v5, which means v5 will go out of LTS in April 2025.
Note that Undici v6 had no visible breaking changes, so Node v18 could technically be upgraded, too.
Node.js v20 and v22 ship with Undici v6, so we plan to support Undici v6 until the 30th of April 2027.
As far as Undici v7 goes, we plan to embed it in Node.js v24.
We'll likely ship new releases of Undici in the coming years, which may or may not be compatible with what is exposed in Node.js; as a result, it's hard to predict how long Undici v7 will be supported for.
To recap:
Undici v5 will go EOL on the 30th of April 2025
Undici v6 will go EOL on the 30th of April 2027
Undici v7 does not have a planned EOL yet
Other Relevant Changes
Assume blocking unless HEAD
It’s no secret that undici supports HTTP pipelining, allowing multiple requests to be sent before the output of the first is received.
We are changing this default behavior and turning on the blocking option by default: this way, Undici will wait to receive the headers of the response before sending the next request.
Upgrade to llhttp v9
The new version of the HTTP parser switches to an always strict parsing logic.
If you only connect to HTTP-specification-compliant servers, this change will not impact you. If strict parsing is an important issue for you, we recommend checking this out and contributing back.
Removed throwOnError option
undici.request() used to have a throwOnError option to automatically throw an exception for status codes in the 4xx-5xx range.
However, this has now been replaced by an interceptor, so there is no need for this specific option anymore.
In order to implement the same functionality, you can now use an interceptor:
import { createServer } from 'node:http'
import { once } from 'node:events'
import { interceptors, getGlobalDispatcher, request } from 'undici'

const server = createServer((req, res) => {
  console.log('request', req.url)
  res.statusCode = 404
  res.end('hello world')
})

server.listen(3000)
await once(server, 'listening')

const dispatcher = getGlobalDispatcher().compose(
  interceptors.responseError()
)
With this dispatcher in place, await request('http://localhost:3000/', { dispatcher }) will throw, because the server responds with a 404.
Benchmarks
These benchmarks were taken on Node.js v22.11.0 on dedicated hardware, using 50 TCP connections with a HTTP/1.1 pipelining factor of 10.
Remember to benchmark your use case, because your mileage may vary.
Call for contributors
Undici is a community project! Without all the contributors who have worked incredibly hard to ship this version in the last few months, this release wouldn't be possible. We’d like to thank the fantastic people in the Undici Working Group in Node.js:
@mcollina (myself)
We need you as well! There are plenty of “good first issues” in the project you can start helping with.
A note from Platformatic
The Open Source ecosystem thrives on the tireless efforts of its maintainers. To ensure its continued viability, we must all contribute to supporting the invaluable work they do.
This is why we have signed up to the Open Source Pledge, and why we have supported this exciting Undici release.
For contributors out there: thank you for your tireless and often thankless work.
For companies out there who are interested in supporting the work of maintainers, check out how you can get involved with the Open Source Pledge.