Mastering Node.js Streams: Efficient Data Handling


This article will help you understand streams and how to work with them. So don't be afraid: we can figure this out!
1. What are Streams?
Streams are one of the fundamental concepts that power Node.js applications. A stream is a way of handling continuous data: think of it like water flowing through a pipe. Instead of handling the entire flow of water at once, the pipe delivers it gradually, and Node.js processes it in chunks. This is important for performance, because you can start working with the data as soon as you receive the first chunk, without waiting for the entire set. It also makes streams really powerful when working with large amounts of data: a file can be larger than your free memory, making it impossible to read the whole file into memory in order to process it. That's where streams come to the rescue!
2. Real-world examples
Let's take "streaming" services such as YouTube or Netflix as an example: these services don't make you download the video and audio all at once. Instead, your browser receives the video as a continuous flow of chunks, letting you start watching and listening almost immediately.
3. Why streams?
Streams provide two major advantages over other data-handling methods:
Memory efficiency: you don't need to load large amounts of data into memory before you can process it.
Time efficiency: you can start processing data as soon as the first chunk arrives, rather than waiting for the entire payload to be transmitted. The sketch below contrasts the two approaches.
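To see the difference in practice, here is a minimal sketch (large.txt is a placeholder file name) contrasting the buffered fs.readFile with a stream:

import fs from "node:fs";

// Buffered approach: fs.readFile holds the entire file in memory
// before the callback ever runs.
fs.readFile("large.txt", "utf8", (err, data) => {
  if (err) throw err;
  console.log(`Loaded ${data.length} characters in one go`);
});

// Streaming approach: chunks arrive as they are read, so memory use
// stays flat and processing starts with the very first chunk.
let chars = 0;
fs.createReadStream("large.txt", "utf8")
  .on("data", (chunk) => { chars += chunk.length; })
  .on("end", () => console.log(`Processed ${chars} characters chunk by chunk`));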
4. Types of Streams
There are four types of streams in Node.js:
Writable: streams to which we can write data. For example, fs.createWriteStream() lets us write data to a file.
Readable: streams from which data can be read. For example, fs.createReadStream() lets us read the contents of a file.
Duplex: streams that are both Readable and Writable. For example, net.Socket.
Transform: streams that can modify or transform the data as it is written and read. For example, in file compression you can write compressed data to a file and read decompressed data back from it. A sketch of a custom Transform stream follows this list.
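To make the Transform type concrete, here is a minimal sketch of a custom transform that uppercases whatever flows through it (the upperCase name and the stdin/stdout wiring are just for illustration):

import { Transform } from "node:stream";

// A Transform stream receives chunks on its writable side, modifies
// them, and pushes the result out of its readable side.
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  },
});

// Pipe stdin through the transform to stdout:
//   echo "hello streams" | node transform.js
process.stdin.pipe(upperCase).pipe(process.stdout);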
5. Practical examples
Here's a simple example of reading a file using a Readable stream:
import fs from "node:fs";

const stream = fs.createReadStream("large.txt", "utf8");

// Each "data" event delivers the next chunk of the file
stream.on("data", (data) => {
  console.log("Received a chunk of data:", data);
});

// "end" fires once the whole file has been read
stream.on("end", () => {
  console.log("Finished reading file");
});
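Since Readable streams are also async iterables in modern Node.js, the same file can be consumed with a for await...of loop. A sketch:

import fs from "node:fs";

const stream = fs.createReadStream("large.txt", "utf8");

// Readable streams implement Symbol.asyncIterator, so each chunk can be
// consumed with an ordinary loop (top-level await works in ES modules).
for await (const chunk of stream) {
  console.log("Received a chunk of data:", chunk);
}
console.log("Finished reading file");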
Here's a simple example of writing to a file using a Writable stream:
import fs from "node:fs";
const stream = fs.createWriteStream("output.txt");
// Write some data to the file
stream.write("Hello\n");
stream.write("Node.js");
// Mark the end of the writable stream
stream.end();
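Readable, Transform, and Writable streams compose with pipe(), which moves data between them and handles backpressure for you. As a sketch (the file names are placeholders), here is the file-compression idea from earlier using Node's built-in zlib module:

import fs from "node:fs";
import zlib from "node:zlib";

// Read large.txt, gzip each chunk as it passes through,
// and write the compressed output to large.txt.gz.
fs.createReadStream("large.txt")
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream("large.txt.gz"));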
6. Key Events in Streams
Streams are event-driven, which means they emit certain events when specific actions take place. Some key events you’ll work with include:
data: Emitted when a chunk of data is available.
end: Emitted when no more data is available.
error: Emitted when there’s an error with the stream.
finish: Emitted by Writable streams when all the data has been written. The sketch below wires these events together.
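Here is a minimal sketch handling all four events (large.txt and copy.txt are placeholder file names):

import fs from "node:fs";

const readable = fs.createReadStream("large.txt", "utf8");
const writable = fs.createWriteStream("copy.txt");

readable.on("data", (chunk) => console.log(`data: received ${chunk.length} characters`));
readable.on("end", () => console.log("end: no more data to read"));
readable.on("error", (err) => console.error("read error:", err.message));

writable.on("finish", () => console.log("finish: all data has been written"));
writable.on("error", (err) => console.error("write error:", err.message));

// pipe() moves the data and calls writable.end() when the readable ends.
readable.pipe(writable);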
Acknowledgements
Special thanks to Matteo Collina.
Written by
Shivam Sharma
I'm an open-source enthusiast in the final year of my Bachelor's in Computer Science. I'm passionate about Kubernetes, web development, and DevOps, and I enjoy learning new things.