Solving a memory leak in a Node.js server while handling streams
Hey everyone! I recently solved a problem that had been bugging me for quite some time, one I kept ignoring in favour of quick fixes.
Background story: I have a Node.js backend server running for my app Downify that streams music into the app for downloading. Once a track is requested, the server streams it, and the app's downloader saves the MP3 to the user's device. A simple API, a no-brainer. The problem: every few days, the server's memory and CPU usage would climb to 100% and the server would stop responding.
Temporary fix:
Initially, I did not pay much attention and simply rebooted the server whenever this happened, without digging into the issue. I knew this was a short-term fix and not sustainable or efficient in the long run.
Digging deeper into the issue:
This is the API handler: it takes a YouTube video ID as a query param and streams back the audio of that video.
Can you guess what was the issue here?
export const directStream = async (req, res) => {
  if (!req.query.videoId) {
    res.status(500).send({ err: "query param videoId required" });
    return;
  }
  const { videoId } = req.query;
  const stream = ytdl(videoId, {
    filter: "audioonly",
    quality: "highestaudio",
  });
  stream.on("info", (info, format) => {
    res.setHeader("Content-Type", "audio/webm");
    res.setHeader("Accept-Ranges", "bytes");
    res.setHeader("Content-Length", format.contentLength);
    stream.pipe(res);
  });
  stream.on("error", (error) => {
    console.error("Error message: ", error.message + " | " + getDateTime());
    res.status(500).send({ err: error.message });
    return;
  });
};
I am using the ytdl library to fetch the track and stream the response. If ytdl raises an error, I respond with a 500 status code and the error message. Seems okay? But there is a huge blunder here.
To make things worse:
I recently added a health check endpoint that checks whether my server has been IP-banned by YouTube.
export const checkHealthV2 = async (req, res) => {
  try {
    const { videoId } = req.query;
    const stream = ytdl(videoId, {
      filter: "audioonly",
      quality: "highestaudio",
    });
    stream.on("info", (info, format) => {
      if (info.videoDetails.videoId !== videoId) {
        return res.status(500).send({
          success: false,
          msg: "Response videoId not same as request",
          data: info.videoDetails,
          currentTime: getDateTime(),
        });
      } else {
        return res.status(200).json({
          success: true,
          msg: "Working",
          data: info.videoDetails.title,
          currentTime: getDateTime(),
        });
      }
    });
    stream.on("error", (error) => {
      console.error("Error message: ", error.message + " | " + getDateTime());
      return res.status(500).send({ err: error.message });
    });
  } catch (error) {
    // Handled error here
  }
};
I was hitting this API through BetterStack monitoring every 3 minutes. After I deployed it, the server was hitting 100% usage every 2-3 hours. Fed up with restarts, I decided to solve the problem once and for all.
Identifying the leak:
I went through tons of Stack Overflow answers and GPT discussions but couldn't find the exact issue I was having. Then I went through the Node.js documentation for streams.
That's when I realised the issue was with how I was handling the streams. In my API, I was not closing the streams when an error occurred or when the client dropped the connection abruptly. The streams were left open, which led to memory leaks and high CPU usage.
// HERE IS THE 🐞
stream.on("error", (error) => {
  console.error("Error message: ", error.message + " | " + getDateTime());
  return res.status(500).send({ err: error.message });
});
The unfinished streams held on to memory and CPU until the server stopped responding. Restarting the instance cleared it, but the problem soon came back.
The Fix:
After identifying the problem, I started looking for ways to properly handle and close the streams. I came across an important fact: Node.js does not automatically close a stream when an error occurs or when the client closes the connection abruptly.
This is when I started using the stream.destroy() method, which forces the stream to close and releases all the resources it was holding.
Here is how I modified my API handler to properly close the streams:
- Streaming API:-
export const directStream = async (req, res) => {
  ...
  stream.on("error", (error) => {
    console.error("Error message: ", error.message + " | " + getDateTime());
    res.status(500).send({ err: error.message });
    stream.destroy(); // close the stream
    return;
  });
  // listen to the 'close' event to make sure the stream is destroyed
  // when the client drops the connection
  req.on("close", () => {
    stream.destroy();
  });
};
- Health Check API:-
export const checkHealthV2 = async (req, res) => {
  try {
    let isSuccess = false;
    let videoInfo = null;
    // ...existing code
    stream.on("info", (info, format) => {
      videoInfo = info;
      if (info.videoDetails.videoId !== videoId) {
        stream.emit("end");
      } else {
        isSuccess = true;
        stream.emit("end");
      }
    });
    stream.on("error", (error) => {
      stream.destroy();
      res.status(500).send({ err: error.message });
    });
    stream.on("end", () => {
      if (isSuccess) {
        // Handle success case
        res.status(200).json({
          success: true,
          msg: "Working",
          data: videoInfo.videoDetails.title,
          currentTime: getDateTime(),
        });
      } else {
        // Send error response
        res.status(500).send({
          success: false,
          msg: "Response videoId not same as request",
          currentTime: getDateTime(),
        });
      }
      stream.destroy(); // closes the stream
    });
  } catch (error) {
    // ...handle unexpected error
  }
};
This resolved the memory leaks and the high CPU usage. My server stopped crashing and has been running smoothly since.
Lessons Learned:
Handling streams in Node.js can be tricky, and if done incorrectly it can lead to serious issues like memory leaks and high CPU usage. Always make sure to close your streams when an error occurs or when the client drops the connection abruptly.
Also, it is always better to dig into the root cause of a problem rather than just applying quick fixes. This not only fixes the problem for good but also helps in understanding how things work under the hood and improves our coding skills.
I hope my experience and solution help someone facing a similar issue. I will try to write up more such bugs and use cases.
Thanks for sticking around.
Samadrit