Adaptive Video Bitrate Streaming with FFmpeg


Have you ever wondered how, despite how terrible your bandwidth may be, you've somehow managed to stream a YouTube video or sit through an entire Netflix movie without it hanging? True, it gets a bit blurry occasionally, but somehow, without your assistance, your streams seem to have a mind of their own, sharpening up in some scenes and becoming barely visible in others. The reason for this wonderful feat is something called Adaptive Bitrate Streaming, or ABS for short.
I was given a fascinating task at work to implement this concept in a project and truly began to appreciate the backend that powers everyday streaming platforms like YouTube and Netflix. Hopefully you will too as we dive head-first into the wonderful world of bitrate self-adjustment in video streaming using just a bash script with FFmpeg commands, HTML and an optional wrapper (in this example, TypeScript). Feel free to expand on what we do here.
What is Adaptive Bitrate Streaming?
Adaptive Bitrate Streaming (ABS) is concerned with encoding multiple renditions of the same video at varying bitrates and resolutions. The client (video player) dynamically switches streams depending on the strength of your network, your bandwidth and your device type. A good analogy for this is ordering food at a drive-through. Picture having a perfectly working car that gives you no problems. You can park it and it stays, drive and it moves. With a car this good, you relax and order all the food you need, because you know you can quickly drive off and keep the line moving. Now imagine your car is, well, not great. It sputters continuously and you dread bringing it to a halt because it may never start again. When you go to the drive-through, you're not waiting five extra seconds after you park, so you quickly order your food and get out of there before you unintentionally hold up the line.
ABS allows for different video renditions (a lot of food or a little food) depending on the bandwidth or device type (a great car or one that tries). You may notice the quality of your video change, but the video continues to play nonetheless, so you don't give up on it and go touch some grass in frustration instead; no one wants that.
About FFmpeg
FFmpeg could be considered the embodiment of a jack of all trades in matters of video and audio handling. It is a multimedia framework, which means it is a collection of libraries and tools for handling video, audio and other multimedia streams. It encodes, decodes, transcodes, streams, filters and plays just about anything. We'll be using it in our project to encode video into different formats and bitrates and to generate manifests: files that show what streams are available, where the media chunks are, and the order in which they are played. Think of a manifest as a playlist. After you save up chunks (smaller segments of your video), a manifest lists them all. A master manifest lists the available quality levels (240p, 480p, 720p) and points to their individual manifests. A typical manifest file for HTTP Live Streaming (HLS) has the extension .m3u8, while a Dynamic Adaptive Streaming over HTTP (DASH) manifest ends in .mpd. In this article, we'll be using the HLS format.
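To make that concrete, here is roughly what a single rendition's playlist looks like after segmentation. This is an illustrative sketch: the tags are standard HLS, but the durations, segment names and counts will vary with your video.
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXT-X-PLAYLIST-TYPE:VOD
#EXTINF:6.000000,
240p0.ts
#EXTINF:6.000000,
240p1.ts
#EXT-X-ENDLIST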
Check out the differences between HLS and DASH below:
Feature | HLS (HTTP Live Streaming) | DASH (Dynamic Adaptive Streaming over HTTP)
--- | --- | ---
Origin | Created by Apple | Open standard
Playlist/Manifest | .m3u8 files | .mpd files
Device Support | Excellent on iOS/macOS, widely supported elsewhere | Most modern browsers/devices; not native on iOS Safari
Latency | Typically higher, improving with Low-Latency HLS | Can be lower, flexible options
In the next section you’ll be diving into the hands-on part of this article as you try out a basic Adaptive Bitrate Streaming example.
A Guide to Implementing Adaptive Bitrate Streaming
Implementing ABS involves a few steps, and in this section we'll break each of them down. Before starting, make sure FFmpeg is installed on your machine.
The Bash Script
This script creates the different video renditions. You'll start by creating a file named encoder.sh, a bash script, in a folder of your choice. For the purpose of this tutorial we'll name that folder "VideoBiz". In the bash file, type in the following code:
#!/bin/bash
INPUT=$1
OUTPUT_DIR=$2
mkdir -p $OUTPUT_DIR
# 240p
ffmpeg -i $INPUT -vf scale=-2:240 -c:a aac -ar 48000 -c:v h264 \
-b:v 400k -maxrate 400k -bufsize 800k -hls_time 6 -hls_playlist_type vod \
-f hls $OUTPUT_DIR/240p.m3u8
# 480p
ffmpeg -i $INPUT -vf scale=-2:480 -c:a aac -ar 48000 -c:v h264 \
-b:v 800k -maxrate 800k -bufsize 1200k -hls_time 6 -hls_playlist_type vod \
-f hls $OUTPUT_DIR/480p.m3u8
# 720p
ffmpeg -i $INPUT -vf scale=-2:720 -c:a aac -ar 48000 -c:v h264 \
-b:v 1500k -maxrate 1500k -bufsize 1500k -hls_time 6 -hls_playlist_type vod \
-f hls $OUTPUT_DIR/720p.m3u8
# create Master Playlist
cat <<EOF > $OUTPUT_DIR/master.m3u8
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=426x240
240p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=854x480
480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
720p.m3u8
EOF
Let’s go through each part of the code for better understanding. We’ll divide this script into three sections so you can follow along without getting lost. Each part serves a specific purpose and understanding them will make the whole process click. First, we start with:
#!/bin/bash
INPUT=$1
OUTPUT_DIR=$2
mkdir -p $OUTPUT_DIR
The first line, called a shebang, tells your system that this is a bash script. INPUT=$1 captures the script's first parameter, which in this case is your input video. You can pick any video for this purpose; I'll be using one named "jjk.mp4". If you recognize what video that might be, I really suggest we become friends. If not, proceed, because it's not important, just anime.
OUTPUT_DIR=$2 is the parameter for the output directory. Finally, mkdir -p $OUTPUT_DIR creates the output directory, where your different manifests will be stored.
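As an optional hardening step (my addition, not part of the original script), you could make the script fail fast when either argument is missing:
# Exit early with a usage hint if the input or output argument is absent
if [ -z "$INPUT" ] || [ -z "$OUTPUT_DIR" ]; then
  echo "Usage: ./encoder.sh <input-video> <output-dir>" >&2
  exit 1
fi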
# 240p
ffmpeg -i $INPUT -vf scale=-2:240 -c:a aac -ar 48000 -c:v h264 \
-b:v 400k -maxrate 400k -bufsize 800k -hls_time 6 -hls_playlist_type vod \
-f hls $OUTPUT_DIR/240p.m3u8
This second part consists of FFmpeg commands. You could look up the FFmpeg documentation, but you may get a little headache as it is quite expansive. It serves as a sort of encyclopedia for the tool, and people don't really read whole encyclopedias in one sitting, so it's smarter to look up the particular commands you need. That said, the flags above are only a few of the commands in the FFmpeg arsenal.
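Helpfully, FFmpeg can also print the documentation for just one component at a time, which is a much quicker way to explore the options this script relies on:
# Show only the options understood by the HLS muxer (hls_time, hls_playlist_type, ...)
ffmpeg -h muxer=hls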
-i stands for input and accepts an input file. -vf scale applies a scaling video filter; scale=-2:240 renders the video 240 pixels high (240p), while the -2 tells FFmpeg to calculate the width automatically so that the aspect ratio is preserved, rounded to an even number as most encoders require.
-c:a aac sets Advanced Audio Coding (AAC) as the codec for the audio, where -c stands for codec and :a targets the audio stream. AAC is a lossy audio compression format also used on platforms like Netflix.
-ar 48000 sets the audio sampling rate, meaning the sound is captured at 48,000 samples per second (48 kHz). Think of it like beating a drum 48,000 times every second.
-c:v h264 is the video counterpart of -c:a. Here the compression format is H.264 (also known as Advanced Video Coding), the world's most widely used lossy video compression standard.
-b:v 400k sets the average video bitrate to 400 kbps (kilobits per second), where -b stands for bitrate and :v targets the video stream.
-maxrate and -bufsize control how "spiky" the bitrate can get: -maxrate caps the instantaneous bitrate, while -bufsize sets the size of the rate-control buffer used to enforce that cap, so the stream never overwhelms a connection that can only just handle the rendition.
-hls_time 6 segments the video into chunks, six-second chunks in this case, while -hls_playlist_type vod sets the HLS playlist type to video on demand. This is one of two types that control how the manifest is written, the other being event/live.
-f hls $OUTPUT_DIR/240p.m3u8 sets the output format to HLS and writes this rendition's manifest (along with its segments) into the output directory.
The 480p and 720p blocks use the same flags explained here for 240p, only with larger resolutions, bitrates and buffer sizes, so you can go through them and make comparisons.
# create Master Playlist
cat <<EOF > $OUTPUT_DIR/master.m3u8
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=400000,RESOLUTION=426x240
240p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=854x480
480p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1500000,RESOLUTION=1280x720
720p.m3u8
EOF
The last part of the bash file writes the master manifest. It lists all three renditions, along with their bandwidth and resolution, and points to each rendition's own manifest.
Next, open Git Bash or your bash terminal and in the folder that contains your video sample and encoder.sh file, run the following:
chmod +x encoder.sh
./encoder.sh jjk.mp4 output_kaisen
What this does is make your bash file executable and then run it. You will see an intimidating log of data when it runs; this is just FFmpeg being chatty about what it's doing. It simply processes the input file according to the codecs, segment sizes and bitrates requested in the bash script, then generates the chunks, the individual manifests and the master manifest in a folder named output_kaisen in our example.
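If everything worked, the output folder should look something like this. By default FFmpeg names segments after each playlist, so the exact segment names and counts below are illustrative and will depend on your video's length:
output_kaisen/
├── 240p.m3u8
├── 240p0.ts
├── 240p1.ts
├── ...
├── 480p.m3u8
├── 480p0.ts
├── ...
├── 720p.m3u8
├── 720p0.ts
├── ...
└── master.m3u8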
The Encoder File
The TypeScript file we create here wraps the bash script for easy execution in Node.js. If you don't use TypeScript, you can adapt the same logic to any language you prefer. Once you've finished executing the bash script and found the output folder in your directory, you can proceed to create the wrapper, which we'll creatively name encoder.ts. Copy and paste the code below into your TypeScript file:
import { spawn } from "child_process";
import path from "path";

const inputPath = path.join(__dirname, "jjk.mp4");
const outputPath = path.join(__dirname, "output_kaisen");

// Named "child" rather than "process" so it doesn't shadow Node's global process object
const child = spawn("bash", ["encoder.sh", inputPath, outputPath]);

child.stdout.on("data", data => {
  console.log(`stdout: ${data}`);
});

child.stderr.on("data", data => {
  console.error(`stderr: ${data}`);
});

child.on("close", code => {
  console.log(`FFmpeg process exited with code: ${code}`);
});
This file isn't doing anything new; in fact, it does exactly what the commands below, which you have run before, do:
chmod +x encoder.sh
./encoder.sh jjk.mp4 output_kaisen
However, it is handy for when you'd like to re-run the encoder without manually running the executable from the terminal. It is also useful if you're considering integrating the encoder into a bigger app, such as a streaming service, and it acts as a wrapper for your bash script, as mentioned before. Let's go through each part of the code to figure out what it does.
You begin by importing spawn from the child_process module and path from the path module. spawn is used for running terminal commands in Node.js. In this case we run the bash script encoder.sh and supply the arguments for the input and output. You save the spawned process in a variable named child (you can call it whatever you want, though avoid the name process, which would shadow Node's global) and output data to the terminal (the intimidating message logs) using stdout. You can also log errors via stderr and react when the process closes, as shown in the code.
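If you do fold this into a bigger app, you'll likely want to know when encoding has finished rather than just watching logs. Here is a minimal sketch of a Promise wrapper around the same spawn call; the encode function name and its Promise<void> signature are my own choices for illustration:
import { spawn } from "child_process";

function encode(input: string, outputDir: string): Promise<void> {
  return new Promise((resolve, reject) => {
    const child = spawn("bash", ["encoder.sh", input, outputDir]);
    // Surface FFmpeg's progress logs, which arrive on stderr
    child.stderr.on("data", data => console.error(`stderr: ${data}`));
    // Fires if bash itself cannot be started
    child.on("error", reject);
    child.on("close", code => {
      if (code === 0) {
        resolve();
      } else {
        reject(new Error(`encoder.sh exited with code ${code}`));
      }
    });
  });
}

encode("jjk.mp4", "output_kaisen")
  .then(() => console.log("All renditions ready"))
  .catch(err => console.error("Encoding failed:", err));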
The HTML File
The last stage involves actually viewing our adaptive bitrate streams. We can do this with an HTML file and a video player for HLS:
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1.0">
  <title>HLS Test</title>
  <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
</head>
<body>
  <h1>HLS Streaming Test</h1>
  <video id="video" controls autoplay style="width: 100%; max-width: 720px;"></video>
  <script>
    const video = document.getElementById('video');
    const hlsSource = 'output_kaisen/master.m3u8';

    if (Hls.isSupported()) {
      // hls.js path: fetch and parse the manifest, then feed media to the <video> tag
      const hls = new Hls();
      hls.loadSource(hlsSource);
      hls.attachMedia(video);
      hls.on(Hls.Events.MANIFEST_PARSED, () => {
        video.play();
      });
    } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
      // Native HLS support (e.g. Safari): hand the manifest straight to the video element
      video.src = hlsSource;
      video.addEventListener('loadedmetadata', () => {
        video.play();
      });
    } else {
      console.error('HLS not supported in this browser.');
    }
  </script>
</body>
</html>
The code above creates a simple web page that plays an HLS stream (.m3u8 file). It uses Hls.js to make the stream work in browsers that don’t natively support HLS, while falling back to native playback in Safari or other HLS-capable browsers. It first checks if Hls.js is supported; if so, it loads the master.m3u8 manifest, attaches it to the <video> element, and starts playback once the manifest is parsed. If Hls.js isn’t available but the browser can handle HLS natively, it assigns the manifest directly to the video source and plays it. If neither method is supported, it logs an error to the console. In short, this page ensures that the HLS video stream can be played reliably across different browsers.
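If you'd like to watch the adaptation happen, hls.js fires a LEVEL_SWITCHED event every time the player changes rendition. As an optional addition inside the Hls.isSupported() branch, after hls.attachMedia(video), you could log each switch:
hls.on(Hls.Events.LEVEL_SWITCHED, (_event, data) => {
  // data.level is an index into hls.levels, the renditions listed in master.m3u8
  const level = hls.levels[data.level];
  console.log(`Now playing the ${level.height}p rendition at ${level.bitrate} bps`);
});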
In order to see this in action, serve your project locally so you can access it through a browser (two quick options are shown below). Paste the generated URL into your browser and watch your video play. To observe the chunks, open your browser's Developer Tools and check the Network tab. It is advisable to disable caching in Developer Tools as you test, so that your video won't play from cached files and real-time changes are reflected. Proceed to throttle the connection, moving from 3G to 4G and so on, and watch the magic of Adaptive Bitrate Streaming take place.
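Any static file server will do. For instance, assuming you have Node.js or Python 3 installed, either of these one-liners run from the VideoBiz folder works:
# Node.js (the http-server package via npx)
npx http-server -p 8080
# or Python 3's built-in server
python3 -m http.server 8080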
And just like that, you've created the next competitor to Netflix, and I can sense Greg Peters, the co-CEO of Netflix, shaking in his boots. While that's likely not true, this serves as a strong starting point for amazing products centered on media compression and ABS. Play around with the basics in this article and expand them to whatever limits you see fit.