Using MediaConvert Job Templates for HLS Transcoding in AWS


Creating MediaConvert job templates allows you to standardize and automate the transcoding process. By using job templates, you can easily manage multiple video conversion workflows without having to manually configure the same settings for each job. With AWS Lambda, you can automate the conversion process further, triggering transcoding jobs based on events such as new video uploads.
How to Create and Use a MediaConvert Job Template for HLS Streaming in AWS
In this tutorial, we will guide you through the process of setting up a MediaConvert job template for HLS (HTTP Live Streaming) and using it to convert video files stored in an S3 bucket into HLS format. AWS Elemental MediaConvert is a powerful service for video transcoding, and using job templates allows you to automate your video processing workflows.
Step 1: Create a MediaConvert Job Template
Before you can use AWS MediaConvert to convert video files to HLS, you must create a job template. A job template defines the settings MediaConvert will use for all video conversions processed through that template.
Here's how you can create a job template for HLS conversion:
Using AWS Management Console:
Navigate to the AWS Elemental MediaConvert console.
In the left navigation pane, select Job templates.
Click Create job template.
In the Name field, enter HLS_MultiResolution_Template (or any other name you prefer).
Under Settings, configure your video and audio settings. Here's a sample configuration that converts a video into a multi-resolution HLS stream:
Input Settings: Define your input file.
Output Groups: Specify the HLS output settings.
Output Settings: Choose M3U8 as the container, and configure video and audio codecs (e.g., H.264 for video and AAC for audio).
Sample Job Template JSON Configuration:
Here is an example JSON configuration for the HLS_MultiResolution_Template job template:
{
  "Description": "Output 1080p + 720p + 480p, each resolution into its own folder, linked together in one master HLS playlist.",
  "Category": "MediaConvert templates",
  "Name": "HLS_MultiResolution_Template",
  "Settings": {
    "TimecodeConfig": {
      "Source": "ZEROBASED"
    },
    "OutputGroups": [
      {
        "Name": "Apple HLS",
        "Outputs": [
          {
            "ContainerSettings": {
              "Container": "M3U8",
              "M3u8Settings": {}
            },
            "VideoDescription": {
              "CodecSettings": {
                "Codec": "H_264",
                "H264Settings": {
                  "MaxBitrate": 8000000,
                  "RateControlMode": "QVBR",
                  "SceneChangeDetect": "TRANSITION_DETECTION"
                }
              }
            },
            "AudioDescriptions": [
              {
                "CodecSettings": {
                  "Codec": "AAC",
                  "AacSettings": {
                    "Bitrate": 96000,
                    "CodingMode": "CODING_MODE_2_0",
                    "SampleRate": 48000
                  }
                }
              }
            ],
            "OutputSettings": {
              "HlsSettings": {}
            },
            "NameModifier": "_converted"
          }
        ],
        "OutputGroupSettings": {
          "Type": "HLS_GROUP_SETTINGS",
          "HlsGroupSettings": {
            "SegmentLength": 7,
            "Destination": "s3://<BUCKET>/assets/videos_hls/",
            "MinSegmentLength": 0,
            "SegmentControl": "SINGLE_FILE"
          }
        }
      }
    ],
    "Inputs": [
      {
        "AudioSelectors": {
          "Audio Selector 1": {
            "DefaultSelection": "DEFAULT"
          }
        },
        "VideoSelector": {},
        "TimecodeSource": "ZEROBASED"
      }
    ]
  },
  "AccelerationSettings": {
    "Mode": "DISABLED"
  },
  "StatusUpdateInterval": "SECONDS_60",
  "Priority": 0,
  "HopDestinations": []
}
Note that as written, this template defines only one output. To produce a true multi-resolution ladder, add one entry per rendition to the Outputs array, each with its own Width/Height, MaxBitrate, and a distinct NameModifier such as _1080p, _720p, or _480p.
Once your template is created, you can use it to automate video transcoding to HLS format.
Using AWS CLI:
You can also create the template through the AWS CLI. Here’s the command to create a job template from the JSON configuration:
aws mediaconvert create-job-template --name HLS_MultiResolution_Template --region eu-west-2 --cli-input-json file://hls_multiresolution_template.json
In this command, replace hls_multiresolution_template.json with the path to the JSON configuration file you created, and use the same region you plan to run jobs in (this tutorial uses eu-west-2). The template name passed to --name must match the Name field inside the JSON.
Step 2: Use the MediaConvert Job Template to Process Videos
Once the template is created, you can use it to process video files. In this example, we’ll use AWS Lambda to trigger a MediaConvert job when a new video file is uploaded to an S3 bucket.
Here’s an AWS Lambda function written in Node.js that creates a MediaConvert job using the HLS_MultiResolution_Template:
import { MediaConvert } from '@aws-sdk/client-mediaconvert';

export const handler = async (event) => {
  const s3Bucket = event.Records[0].s3.bucket.name;
  const s3Key = event.Records[0].s3.object.key;

  try {
    const mediaConvert = new MediaConvert({ region: 'eu-west-2' });
    const data = await mediaConvert.createJob({
      Role: 'arn:aws:iam::<your-account-id>:role/HLSPreProcessor',
      Settings: {
        Inputs: [
          {
            FileInput: `s3://${s3Bucket}/${s3Key}`,
          },
        ],
      },
      JobTemplate: 'HLS_MultiResolution_Template', // Reference the template you created
    });
    console.log('MediaConvert Job created:', data.Job.Id);
    return {
      statusCode: 200,
      body: JSON.stringify({
        message: 'MediaConvert job submitted successfully',
        jobId: data.Job.Id,
      }),
    };
  } catch (err) {
    console.error('Error creating MediaConvert job:', err);
    return {
      statusCode: 500,
      body: JSON.stringify({
        message: 'Failed to create MediaConvert job',
        error: err.message,
      }),
    };
  }
};
This function is triggered whenever a new video is uploaded to the specified S3 bucket. It uses the MediaConvert API to create a transcoding job based on the HLS_MultiResolution_Template you created earlier.
Make sure to configure your Lambda function with the appropriate IAM role that has permissions to invoke MediaConvert and access S3.
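One detail worth handling in that function: S3 event notifications URL-encode the object key (spaces arrive as + and special characters as %XX escapes), so passing the raw key to FileInput can point MediaConvert at a non-existent object. A small helper, shown here as a sketch you can fold into the handler, normalizes the key before building the s3:// URI:

```javascript
// S3 event notifications deliver object keys URL-encoded:
// spaces become "+" and other characters become %XX escapes.
// Decode the key before handing it to MediaConvert's FileInput.
function decodeS3Key(rawKey) {
  return decodeURIComponent(rawKey.replace(/\+/g, ' '));
}

// Build the FileInput URI from the bucket and the raw event key.
const fileInput = (bucket, rawKey) => `s3://${bucket}/${decodeS3Key(rawKey)}`;
```

For example, an uploaded file named "my video (2024).mp4" arrives in the event as my+video+%282024%29.mp4 and must be decoded before use.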
Step 3: Monitor and Retrieve the Job Status
After submitting the MediaConvert job, you can monitor its status with the GetJob API (from the CLI: aws mediaconvert get-job --id <job-id>) or in the AWS MediaConvert console. If the job succeeds, your video will be converted and saved to the specified S3 bucket, ready to be served as HLS segments.
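If you want to wait on a job programmatically rather than watch the console, you can poll its status until it reaches a terminal state (MediaConvert jobs end in COMPLETE, ERROR, or CANCELED). The sketch below keeps the status lookup injectable: in real use you would pass a function that calls MediaConvert's GetJob and returns the Job.Status field; that wiring is an assumption here, not a prescribed API.

```javascript
// Poll an injected status function until the job reaches a terminal
// state. In production, `getStatus` would wrap MediaConvert's GetJob
// call and return the job's Status field.
async function waitForJob(getStatus, { intervalMs = 5000, maxAttempts = 60 } = {}) {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const status = await getStatus();
    if (['COMPLETE', 'ERROR', 'CANCELED'].includes(status)) {
      return status; // terminal state reached
    }
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error('Timed out waiting for MediaConvert job');
}
```

Keeping the polling logic separate from the AWS client also makes it trivial to unit-test with a fake status source.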
Step 4: Permission Management
When dealing with MediaConvert, S3, and Lambda, it's crucial to set up proper permissions to ensure security and prevent unauthorized access. Here are the key permission management steps for each service:
1. MediaConvert Permissions
The IAM role that MediaConvert assumes to run your jobs (e.g., HLSPreProcessor) is a service role: MediaConvert uses it to read your input and write your output, so it needs S3 access rather than any mediaconvert:* actions (those belong to the caller that submits the job, covered in the Lambda section below):
IAM Role for MediaConvert (HLSPreProcessor):
s3:GetObject (for reading input files from S3)
s3:PutObject (for writing output files to S3)
Here’s an example IAM policy for the role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": [
        "arn:aws:s3:::<your-bucket-name>/*"
      ]
    }
  ]
}
Make sure to replace <your-bucket-name> with the actual name of your S3 bucket.
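For MediaConvert to assume this role at all, the role also needs a trust policy naming the MediaConvert service principal. A minimal example:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "mediaconvert.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```

Without this trust relationship, job submissions fail even if the role's permission policy is correct.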
2. Lambda Permissions
The Lambda function needs permission to submit MediaConvert jobs and to pass the MediaConvert service role; without iam:PassRole on HLSPreProcessor, CreateJob calls will be rejected with an access-denied error:
IAM Role for Lambda (LambdaExecutionRole):
mediaconvert:CreateJob
mediaconvert:GetJob
mediaconvert:DescribeEndpoints (the SDK may call this to discover your account-specific endpoint)
iam:PassRole (scoped to the HLSPreProcessor role)
Example IAM policy for the Lambda execution role (attach the AWS-managed AWSLambdaBasicExecutionRole policy as well so the function can write CloudWatch logs):
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "mediaconvert:CreateJob",
        "mediaconvert:GetJob",
        "mediaconvert:DescribeEndpoints"
      ],
      "Resource": "*"
    },
    {
      "Effect": "Allow",
      "Action": "iam:PassRole",
      "Resource": "arn:aws:iam::<your-account-id>:role/HLSPreProcessor"
    }
  ]
}
3. S3 Bucket Permissions
If both roles above already carry the identity policies shown, no bucket policy is required; a bucket policy is only needed when the bucket lives in a different account or you want an extra resource-side guard. Note that a bucket policy statement must name a Principal. For example, to allow the MediaConvert service role:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::<your-account-id>:role/HLSPreProcessor"
      },
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::<your-bucket-name>/*"
    }
  ]
}
Step 5: Automating the Multi-Resolution HLS Workflow with Lambda
In this module, you'll use Amazon S3 events and Lambda to automatically trigger AWS Elemental MediaConvert jobs. The ability to watch S3 folders and act on incoming items is a useful automation technique that enables fully unattended workflows. In our case, the user uploads videos to a folder in S3, which triggers an ingest workflow to convert the video. We'll use the job definition from the previous steps as the basis for our automated job: a Lambda function that submits a MediaConvert job referencing a multi-resolution job template such as HLS_MultiResolution_Template.
The Lambda function from Step 2 already does exactly this: it reads the bucket and key from the incoming S3 event and submits a job that references HLS_MultiResolution_Template, so you can reuse it here unchanged.
✅ Important:
Make sure your HLS_MultiResolution_Template is configured to output multiple renditions (e.g., 1080p, 720p, 480p) with different bitrate settings under the same HLS output group. This ensures that your master playlist (.m3u8) references multiple variant playlists, enabling true adaptive bitrate streaming.
You also want to add an S3 trigger for PUT events so that your AWS Lambda function automatically runs when a new video is uploaded to your S3 bucket.
1. Add the S3 Trigger in AWS Console (Simple Method)
Go to AWS Lambda Console → Select your Lambda function.
Click "Add trigger".
Select S3.
Choose your S3 bucket (where videos are uploaded).
Event type: Choose "PUT" (Object Created - Put).
Prefix (optional): You can set it to something like uploads/ if you want the trigger to fire only for one folder. Do not point the prefix at your HLS output path (e.g., assets/videos_hls/); otherwise each job's output files would retrigger the function in an endless loop.
Suffix (optional): You can restrict it to specific file types like .mp4 (so only MP4 uploads trigger).
Example:
Setting | Value
------- | -----
Event Type | PUT
Prefix | (Optional) uploads/
Suffix | (Optional) .mp4
Click "Add".
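If you prefer to configure the trigger outside the console, the same settings can be expressed as an S3 event notification configuration. The function ARN, prefix, and filenames below are placeholders to substitute with your own values:

```json
{
  "LambdaFunctionConfigurations": [
    {
      "Id": "TriggerHlsTranscode",
      "LambdaFunctionArn": "arn:aws:lambda:eu-west-2:<your-account-id>:function:your-lambda-function-name",
      "Events": ["s3:ObjectCreated:Put"],
      "Filter": {
        "Key": {
          "FilterRules": [
            { "Name": "prefix", "Value": "uploads/" },
            { "Name": "suffix", "Value": ".mp4" }
          ]
        }
      }
    }
  ]
}
```

You would save this as notification.json and apply it with aws s3api put-bucket-notification-configuration --bucket <your-bucket-name> --notification-configuration file://notification.json.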
2. Add the Permission (Automatically Done, But FYI)
When you add the trigger, AWS will automatically add a permission allowing S3 to invoke your Lambda.
It will look something like this under Resource-based policy:
{
  "Effect": "Allow",
  "Principal": {
    "Service": "s3.amazonaws.com"
  },
  "Action": "lambda:InvokeFunction",
  "Resource": "arn:aws:lambda:eu-west-2:<your-account-id>:function:your-lambda-function-name",
  "Condition": {
    "ArnLike": {
      "AWS:SourceArn": "arn:aws:s3:::your-bucket-name"
    }
  }
}
3. Monitoring Lambda with CloudWatch Logs
When AWS Lambda functions run, they automatically send logs to Amazon CloudWatch Logs. This makes it easy to monitor executions, catch errors, and debug.
How CloudWatch Logging Works
Every time your Lambda function runs, AWS creates a log stream under a log group.
The log group is named after your Lambda function, for example:
/aws/lambda/your-lambda-function-name
Inside the log stream, you'll see:
START – Marks the beginning of the function execution.
Logs and console output – Anything you log using console.log() (Node.js) or print() (Python).
END – Marks the end of execution.
REPORT – Summary including duration, memory usage, and billing details.
Example log output:
START RequestId: 1234abcd-56ef-78ab-90cd-ef1234567890 Version: $LATEST
Processing S3 upload event...
Successfully started MediaConvert job.
END RequestId: 1234abcd-56ef-78ab-90cd-ef1234567890
REPORT RequestId: 1234abcd-56ef-78ab-90cd-ef1234567890 Duration: 2500 ms Billed Duration: 2500 ms Memory Size: 512 MB Max Memory Used: 120 MB
How to View Logs in CloudWatch
Go to the AWS Console → CloudWatch → Logs → Log groups.
Find the log group /aws/lambda/your-lambda-function-name.
Click into the log streams to see individual invocation logs.
Adding Custom Logs in Lambda
You can add your own custom logs to help with debugging.
Node.js Example:
console.log(JSON.stringify({
  level: "info",
  message: "Started transcoding job",
  jobId: mediaConvertJobId, // e.g., the Job.Id returned by createJob
}));
Setting Log Retention
By default, logs are retained forever unless you configure otherwise.
You can set a retention policy to automatically delete old logs and save costs:
Go to your Log Group in CloudWatch.
Click Actions → Edit Retention.
Set retention (e.g., 1 week, 1 month, 3 months).
Why CloudWatch Logging Is Important
Debugging: If your Lambda fails or behaves unexpectedly, CloudWatch gives you a complete trace of the event.
Monitoring: You can create metrics or alerts based on log patterns (e.g., detect if an error occurred).
Cost Management: Identifying inefficient code by reviewing memory usage and execution time.
Compliance: For systems needing audit trails, logs help demonstrate correct processing.
Later you can query these logs using CloudWatch Logs Insights.
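For example, a Logs Insights query like the following surfaces recent transcoding-job log lines (the matched message assumes the structured console.log shown above):

```
fields @timestamp, @message
| filter @message like /Started transcoding job/
| sort @timestamp desc
| limit 20
```

Run it against the /aws/lambda/your-lambda-function-name log group from the Logs Insights console.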
Notes on HTTP Live Streaming (HLS): Overview, Definition, and Considerations
HTTP Live Streaming, or HLS, is a popular adaptive bitrate streaming protocol that was initially developed by Apple. Over time, it has become the standard for online video streaming across a variety of platforms. HLS dynamically adjusts the quality of a video stream based on the viewer’s internet speed, ensuring smooth playback regardless of network fluctuations.
HLS works by segmenting video files into smaller chunks, each lasting between two and twelve seconds. These segments are delivered via HTTP from the server to the video player. The player buffers several of these segments, allowing it to recover quickly in case of network interruptions. This buffering minimizes disruption, delivering a seamless viewing experience.
In this subsection, we’ll explore the following:
History of HLS Video Streaming
An Overview of the HLS Format
Top 5 Advantages of HLS
Using HLS Video Streaming for Live Content
Using HLS Video Streaming for On-Demand Content
Latency Considerations for HLS
HLS Video Streaming Compatibility
How to Scale HLS Video Streaming
History of HLS Video Streaming
Apple introduced HTTP Live Streaming (HLS) in 2009. Originally developed for iOS and macOS devices, HLS quickly became the go-to format for streaming video across the web. While it started as a proprietary protocol, HLS is now widely supported by platforms outside of the Apple ecosystem, including major streaming services like Hulu, YouTube, and Twitch.
Today, HLS is synonymous with scalable video streaming, used to broadcast live events to audiences around the world. For instance, Twitch leverages HLS to scale their live streams and deliver content to thousands of concurrent viewers. The scalability and compatibility of HLS have made it an essential tool for live streaming large-scale events.
An Overview of the HLS Format
HLS uses a unique approach to deliver video content, leveraging M3U8 playlists. These playlists, stored in text files with the .m3u8 extension, contain references to various versions (or renditions) of a video, enabling adaptive bitrate streaming. They define which video segments are available for streaming, as well as where the video segments can be found.
Typically, Transport Stream (.ts) files are used to package the individual video segments. These files are downloaded by the video player and played sequentially. HLS generally uses H.264 (video codec) and AAC (audio codec) to encode video content, which are widely supported across different platforms.
These video segments and M3U8 files are often distributed across a Content Delivery Network (CDN), which ensures that the content is cached close to the viewer, reducing latency and improving delivery speed.
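To make this concrete, here is what a small master playlist might look like. The rendition names, bandwidths, and paths are illustrative, not the output of any specific encoder:

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=5000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
1080p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=2800000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
720p/playlist.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=842x480,CODECS="avc1.64001e,mp4a.40.2"
480p/playlist.m3u8
```

Each #EXT-X-STREAM-INF entry advertises one rendition; the player reads the BANDWIDTH attributes and switches between the referenced variant playlists as network conditions change.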
Top 5 Advantages of HLS Video Streaming
HLS offers several advantages over other streaming protocols. Here are the top five:
Compatibility Across Devices: HLS is natively supported on a wide range of devices, including iOS, macOS, Android, and smart TVs.
Wide Availability of HLS Players: HLS can be played on most major platforms through players like Mux’s hls-video element, HLS.js, and Plyr.io.
Uses Existing Video Formats: HLS works with widely used formats such as MP4, TS, H.264, and AAC.
Improved User Experience: HLS dynamically adjusts video quality based on network speed, providing a smoother experience with minimal buffering.
Scalability: Since HLS works over HTTP, it does not require specialized servers, making it easier and more cost-effective to scale compared to other streaming protocols.
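The "improved user experience" advantage above rests on adaptive bitrate selection. As a toy sketch (not any real player's algorithm), a player can pick the highest rendition whose advertised bandwidth fits under its measured throughput, with a safety margin; the rendition list and safety factor here are assumptions for illustration:

```javascript
// Toy model of adaptive bitrate selection: choose the highest-bandwidth
// rendition that fits within the measured throughput, discounted by a
// safety factor; fall back to the lowest rendition if none fits.
function selectRendition(renditions, measuredBps, safetyFactor = 0.8) {
  const affordable = renditions
    .filter((r) => r.bandwidth <= measuredBps * safetyFactor)
    .sort((a, b) => b.bandwidth - a.bandwidth);
  if (affordable.length > 0) return affordable[0];
  // No rendition fits: pick the lowest-bandwidth one to keep playing.
  return renditions.reduce((lo, r) => (r.bandwidth < lo.bandwidth ? r : lo));
}
```

Real players such as HLS.js add smoothing, buffer-occupancy heuristics, and hysteresis on top of this basic idea to avoid oscillating between renditions.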
Using HLS Video Streaming for Live Content
HLS is an ideal choice for live streaming because it adapts in real time to changing network conditions, ensuring viewers experience consistent playback. It’s widely used for high-profile live events like sports broadcasts, gaming streams, and live auctions.
Examples of live content using HLS:
FIFA World Cup on BBC iPlayer
eSports tournaments on platforms like Twitch
Live auctions on YouTube, such as those hosted by Sotheby's
Using HLS Video Streaming for On-Demand Content
HLS is also the preferred format for on-demand video streaming, providing reliable playback for viewers who can choose when to watch a video. Platforms like Netflix, Hulu, and YouTube rely on HLS to deliver content to millions of viewers worldwide.
Examples of on-demand content using HLS:
Music videos on YouTube
TV series on Amazon Prime Video
Online courses on MasterClass.com
Latency Considerations for HLS
While HLS is reliable and scalable, it traditionally prioritizes stream consistency over latency, which can lead to higher delays. However, Apple introduced Low-Latency HLS (LL-HLS) to address this concern. LL-HLS reduces the delay in streaming by optimizing how segments are generated and delivered.
By adopting LL-HLS, streaming platforms can offer near-real-time live streams, improving the experience for viewers.
HLS Video Streaming Compatibility
One of the main reasons HLS is so popular is its compatibility across platforms and devices. From web browsers to mobile devices and smart TVs, HLS works seamlessly across many different environments. The table below outlines HLS compatibility across popular platforms:
Platform | HLS Support
-------- | -----------
Web | Supported in major browsers like Safari, Chrome, Firefox, and Edge (via HTML5 players like Video.js or HLS.js)
iOS | Natively supported in Safari
Android | Supported via ExoPlayer and other HLS-capable players
TV | Supported on devices like Roku, Apple TV, Amazon Fire TV, Xbox, PS4, Samsung, and LG
How to Scale HLS Video Streaming
Scaling HLS streaming across multiple regions and devices can be a challenge due to the potential increase in latency and delivery costs. To address these issues, developers often use Content Delivery Networks (CDNs) to cache video segments closer to the end user, improving delivery speed and reducing strain on the origin server.
Because HLS uses HTTP, it can easily leverage existing CDN infrastructure, making it simpler to scale and cost-effective compared to other protocols. This is why CDNs are an integral part of streaming platforms that deliver high-quality video to global audiences.
This guide provides a comprehensive look at the technical aspects of HLS streaming, from understanding its format to addressing latency and scalability concerns. By leveraging HLS, developers can ensure reliable, adaptive video delivery across a wide range of devices and platforms, providing a seamless experience for their viewers.
Conclusion
MediaConvert job templates let you standardize and automate transcoding: one template serves every video conversion workflow, and with AWS Lambda the whole pipeline runs itself, triggered by events such as new video uploads.
Setting up the right IAM roles and permissions ensures that your workflows are secure and that only authorized entities can perform transcoding and access your resources.
We hope this tutorial has helped you get started with AWS MediaConvert, S3, Lambda, and HLS streaming, and that it gives you a complete picture of creating and using a MediaConvert job template with proper permission management. If you run into any issues, feel free to reach out for help!
Written by

Brian Kiplagat