Running Multiple Background Jobs in ASP.NET Core (Channel + Worker Model)


In the previous blog, we discussed how to handle a single long-running task using BackgroundService in ASP.NET Core. This time, we’ll explore how to handle multiple long-running tasks concurrently, process them in the background, and track their progress in real time.
Problem Statement
Imagine an application where you need to process multiple heavy jobs simultaneously — for example:
Excel file imports
Video processing
Large file uploads
Email campaigns
If we process these tasks directly in API requests, users will experience timeouts. The solution is to use a background job queue with multiple workers.
Test Data Setup
For this demo, we prepared 5 Excel files named MOCK_DATA_1.xlsx
to MOCK_DATA_5.xlsx
. Each file contains 200 rows of dummy data with columns like Id
, FirstName
, LastName
, Email
, Gender
, and IpAddress
. All files are placed inside the Uploads
folder.
We generated this dummy dataset using Mockaroo, an online tool for creating realistic test data.
If you’d like to follow along, you can use Mockaroo to generate a similar dataset, or create your own with the same structure.
The background workers will pick these files from this folder and process them row by row.
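The imported rows end up in a database table via EF Core. The entity and DbContext aren’t listed in this post; based on the columns above, they presumably look something like the sketch below (the property names and the RowId surrogate key are assumptions, not the actual schema):
using System.ComponentModel.DataAnnotations;
using Microsoft.EntityFrameworkCore;

// Hypothetical entity matching the Excel columns; the real schema isn't shown in the post.
public class ImportedRow
{
    [Key]
    public int RowId { get; set; }                 // surrogate database key (assumed)
    public Guid JobId { get; set; }                // which background job imported this row
    public int Id { get; set; }                    // Id column from the Excel file
    public string FirstName { get; set; } = "";
    public string LastName { get; set; } = "";
    public string Email { get; set; } = "";
    public string Gender { get; set; } = "";
    public string IpAddress { get; set; } = "";
}

public class AppDbContext : DbContext
{
    public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }

    public DbSet<ImportedRow> ImportedRows => Set<ImportedRow>();
}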
Architecture Overview
We implemented a system following the Channel + Worker Model, which is well-suited for handling multiple background jobs concurrently. In this model:
Channel acts as the central job queue where producers (like API calls) can safely enqueue jobs.
Workers are background services that continuously read from the channel and process jobs.
Our implementation includes the following components:
Job Queue – Holds all jobs (the Channel).
Worker Service – Multiple background workers consume jobs from the queue.
Task Processor – Handles the actual work for each job (in our example, importing Excel rows).
Jobs API Controller – Enqueues new jobs and allows tracking their status.
Data flow looks like this:
Client → JobsController → JobQueue (Channel) → Workers → TaskProcessor → Database
Note: Excel import is used here as an example. You can replace TaskProcessor with any long-running operation such as sending emails, report generation, or payment processing.
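To make that swap concrete, one option (not part of the demo code, just a sketch) is to put the processing step behind a small interface and let the worker depend on it instead of the concrete ExcelProcessor:
// Hypothetical abstraction; the demo below wires the worker directly to ExcelProcessor,
// but any long-running operation could be plugged in behind an interface like this.
public interface ITaskProcessor
{
    // Returns a human-readable result message for the job store, e.g. "Imported 200 rows".
    Task<string> ProcessAsync(Guid jobId, string fileName, CancellationToken ct);
}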
Code Overview
Background Worker Service
JobWorkerService spins up N workers. Each worker continuously reads from the queue and processes jobs.
public sealed class JobWorkerService : BackgroundService
{
    private readonly IJobQueue _queue;
    private readonly IJobStore _store;
    private readonly IServiceScopeFactory _scopeFactory;
    private readonly WorkerOptions _opts;
    private readonly IWebHostEnvironment _env;
    private List<Task>? _workers;

    public JobWorkerService(
        IJobQueue queue,
        IJobStore store,
        IServiceScopeFactory scopeFactory,
        IOptions<WorkerOptions> opts,
        IWebHostEnvironment env)
    {
        _queue = queue;
        _store = store;
        _scopeFactory = scopeFactory;
        _env = env;
        _opts = opts.Value;
    }

    protected override Task ExecuteAsync(CancellationToken stoppingToken)
    {
        var n = Math.Max(1, _opts.WorkerCount);
        Console.WriteLine($"[WorkerService] Starting {n} workers...");

        _workers = Enumerable.Range(1, n)
            .Select(i => Task.Run(() => WorkerLoop(i, stoppingToken), stoppingToken))
            .ToList();

        return Task.CompletedTask;
    }

    private async Task WorkerLoop(int workerId, CancellationToken ct)
    {
        Console.WriteLine($"[Worker-{workerId}] Loop started.");
        try
        {
            await foreach (var job in _queue.ReadAllAsync(ct))
            {
                try
                {
                    Console.WriteLine($"[Worker-{workerId}] Picked job ({job.FileName}) at {DateTime.Now}");
                    _store.MarkStarted(job.JobId, $"Worker-{workerId} started");

                    // Locate file
                    var filePath = Path.Combine(_env.ContentRootPath, "Uploads", job.FileName);

                    using var scope = _scopeFactory.CreateScope();
                    var processor = scope.ServiceProvider.GetRequiredService<ExcelProcessor>();

                    var sw = System.Diagnostics.Stopwatch.StartNew();

                    // Import
                    var imported = await processor.ImportAsync(job.JobId, filePath, ct);

                    sw.Stop();
                    _store.MarkSucceeded(job.JobId, $"Imported {imported} rows");
                    Console.WriteLine($"[Worker-{workerId}] {job.FileName} → Completed in {sw.Elapsed.TotalSeconds:F1}s");
                }
                catch (OperationCanceledException)
                {
                    // Let shutdown cancellation bubble up instead of marking the job as failed.
                    throw;
                }
                catch (Exception ex)
                {
                    _store.MarkFailed(job.JobId, ex.Message);
                    Console.WriteLine($"[Worker-{workerId}] Job {job.JobId} FAILED: {ex.Message}");
                }
            }
        }
        catch (OperationCanceledException)
        {
            Console.WriteLine($"[Worker-{workerId}] Stopping due to cancellation.");
        }
    }

    public override async Task StopAsync(CancellationToken cancellationToken)
    {
        Console.WriteLine("[WorkerService] Stopping workers...");

        // base.StopAsync cancels the stopping token first, so the worker loops can exit;
        // waiting on the workers before that could hang until the host's shutdown timeout.
        await base.StopAsync(cancellationToken);

        if (_workers is not null)
            await Task.WhenAll(_workers);

        Console.WriteLine("[WorkerService] All workers stopped.");
    }
}
Job Queue
The job queue forms the Channel part of the Channel + Worker Model. We used System.Threading.Channels to safely enqueue and dequeue jobs in a thread-safe, producer-consumer manner.
public sealed class JobDto
{
    public Guid JobId { get; init; } = Guid.NewGuid();
    public required string UserId { get; init; }
    public required string FileName { get; init; }

    [JsonConverter(typeof(JsonStringEnumConverter))]
    public JobStatus Status { get; set; } = JobStatus.Queued;

    public string? Message { get; set; }
    public DateTimeOffset CreatedAt { get; init; } = DateTimeOffset.UtcNow;
    public DateTimeOffset? StartedAt { get; set; }
    public DateTimeOffset? CompletedAt { get; set; }
}

public interface IJobQueue
{
    ValueTask EnqueueAsync(JobDto job, CancellationToken ct = default);
    IAsyncEnumerable<JobDto> ReadAllAsync(CancellationToken ct);
}

public sealed class ChannelJobQueue : IJobQueue
{
    private readonly Channel<JobDto> _channel;

    public ChannelJobQueue(IOptions<WorkerOptions> opts)
    {
        var cap = Math.Max(1, opts.Value.ChannelCapacity);
        _channel = Channel.CreateBounded<JobDto>(new BoundedChannelOptions(cap)
        {
            SingleReader = false,
            SingleWriter = false,
            FullMode = BoundedChannelFullMode.Wait
        });
    }

    public ValueTask EnqueueAsync(JobDto job, CancellationToken ct = default)
    {
        return _channel.Writer.WriteAsync(job, ct);
    }

    public async IAsyncEnumerable<JobDto> ReadAllAsync(
        [System.Runtime.CompilerServices.EnumeratorCancellation] CancellationToken ct)
    {
        while (await _channel.Reader.WaitToReadAsync(ct))
        {
            while (_channel.Reader.TryRead(out var job))
                yield return job;
        }
    }
}
This ensures that:
Multiple producers (e.g., multiple API requests) can enqueue jobs concurrently.
Multiple consumers (background workers) can safely process jobs in parallel.
The queue (Channel) handles backpressure using bounded capacity.
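To see the backpressure behaviour in isolation, here is a small standalone sketch (not taken from the demo project) using a bounded channel with capacity 2 and FullMode = Wait:
using System.Threading.Channels;

// With FullMode = Wait, WriteAsync suspends once the channel is full
// and only resumes after a reader frees a slot.
var channel = Channel.CreateBounded<int>(new BoundedChannelOptions(2)
{
    FullMode = BoundedChannelFullMode.Wait
});

await channel.Writer.WriteAsync(1);              // completes immediately
await channel.Writer.WriteAsync(2);              // completes immediately (channel now full)

var thirdWrite = channel.Writer.WriteAsync(3).AsTask();
Console.WriteLine($"Third write completed? {thirdWrite.IsCompleted}");   // False

var item = await channel.Reader.ReadAsync();     // frees one slot
await thirdWrite;                                 // the pending write now completes
Console.WriteLine($"Read {item}; third write completed? {thirdWrite.IsCompleted}"); // True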
Job Status
We have an enum for job status:
public enum JobStatus
{
    Queued,
    Processing,
    Succeeded,
    Failed
}
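The workers report progress through an IJobStore (MarkStarted, MarkSucceeded, MarkFailed), whose implementation isn’t listed in this post. A minimal in-memory sketch could look like the following; the Add and GetAll members are assumptions, added so jobs can be registered when enqueued and read back by the /jobs endpoint:
using System.Collections.Concurrent;

public interface IJobStore
{
    void Add(JobDto job);
    void MarkStarted(Guid jobId, string message);
    void MarkSucceeded(Guid jobId, string message);
    void MarkFailed(Guid jobId, string message);
    IReadOnlyCollection<JobDto> GetAll();
}

// In-memory implementation; a real system might persist job status in a database.
public sealed class InMemoryJobStore : IJobStore
{
    private readonly ConcurrentDictionary<Guid, JobDto> _jobs = new();

    public void Add(JobDto job) => _jobs[job.JobId] = job;

    public void MarkStarted(Guid jobId, string message) =>
        Update(jobId, j => { j.Status = JobStatus.Processing; j.Message = message; j.StartedAt = DateTimeOffset.UtcNow; });

    public void MarkSucceeded(Guid jobId, string message) =>
        Update(jobId, j => { j.Status = JobStatus.Succeeded; j.Message = message; j.CompletedAt = DateTimeOffset.UtcNow; });

    public void MarkFailed(Guid jobId, string message) =>
        Update(jobId, j => { j.Status = JobStatus.Failed; j.Message = message; j.CompletedAt = DateTimeOffset.UtcNow; });

    public IReadOnlyCollection<JobDto> GetAll() => _jobs.Values.ToList();

    private void Update(Guid jobId, Action<JobDto> mutate)
    {
        if (_jobs.TryGetValue(jobId, out var job)) mutate(job);
    }
}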
Task Processor
Our ExcelProcessor inserts rows into the database. To simulate heavy work, we added await Task.Delay(500) after inserting each row into the database:
public sealed class ExcelProcessor
{
    private readonly IServiceScopeFactory _scopeFactory;

    public ExcelProcessor(IServiceScopeFactory scopeFactory)
    {
        _scopeFactory = scopeFactory;
    }

    public async Task<int> ImportAsync(Guid jobId, string filePath, CancellationToken ct)
    {
        if (!File.Exists(filePath))
            throw new FileNotFoundException("Excel not found", filePath);

        var sw = System.Diagnostics.Stopwatch.StartNew();

        using var workbook = new XLWorkbook(filePath);
        var ws = workbook.Worksheet("Data");            // sheet named "Data"
        var rows = ws.RangeUsed().RowsUsed().Skip(1);   // skip header

        using var scope = _scopeFactory.CreateScope();
        var db = scope.ServiceProvider.GetRequiredService<AppDbContext>();

        var total = 0;
        foreach (var r in rows)
        {
            ct.ThrowIfCancellationRequested();

            var item = new ImportedRow
            {
                JobId = jobId,
                Id = r.Cell(1).GetValue<int>(),
                FirstName = r.Cell(2).GetString(),
                LastName = r.Cell(3).GetString(),
                Email = r.Cell(4).GetString(),
                Gender = r.Cell(5).GetString(),
                IpAddress = r.Cell(6).GetString()
            };

            db.ImportedRows.Add(item);
            await db.SaveChangesAsync(ct);
            total++;

            await Task.Delay(500, ct); // simulate long processing
        }

        sw.Stop();
        return total;
    }
}
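The remaining component is the Jobs API controller. The post doesn’t list it, so the following is only a sketch of what the /simulate and /jobs endpoints might look like, assuming the in-memory IJobStore sketched above:
using Microsoft.AspNetCore.Mvc;

[ApiController]
public class JobsController : ControllerBase
{
    private readonly IJobQueue _queue;
    private readonly IJobStore _store;

    public JobsController(IJobQueue queue, IJobStore store)
    {
        _queue = queue;
        _store = store;
    }

    // Enqueues 5 Excel jobs, one per MOCK_DATA file in the Uploads folder.
    [HttpPost("/simulate")]
    public async Task<IActionResult> Simulate(CancellationToken ct)
    {
        for (var i = 1; i <= 5; i++)
        {
            var job = new JobDto { UserId = $"User {i}", FileName = $"MOCK_DATA_{i}.xlsx" };
            _store.Add(job);                    // register for status tracking
            await _queue.EnqueueAsync(job, ct); // hand off to the channel
        }
        return Accepted();
    }

    // Returns the current status of all known jobs.
    [HttpGet("/jobs")]
    public IActionResult GetJobs() => Ok(_store.GetAll());
}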
Step-by-Step Demo
Step 1: Start the Application
Run the API. In the console, you’ll see the workers starting up, since we have a maximum of 3 workers configured:
public sealed class WorkerOptions
{
    public const string Section = "Workers";

    public int WorkerCount { get; init; } = 3;
    public int ChannelCapacity { get; init; } = 100;
}
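The post doesn’t show the Program.cs wiring either; a typical registration for these pieces (assuming the in-memory job store sketched earlier and a "Workers" section in appsettings.json) might be:
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();

// Bind the "Workers" section (WorkerCount, ChannelCapacity) from appsettings.json.
builder.Services.Configure<WorkerOptions>(builder.Configuration.GetSection(WorkerOptions.Section));

// One shared queue and job store, consumed by the hosted worker service.
builder.Services.AddSingleton<IJobQueue, ChannelJobQueue>();
builder.Services.AddSingleton<IJobStore, InMemoryJobStore>();
builder.Services.AddScoped<ExcelProcessor>();
// builder.Services.AddDbContext<AppDbContext>(...); // provider and connection string are not shown in the post
builder.Services.AddHostedService<JobWorkerService>();

var app = builder.Build();
app.MapControllers();
app.Run();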
Step 2: Enqueue Jobs via /simulate
We created a /simulate endpoint that enqueues 5 Excel jobs. The system has 3 workers, so 3 jobs start immediately, and 2 remain queued.
Hitting the /jobs endpoint during processing shows:
3 jobs Processing
2 jobs Queued
[
  {
    "jobId": "84aba87d-aa21-4125-b7a6-e1fb4a24e7c5",
    "userId": "User 5",
    "fileName": "MOCK_DATA_5.xlsx",
    "status": "Queued",
    "message": null,
    "createdAt": "2025-08-18T12:30:43.7107551+00:00",
    "startedAt": null,
    "completedAt": null
  },
  {
    "jobId": "1450233a-322e-4af4-adab-49113932c5e1",
    "userId": "User 4",
    "fileName": "MOCK_DATA_4.xlsx",
    "status": "Queued",
    "message": null,
    "createdAt": "2025-08-18T12:30:43.7107544+00:00",
    "startedAt": null,
    "completedAt": null
  },
  {
    "jobId": "7965597a-d306-4acc-9edc-99cff6e73374",
    "userId": "User 3",
    "fileName": "MOCK_DATA_3.xlsx",
    "status": "Processing",
    "message": "Worker-1 started",
    "createdAt": "2025-08-18T12:30:43.7107532+00:00",
    "startedAt": "2025-08-18T12:30:43.7268583+00:00",
    "completedAt": null
  },
  {
    "jobId": "13549d31-ec76-4231-a7b4-e4505fe2801b",
    "userId": "User 2",
    "fileName": "MOCK_DATA_2.xlsx",
    "status": "Processing",
    "message": "Worker-2 started",
    "createdAt": "2025-08-18T12:30:43.7106414+00:00",
    "startedAt": "2025-08-18T12:30:43.7268715+00:00",
    "completedAt": null
  },
  {
    "jobId": "aa55681d-5ff4-4e83-9369-83f41ab2bd1e",
    "userId": "User 1",
    "fileName": "MOCK_DATA_1.xlsx",
    "status": "Processing",
    "message": "Worker-3 started",
    "createdAt": "2025-08-18T12:30:43.7087232+00:00",
    "startedAt": "2025-08-18T12:30:43.7269524+00:00",
    "completedAt": null
  }
]
Step 3: Wait for Completion
After all rows are processed (simulated with Task.Delay(500) per row), the console output shows each job completing. By observing the completion timestamps, we can see that the workers processed jobs in parallel.
Hitting /jobs again shows all jobs succeeded.
[
  {
    "jobId": "84aba87d-aa21-4125-b7a6-e1fb4a24e7c5",
    "userId": "User 5",
    "fileName": "MOCK_DATA_5.xlsx",
    "status": "Succeeded",
    "message": "Imported 200 rows",
    "createdAt": "2025-08-18T12:30:43.7107551+00:00",
    "startedAt": "2025-08-18T12:32:41.3426324+00:00",
    "completedAt": "2025-08-18T12:34:34.7432123+00:00"
  },
  {
    "jobId": "1450233a-322e-4af4-adab-49113932c5e1",
    "userId": "User 4",
    "fileName": "MOCK_DATA_4.xlsx",
    "status": "Succeeded",
    "message": "Imported 200 rows",
    "createdAt": "2025-08-18T12:30:43.7107544+00:00",
    "startedAt": "2025-08-18T12:32:40.8184769+00:00",
    "completedAt": "2025-08-18T12:34:34.3683501+00:00"
  },
  {
    "jobId": "7965597a-d306-4acc-9edc-99cff6e73374",
    "userId": "User 3",
    "fileName": "MOCK_DATA_3.xlsx",
    "status": "Succeeded",
    "message": "Imported 200 rows",
    "createdAt": "2025-08-18T12:30:43.7107532+00:00",
    "startedAt": "2025-08-18T12:30:43.7268583+00:00",
    "completedAt": "2025-08-18T12:32:42.0638686+00:00"
  },
  {
    "jobId": "13549d31-ec76-4231-a7b4-e4505fe2801b",
    "userId": "User 2",
    "fileName": "MOCK_DATA_2.xlsx",
    "status": "Succeeded",
    "message": "Imported 200 rows",
    "createdAt": "2025-08-18T12:30:43.7106414+00:00",
    "startedAt": "2025-08-18T12:30:43.7268715+00:00",
    "completedAt": "2025-08-18T12:32:40.8088294+00:00"
  },
  {
    "jobId": "aa55681d-5ff4-4e83-9369-83f41ab2bd1e",
    "userId": "User 1",
    "fileName": "MOCK_DATA_1.xlsx",
    "status": "Succeeded",
    "message": "Imported 200 rows",
    "createdAt": "2025-08-18T12:30:43.7087232+00:00",
    "startedAt": "2025-08-18T12:30:43.7269524+00:00",
    "completedAt": "2025-08-18T12:32:41.3423034+00:00"
  }
]
The database table now contains all the imported rows.
All 1,000 rows (5 files × 200 rows each) have been inserted into the database.
Key Takeaways
Multiple workers allow concurrent background processing.
Channel<T> safely manages job queueing.
This system can handle any long-running task, not just Excel imports.
Real-time job tracking is easy via the /jobs endpoint.
Conclusion
This approach scales from single-job (our previous blog) to multi-job scenarios. By combining BackgroundService, Channels, and worker tasks, you can process multiple heavy jobs efficiently without blocking API requests.
Source Code
Source code is available on GitHub.