Efficient Bulk File Upload and Processing with Chunked API Requests in Angular

sankati komali

#Angular #webdevelopment #javascript #API

1. Introduction

Handling bulk data uploads efficiently is a common requirement in enterprise applications. Recently, I worked on a feature that allowed users to upload an Excel file containing hundreds of members. The process involved validating the uploaded data and updating the system in chunks of 10 members at a time to avoid server timeouts and improve reliability.

In this blog, I’ll walk through the core concepts and strategies I used to solve this, without sharing any client-specific code. If you're building a similar feature in Angular or any frontend framework, you might find this helpful.

2. File Upload & Parsing

We started by allowing users to upload .xlsx or .csv files. Once uploaded, the file was read and parsed into a list of member objects.

For parsing Excel files on the frontend, you can use libraries like:

xlsx for reading Excel files

PapaParse for CSV files

Conceptual steps:

Read the uploaded file using a FileReader

Convert the sheet data to JSON

Store the result in an array for further validation
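Here's a minimal sketch of those steps using the xlsx library. The Member shape and the parseExcelFile name are placeholders for illustration, not the actual client code; adjust the interface to match your sheet's columns.

import * as XLSX from 'xlsx';

// Hypothetical shape of one parsed row; adapt to your spreadsheet.
interface Member {
  name: string;
  email: string;
  phone?: string;
}

function parseExcelFile(file: File): Promise<Member[]> {
  return new Promise((resolve, reject) => {
    const reader = new FileReader();
    reader.onload = () => {
      // Parse the workbook from the raw bytes
      const workbook = XLSX.read(reader.result, { type: 'array' });
      // Take the first sheet and convert its rows to JSON objects
      const sheet = workbook.Sheets[workbook.SheetNames[0]];
      resolve(XLSX.utils.sheet_to_json<Member>(sheet));
    };
    reader.onerror = () => reject(reader.error);
    reader.readAsArrayBuffer(file);
  });
}

In a real component, you would call this from the change handler of a file input and hold the resulting array for the validation step.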

3. Validating the Data

Before sending the data to the backend, it’s critical to validate it on the client side to prevent unnecessary API calls and ensure data quality.

Common validations included:

→ Required fields (e.g., email, name)

→ Valid formats (email, phone)

→ Duplicate entries

Invalid entries were collected into a separate list to show feedback to the user and allow corrections before submission.
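As a rough sketch, those checks might look like this, reusing the Member shape from the parsing example. The email regex and reason strings are simplified placeholders, not the exact rules we used.

const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

function validateMembers(members: Member[]) {
  const valid: Member[] = [];
  const invalid: { member: Member; reason: string }[] = [];
  const seenEmails = new Set<string>();

  for (const member of members) {
    if (!member.name || !member.email) {
      invalid.push({ member, reason: 'Missing required field' });
    } else if (!EMAIL_RE.test(member.email)) {
      invalid.push({ member, reason: 'Invalid email format' });
    } else if (seenEmails.has(member.email)) {
      invalid.push({ member, reason: 'Duplicate entry' });
    } else {
      seenEmails.add(member.email);
      valid.push(member);
    }
  }
  // valid goes on to chunking; invalid is shown to the user for correction
  return { valid, invalid };
}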

4. Chunking the Data

Uploading hundreds of records in a single API call can cause performance issues, including:

→ Timeout errors

→ Server-side validation overload

→ Partial failures with no trace

To avoid this, we split the member list into smaller chunks (e.g., 10 members per chunk).

🔹 Chunking Strategy:

Here's a conceptual way to break an array into chunks:

function splitIntoChunks(array, chunkSize) {
  const chunks = [];
  // Walk the array in steps of chunkSize, slicing out one chunk per step
  for (let i = 0; i < array.length; i += chunkSize) {
    chunks.push(array.slice(i, i + chunkSize));
  }
  return chunks;
}

This allowed us to break down a large member list into manageable pieces and send them one at a time.
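For example, with a chunk size of 10, a list of 95 members becomes ten chunks: nine full chunks and one of five.

const chunks = splitIntoChunks(members, 10);
// 95 members → 10 chunks of sizes [10, 10, ..., 10, 5]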

5. Uploading Chunks to the Server

Once the member list was chunked, we used a loop to sequentially send each chunk to the server. This helps in controlling the flow and handling errors gracefully.

🔹 Sequential Processing

For each chunk:

→ Call the backend API

→ Wait for the response

→ If successful, proceed to the next

→ If an error occurs, log it and optionally stop the process

This approach gave us better control over the flow and ensured that each batch was handled safely. A minimal sketch of that loop, using Angular's HttpClient and firstValueFrom from RxJS (the /api/members/bulk endpoint is a hypothetical placeholder; in a real app this would live in a service with HttpClient injected):
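import { HttpClient } from '@angular/common/http';
import { firstValueFrom } from 'rxjs';

async function uploadInChunks(http: HttpClient, chunks: Member[][]) {
  for (const [index, chunk] of chunks.entries()) {
    try {
      // Wait for each chunk to finish before sending the next
      await firstValueFrom(http.post('/api/members/bulk', chunk));
    } catch (error) {
      console.error(`Chunk ${index + 1} failed`, error);
      break; // optionally stop on the first failure
    }
  }
}

Awaiting each request before starting the next keeps server load predictable and makes it easy to report exactly which batch failed.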
