Chunk File Uploads with Laravel and React + Inertia.js
Recently, I ran into AWS Lambda's 6 MB payload limit while using Laravel Vapor. Implementing chunk file uploads in Laravel paired with React turned out to be a simple way to overcome this challenge. The approach splits a large file into smaller pieces, which are uploaded sequentially and reassembled into a single file on the server. In this blog, we'll explore how to set up chunk file uploads using Laravel, React with TypeScript, and Inertia.js.
Prerequisites:
PHP >= 8.1
Node.js >= 18
Laravel >= 9
React 18 + TypeScript
Inertia.js
Database: whatever you already use. I used SQLite.
Create a model
php artisan make:model Upload
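The controller we'll build later stores each finished upload with Upload::create(['path' => ...]), so the path column has to be mass assignable. A minimal sketch of the model:
namespace App\Models;

use Illuminate\Database\Eloquent\Model;

class Upload extends Model
{
    // 'path' is mass assigned by the chunk upload controller below
    protected $fillable = ['path'];
}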
Create a migration for the Upload model
php artisan make:migration create_uploads_table
# after completing the files, run the `php artisan migrate` command
The migration should look like this:
public function up()
{
Schema::create('uploads', function (Blueprint $table) {
$table->id();
$table->string('path');
$table->timestamps();
});
}
Now, create an initial page with the inertia() helper.
namespace App\Http\Controllers;
use App\Http\Controllers\Controller;
use App\Models\Upload;
class ShowUploadsController extends Controller
{
public function __invoke()
{
return inertia('uploads', [
'uploads' => Upload::get(),
]);
}
}
Create a React component named Uploads in the resources/js/pages directory.
import React, { ChangeEvent } from 'react';
import { router as Inertia } from '@inertiajs/react';
import axios from 'axios';
import route from 'ziggy-js';
interface Upload {
path: string;
}
interface Prop {
uploads: Upload[];
}
export default function Uploads({ uploads }: Prop) {
async function chunkFileAndUpload(file: File) {
try {
if (file.size > 100 * 1024 * 1024) {
throw new Error('The file must be less than 100 MB');
}
const chunkSize = 1024 * 1024 * 4;
const chunks = Math.ceil(file.size / chunkSize);
for (let i = 0; i < chunks; i++) {
const start = i * chunkSize;
const end = Math.min(file.size, start + chunkSize);
const chunk = file.slice(start, end);
const formData = new FormData();
formData.append('file', chunk, file.name);
// FormData values must be strings or Blobs; Laravel's boolean rule accepts '1'/'0'
formData.append('is_last_chunk', i === chunks - 1 ? '1' : '0');
await axios.post(route('uploads.chunk'), formData);
}
Inertia.reload();
} catch (e: any) {
alert(e.message ?? 'Failed!');
}
}
function handleFileUpload(e: ChangeEvent<HTMLInputElement>) {
e.preventDefault();
const { files } = e.target;
if (!files) {
return;
}
const file: File = files[0];
chunkFileAndUpload(file);
}
return (
<div>
<input type={'file'} onChange={e => handleFileUpload(e)} />
<ul>
{uploads.map(upload => (
<li key={upload.path}>{upload.path}</li>
))}
</ul>
</div>
);
}
Uploads.layout = (page: any) => <div children={page} />;
The chunkFileAndUpload function starts by checking whether the file is larger than 100 megabytes; you can adjust that limit to fit your needs. It then sets a chunkSize of 4 megabytes (4 x 1024 x 1024 bytes), which determines how big each piece of the file will be when it is split up for uploading. Make sure it stays below AWS Lambda's 6 MB limit, and leave some headroom for the rest of the request body.
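For example, uploading a 10 MB file with a 4 MB chunk size results in Math.ceil(10 / 4) = 3 requests: two 4 MB chunks followed by a final 2 MB chunk.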
Now, let's implement the backend logic.
First, we need a support class. I created ChunkUploadService in the App\Support directory.
namespace App\Support;

use Illuminate\Http\UploadedFile;
use Illuminate\Support\Facades\File;
use Illuminate\Support\Facades\Storage;

class ChunkUploadService
{
    protected string $chunkPath;

    public function __construct(
        public UploadedFile $file,
        private readonly bool $isLastChunk,
        private readonly string $documentPath
    ) {
        if (!Storage::disk('local')->exists('chunks')) {
            Storage::disk('local')->makeDirectory('chunks');
        }

        // Instead, you could use a unique hash or UUID here so the chunk file doesn't get overridden.
        $tempFileName = $file->getClientOriginalName();

        $this->chunkPath = Storage::disk('local')->path(
            "chunks/{$tempFileName}"
        );
    }

    public function merge(): string|null
    {
        File::append($this->chunkPath, $this->file->get());

        if (!$this->isLastChunk) {
            return null;
        }

        $path = $this->documentPath . '/' . $this->file->getClientOriginalName();

        Storage::put($path, File::get($this->chunkPath));

        $this->deleteChunk();

        return $path;
    }

    public function deleteChunk(): void
    {
        File::delete($this->chunkPath);
    }
}
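As the comment in the constructor hints, naming the temporary chunk file after the original file name means two simultaneous uploads of identically named files would append into the same chunk. One possible variation, shown only as a sketch: have the client generate an id once per file (for example with crypto.randomUUID()), send it as a hypothetical upload_id field with every chunk's FormData, and use it in the temporary name.
// Hypothetical: build the temporary chunk path from a client-generated upload id
// so parallel uploads of files with the same name never collide.
$uploadId = request()->input('upload_id', $file->getClientOriginalName());

$this->chunkPath = Storage::disk('local')->path(
    "chunks/{$uploadId}_{$file->getClientOriginalName()}"
);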
Here is the controller.
namespace App\Http\Controllers;
use App\Http\Controllers\Controller;
use App\Models\Upload;
use App\Support\ChunkUploadService;
class ChunkUploadController extends Controller
{
public function __invoke()
{
$data = request()->validate([
'is_last_chunk' => ['required', 'boolean'],
]);
$chunkupload = new ChunkUploadService(
file: request()->file('file'),
isLastChunk: $data['is_last_chunk'],
documentPath: 'files'
);
try {
$path = $chunkupload->merge();
// if the last chunk, store the uploaded file
if ($path) {
Upload::create([
'path' => $path,
]);
return response()->json([
'complete' => true,
]);
}
return response()->json([
'complete' => false,
]);
} catch (\Exception $e) {
$chunkupload->deleteChunk();
throw $e;
}
}
}
Update your routes/web.php; we need two routes.
use App\Http\Controllers\ShowUploadsController;
use App\Http\Controllers\ChunkUploadController;
Route::get('uploads', ShowUploadsController::class)->name('upload.index');
Route::post('uploads-chunk', ChunkUploadController::class)->name('uploads.chunk');
That's it! Check your storage and confirm that the chunks have been combined into a single file.
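If you'd rather verify this programmatically, here is a minimal feature test sketch. It assumes the stock filesystems config where the default disk is local, and that the Upload model allows mass assigning path; the file name and chunk contents are arbitrary.
namespace Tests\Feature;

use Illuminate\Foundation\Testing\RefreshDatabase;
use Illuminate\Http\UploadedFile;
use Illuminate\Support\Facades\Storage;
use Tests\TestCase;

class ChunkUploadTest extends TestCase
{
    use RefreshDatabase;

    public function test_chunks_are_merged_into_a_single_file(): void
    {
        Storage::fake('local');

        // First chunk: the endpoint should report the upload as incomplete.
        $this->post(route('uploads.chunk'), [
            'file' => UploadedFile::fake()->createWithContent('video.mp4', str_repeat('a', 1024)),
            'is_last_chunk' => false,
        ])->assertJson(['complete' => false]);

        // Last chunk: the merged file should be written and a row created.
        $this->post(route('uploads.chunk'), [
            'file' => UploadedFile::fake()->createWithContent('video.mp4', str_repeat('b', 1024)),
            'is_last_chunk' => true,
        ])->assertJson(['complete' => true]);

        Storage::disk('local')->assertExists('files/video.mp4');
        $this->assertDatabaseHas('uploads', ['path' => 'files/video.mp4']);
    }
}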
In conclusion, the pivotal logic lives in the ChunkUploadService and ChunkUploadController, where files are managed and processed in manageable fragments. By following the step-by-step guide above, you can implement chunk file uploads in your Laravel application regardless of file size.
Additionally, this implementation works with your frontend framework of choice, be it Vue, Svelte, or vanilla JS, and with whichever CSS framework you prefer, ensuring a smooth and consistent integration experience.