WebApp: Building an AWS S3 Bucket File Uploader - A Hands-On Guide

Introduction

In today’s digital age, storing files in the cloud is a must. AWS S3 offers a secure and scalable storage solution, and combining it with Flask, the AWS SDK for Python (Boto3), and a bit of front-end magic gives you a lightweight S3 file uploader that can be integrated into any web project. Whether you’re a developer looking to expand your toolkit or a tech enthusiast wanting a hands-on project, this guide is for you.

What You’ll Build

We will create a web application that:

  • Allows users to select and upload files.

  • Uses JavaScript (with progress events) to display upload progress.

  • Uses Flask with the AWS SDK for Python (Boto3) as the backend to communicate with AWS S3.

  • Displays a list of uploaded files retrieved from S3.

  • Shows the bucket name to ensure that the correct S3 bucket is being used.

Files in the Project:

  • app.py – The main Python Flask application.

  • index.html – The front-end HTML interface.

  • script.js – JavaScript for handling uploads and dynamic updates.

  • style.css – CSS for styling the app (a minimal example is shown after the index.html walkthrough).

  • .env – Environment variables for AWS credentials and bucket name.

Key components:

  1. Client-facing UI to select (or drag and drop) and upload files; a drag-and-drop sketch appears after the script.js walkthrough.

  2. Real-time file list synchronization: lists the files uploaded to the S3 bucket.

  3. File upload progress bar.

  4. Error handling.

Setting up the file/folder structure in VS Code
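
A typical Flask layout for this project is sketched below (the project folder name is just an example). Flask's render_template() looks for HTML files in a templates folder by default and serves assets such as CSS, JavaScript, and images from a static folder, so that is where the front-end files go:

s3-file-uploader/
├── app.py
├── .env
├── templates/
│   └── index.html
└── static/
    ├── style.css
    ├── script.js
    └── logo.png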

Pre-requisites:

  1. An AWS account and access to the AWS Management Console.

  2. S3 bucket: create a uniquely named S3 bucket from the AWS Console.

  3. Create an access key for a newly created “s3-uploader-user” IAM user.

  4. Attach the AmazonS3FullAccess permissions policy to the “s3-uploader-user” IAM user.

  5. Python and VS Code installed on your local machine.

Note: AmazonS3FullAccess is not strictly necessary; you can instead grant only the granular permissions the app actually needs, as in the example below.
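
For this app, a least-privilege policy only needs permission to upload objects and list the bucket. Here is a minimal example policy (replace your-bucket-name with your actual bucket name):

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowObjectUploads",
            "Effect": "Allow",
            "Action": "s3:PutObject",
            "Resource": "arn:aws:s3:::your-bucket-name/*"
        },
        {
            "Sid": "AllowBucketListing",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::your-bucket-name"
        }
    ]
}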

Setting Up the Backend (app.py)

Below is the code for our Python app, which uses the AWS SDK for Python (Boto3).

This script creates several routes:

  • / to serve the main page.

  • /upload (and /api/upload) to handle file uploads.

  • /list to retrieve the list of files stored in S3.

  • /bucket to return the bucket name for verification.

import os
from flask import Flask, render_template, request, jsonify
import boto3
from botocore.exceptions import ClientError
from dotenv import load_dotenv

# Load variables from the .env file into the process environment.
load_dotenv()

app = Flask(__name__)

# Load AWS credentials and bucket name from environment variables.
AWS_ACCESS_KEY = os.environ.get('AWS_ACCESS_KEY_ID')
AWS_SECRET_KEY = os.environ.get('AWS_SECRET_ACCESS_KEY')
BUCKET_NAME = os.environ.get('AWS_S3_BUCKET') or 'your-bucket-name'

# Create an S3 client using the boto3 library.
s3_client = boto3.client(
    's3',
    aws_access_key_id=AWS_ACCESS_KEY,
    aws_secret_access_key=AWS_SECRET_KEY
)

# Home route renders the index.html template.
@app.route("/")
def index():
    return render_template("index.html")

# File upload route.
@app.route("/upload", methods=["POST"])
def upload():
    if "file" not in request.files:
        return jsonify({"error": "No file part in the request"}), 400

    file = request.files["file"]
    if file.filename == "":
        return jsonify({"error": "No selected file"}), 400

    try:
        # Upload the file to the specified S3 bucket.
        s3_client.upload_fileobj(file, BUCKET_NAME, file.filename)
        return jsonify({"message": "File uploaded successfully", "file": file.filename})
    except ClientError as e:
        return jsonify({"error": str(e)}), 500

# List files stored in the S3 bucket.
@app.route("/list", methods=["GET"])
def list_files():
    try:
        response = s3_client.list_objects_v2(Bucket=BUCKET_NAME)
        files = [obj["Key"] for obj in response.get('Contents', [])]
        return jsonify({"files": files})
    except ClientError as e:
        return jsonify({"error": str(e)}), 500

# Return the bucket name.
@app.route('/bucket', methods=['GET'])
def get_bucket_name():
    return jsonify({"bucketName": BUCKET_NAME})

# A second API endpoint for uploading (if needed).
@app.route('/api/upload', methods=['POST'])
def api_upload():
    if "file" not in request.files:
        return jsonify({"error": "No file part in the request"}), 400

    file = request.files["file"]
    if file.filename == "":
        return jsonify({"error": "No selected file"}), 400

    try:
        s3_client.upload_fileobj(file, BUCKET_NAME, file.filename)
        return jsonify({"message": "File uploaded successfully", "file": file.filename})
    except ClientError as e:
        return jsonify({"error": str(e)}), 500

if __name__ == "__main__":
    app.run(debug=True)

Key Points

  • AWS Credentials: The app loads AWS keys and the bucket name from environment variables (the .env file, read via python-dotenv’s load_dotenv()), so credentials never appear in the code.

  • S3 Client: The boto3 client is initialized with your credentials, allowing file uploads to S3.

  • Routes:

    • The /upload endpoint checks for a file in the request and uploads it using upload_fileobj.

    • The /list endpoint retrieves all objects (files) in the bucket.

    • The /bucket endpoint returns the bucket name, which is useful for debugging and ensuring your app is connected to the correct S3 bucket.
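
Once the app is running (see “Running the App” below), you can exercise these routes without the browser. The sketch below uses the third-party requests library, which is not part of this project, and assumes the app is listening on http://127.0.0.1:5000 and that a local file named test.txt exists:

# Minimal smoke test for the Flask endpoints (illustrative only).
import requests

BASE = "http://127.0.0.1:5000"

# Check which bucket the app is configured to use.
print(requests.get(f"{BASE}/bucket").json())

# Upload a file the same way the browser form does (multipart/form-data).
with open("test.txt", "rb") as f:
    response = requests.post(f"{BASE}/upload", files={"file": ("test.txt", f)})
print(response.json())

# List the objects now stored in the bucket.
print(requests.get(f"{BASE}/list").json())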

Front-End Implementation: index.html & script.js

The front end of our app provides a user-friendly interface for uploading files and viewing the list of uploaded files. Although the full index.html isn’t listed in the original snippet, here’s an example based on the provided JavaScript code.

index.html

<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <title>AWS S3 File Upload</title>
    <link rel="stylesheet" href="{{ url_for('static', filename='style.css') }}">
</head>
<body>
    <header>
        <img src="{{ url_for('static', filename='logo.png') }}" alt="App Logo">
        <!-- Your app / brand logo; place logo.png in the Flask static folder -->
        <h1>Upload Your Files to AWS S3</h1>
    </header>

    <div id="bucketDisplay">Bucket Name: Loading...</div>

    <!-- File Upload Form -->
    <form id="uploadForm">
        <input type="file" id="fileInput" name="file">
        <button type="submit">Upload File</button>
    </form>

    <!-- Progress Bar -->
    <div id="progressContainer">
        <div id="progressBar"></div>
    </div>

    <!-- Files List Table -->
    <table id="filesTable">
        <thead>
            <tr>
                <th>Uploaded Files</th>
            </tr>
        </thead>
        <tbody id="filesBody">
            <!-- JavaScript will dynamically insert file names here -->
        </tbody>
    </table>

    <script src="{{ url_for('static', filename='script.js') }}"></script>
</body>
</html>

Code Explanation for index.html

  • Header: Displays a logo and title. Replace logo.png with your actual logo.

  • Bucket Display: A div with the id bucketDisplay will show the name of your S3 bucket (retrieved via the /bucket API).

  • File Upload Form:

    • The form with id="uploadForm" contains an <input> for file selection.

    • A submit button to trigger the upload.

  • Progress Bar:

    • Wrapped inside a container (#progressContainer) with an inner div (#progressBar) that visually represents the upload progress.

  • Files Table:

    • A table with the id filesTable is set up to display the list of uploaded files. The body (#filesBody) is populated dynamically via JavaScript.

  • Script Inclusion: The script.js file is included at the end of the body to handle the dynamic functionality.
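
style.css is listed in the project files but not shown above. Below is a minimal example that matches the ids used in index.html; the colors and sizes are placeholders you can restyle freely. The #progressBar rule matters most, since script.js animates its width from 0% to 100%.

/* Minimal styling for the uploader; adjust freely. */
body {
    font-family: Arial, sans-serif;
    max-width: 600px;
    margin: 2rem auto;
}

/* Outer track of the progress bar. */
#progressContainer {
    width: 100%;
    height: 12px;
    background-color: #e0e0e0;
    border-radius: 6px;
    margin: 1rem 0;
    overflow: hidden;
}

/* Inner bar; script.js sets its width as the upload progresses. */
#progressBar {
    width: 0%;
    height: 100%;
    background-color: #4caf50;
    transition: width 0.2s ease;
}

/* Uploaded files table. */
#filesTable {
    width: 100%;
    border-collapse: collapse;
    margin-top: 1rem;
}

#filesTable th,
#filesTable td {
    border: 1px solid #ccc;
    padding: 8px;
    text-align: left;
}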

Diving Into script.js

The JavaScript file is responsible for:

  • Handling the file upload via AJAX.

  • Displaying the upload progress.

  • Dynamically listing the files stored in the S3 bucket.

  • Fetching and displaying the S3 bucket name.

script.js

const uploadForm = document.getElementById('uploadForm');
const fileInput = document.getElementById('fileInput');
const progressBar = document.getElementById('progressBar');
const filesBody = document.getElementById('filesBody'); // Table body for files

// Function to list files from the S3 bucket.
function listFiles() {
    fetch('/list')
        .then(response => response.json())
        .then(data => {
            filesBody.innerHTML = ''; // Clear any previous file entries
            if (data.files) {
                data.files.forEach(file => {
                    const row = document.createElement('tr');
                    const cell = document.createElement('td');
                    cell.textContent = file;
                    row.appendChild(cell);
                    filesBody.appendChild(row);
                });
            }
        })
        .catch(console.error);
}

// Function to fetch and display the bucket name.
function fetchBucketName() {
    fetch('/bucket')
        .then(response => response.json())
        .then(data => {
            document.getElementById('bucketDisplay').textContent =
                "Bucket Name: " + data.bucketName;
        })
        .catch(console.error);
}

// Event listener for the upload form.
uploadForm.addEventListener('submit', function (e) {
    e.preventDefault();
    const file = fileInput.files[0];
    if (!file) return;

    const formData = new FormData();
    formData.append("file", file);

    const xhr = new XMLHttpRequest();
    xhr.open("POST", "/upload", true);

    // Update the progress bar during the file upload.
    xhr.upload.addEventListener("progress", function (evt) {
        if (evt.lengthComputable) {
            const percentComplete = (evt.loaded / evt.total) * 100;
            progressBar.style.width = percentComplete + '%';
        }
    }, false);

    // Once upload is complete, reset the progress bar and update the file list.
    xhr.onload = function () {
        progressBar.style.width = '0%';
        if (xhr.status === 200) {
            listFiles(); // Refresh file list upon successful upload.
        } else {
            alert("Error uploading file: " + xhr.responseText);
        }
    };

    xhr.send(formData);
});

// On window load, initialize the file list and bucket name display.
window.onload = function() {
    listFiles();
    fetchBucketName();
};

Key Points

  • File Upload Handling:

    • The script prevents the default form submission behavior.

    • It creates a FormData object to package the file data.

    • An XMLHttpRequest is used to post the file to the /upload endpoint (XHR is used rather than fetch because xhr.upload exposes progress events).

  • Progress Bar:

    • The xhr.upload.addEventListener("progress", ...) handler calculates the percentage of the upload and adjusts the width of the progress bar accordingly.

  • Dynamic Content:

    • listFiles() fetches the list of files from the S3 bucket and dynamically updates the table in the HTML.

    • fetchBucketName() retrieves the bucket name from the server to display it for user confirmation.

  • User Experience:

    • The upload progress, immediate file list update after a successful upload, and bucket name display all contribute to a seamless user experience.
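
The key components list at the top mentions drag-and-drop, while the form above only uses a standard file input. Here is a minimal sketch of how drag-and-drop could be wired into the same upload flow; it assumes you also add a <div id="dropZone">Drop files here</div> element to index.html:

// Minimal drag-and-drop sketch that reuses the existing upload flow.
// Assumes index.html also contains: <div id="dropZone">Drop files here</div>
const dropZone = document.getElementById('dropZone');

dropZone.addEventListener('dragover', function (e) {
    e.preventDefault(); // Required so the browser allows dropping here.
});

dropZone.addEventListener('drop', function (e) {
    e.preventDefault();
    const file = e.dataTransfer.files[0];
    if (!file) return;

    // Put the dropped file into the existing <input type="file"> and submit
    // the form, so the same XHR upload and progress handling run as before.
    const dt = new DataTransfer();
    dt.items.add(file);
    fileInput.files = dt.files;
    uploadForm.requestSubmit();
});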

Environment Setup (.env)

Make sure to create a .env file (or set environment variables in your deployment environment) with the following content:

AWS_ACCESS_KEY_ID=your-access-key
AWS_SECRET_ACCESS_KEY=your-secret-key
AWS_S3_BUCKET=file-upload-s3-jpatel-aws-bucket

Replace the placeholders with your actual AWS credentials and bucket name, and keep the .env file out of version control.

Running the App

  1. Install Dependencies:
    Make sure you have Flask, boto3, and python-dotenv installed:

     pip install flask boto3 python-dotenv
    

  2. Set Up AWS Credentials:
    Ensure your AWS credentials and bucket name are properly configured in your .env file or environment.

  3. Run the S3Upload App locally:
    Start the application by running:

     python app.py
    

    Your application should now be accessible locally at http://127.0.0.1:5000/
    Note: if the two commands above run successfully, your application will be served at that localhost address, which means it is working.

Voilà! Your app is now up and running at http://127.0.0.1:5000/

Conclusion

We’ve just built a fully functional file uploader that stores files directly in an AWS S3 bucket using the AWS SDK for Python (Boto3), Flask, and JavaScript! This guide covered:

  • Configuring your web app to work with AWS S3 using the AWS SDK for Python (Boto3).

  • Creating endpoints for file upload and file listing.

  • Building a responsive front end with HTML, CSS, and JavaScript.

  • Implementing real-time progress updates during file uploads.

This project not only demonstrates the power of combining these technologies but also gives you a solid foundation for integrating cloud storage into your own projects.

For more information and reference, please see the Boto3 documentation: Uploading Files.


Now that we've built a robust AWS S3 file uploader with Flask and JavaScript, our journey has just begun. I encourage you to take this guide further by adapting and expanding it to fit your unique needs. Experiment with various hosting providers, whether Heroku, AWS Elastic Beanstalk, DigitalOcean, or any other platform, to deploy your application and scale it to meet real-world demands, or expose it as an API endpoint to use in your own apps.

As you host and optimize your project, keep iterating: add new features, refine your code, and tailor the experience for your users. Share your improvements and insights with the community, and collaborate with fellow developers to push the boundaries of what's possible. Your creativity and persistence are the keys to transforming this simple guide into a fully-fledged, dynamic solution.

I hope this blog post has been helpful. If you have any further questions or encounter any issues, please feel free to leave a comment below.

Thank you for reading! Happy Learning!

Like and Follow for more content.

Thank you,
Jineshkumar Patel
