🐧Mounting an Amazon S3 Bucket in WSL Ubuntu Using s3fs

Dushyant Kumar

✅ Prerequisites

  • An Amazon S3 bucket (see here if you want to know how to create one). For the rest of the article we assume it is named my-s3-bucket and is created in the region ap-south-1.

  • An Amazon IAM user (with the corresponding Access Key ID and Secret Access Key) that has permission to access my-s3-bucket (see here for more info on that). For quick reference, if simple full access is the choice (it could also be read-only, or read-write with or without delete), on the AWS Console go to IAM -> Create User (Next, Next, Create user), then Create access key (save the keys for later) and Add permission with a policy like the one below (substitute my-s3-bucket).
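As a sketch, the full-access variant of that policy could look like this (trim the Action down if you only want read-only or no-delete access):

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": [
        "arn:aws:s3:::my-s3-bucket",
        "arn:aws:s3:::my-s3-bucket/*"
      ]
    }
  ]
}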

Once the policy is attached and the access key pair is created, save both values; we'll need them shortly.

✅ Introduction

Recently, I had to work with an Amazon S3 bucket as part of a project, but instead of uploading and downloading files manually or scripting with aws-cli, I wanted a more intuitive way to interact with it. Ideally, I wanted to mount the S3 bucket like a normal directory inside my Ubuntu WSL (Windows Subsystem for Linux).

After some trial and error (and permissions headaches), I got it working using s3fs. Here's a breakdown of the exact steps I followed and what each command does.

✅ What we use

  • WSL2 (Ubuntu 24.04 running via Windows Subsystem for Linux)

  • Amazon S3 bucket (mine is called aws-s3-snyk-buckett)

  • s3fs for mounting the bucket

  • AWS credentials with proper permissions to access the bucket

✔️ Step 1: Install s3fs

I started by updating my packages and installing s3fs. This tool lets you mount an S3 bucket as a FUSE filesystem — basically treating it like a folder.

sudo apt update
sudo apt install -y s3fs
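To double-check the install went through, s3fs can print its version:

s3fs --version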

✔️ Step 2: Setting Up AWS Credentials

s3fs doesn't use the usual AWS credential files; instead, it expects a single file in the format ACCESS_KEY_ID:SECRET_ACCESS_KEY. So we create that file (substitute your real keys):

Note: The chmod 600 step is critical. If the file isn't locked down, s3fs refuses to use it for security reasons.

echo ACCESS_KEY_ID:SECRET_ACCESS_KEY > ~/.passwd-s3fs
chmod 600 ~/.passwd-s3fs
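You can verify the permissions at a glance:

ls -l ~/.passwd-s3fs   # should show -rw------- (mode 600)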

✔️ Step 3: Creating a Mount Point

Next, I created the local directory where I wanted the S3 bucket to appear, sort of like plugging in a USB drive.

sudo mkdir -p /mnt/s3bucket
sudo chown $USER:$USER /mnt/s3bucket  # own the mount point as the current user

✔️ Step 4: Mounting the Bucket

This was the real magic.

s3fs aws-s3-snyk-buckett /mnt/s3bucket \
  -o use_cache=/tmp \
  -o passwd_file=${HOME}/.passwd-s3fs \
  -o url=https://s3.amazonaws.com \
  -o endpoint=ap-south-1 \
  -o umask=0022 \
  -o mp_umask=0022 \
  -o multireq_max=5 \
  -o parallel_count=15

Here's what each option does:

  • use_cache=/tmp: Uses local caching to improve performance.
  • passwd_file=...: Points to the credential file we made.
  • url=https://s3.amazonaws.com: Default URL for standard S3 buckets.
  • endpoint=ap-south-1: Region where my bucket lives (Mumbai in my case).
  • umask=0022: Sets permissions so files are 755 by default.
  • mp_umask=0022: Applies the same permission mask to the mount point itself.
  • multireq_max=5: Number of concurrent requests (improves performance).
  • parallel_count=15: Number of parallel file transfers allowed.
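One tip before moving on: since this is an ordinary FUSE mount, you can detach it at any time with a normal unmount:

sudo umount /mnt/s3bucket   # or: fusermount -u /mnt/s3bucket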

✔️ Step 5: Confirming the Mount

I wanted to be sure it worked, so I ran:

mount | grep s3fs
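If the mount is live, that prints an s3fs line for /mnt/s3bucket. df works as a second opinion (the reported size is nominal, since S3 has no real capacity limit):

df -h /mnt/s3bucket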

✔️ Step 6: Testing the Drive

I navigated into a subfolder (which already existed on the bucket) and tried writing a file:

cd /mnt/s3bucket/custom_repo
echo "This is the new project for testing" >> projecta.txt
ll   # Ubuntu's default alias for 'ls -alF'

The file was created successfully; you can also verify it from the S3 console.
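If you also have aws-cli configured with the same credentials, you can confirm the write from the API side too (the bucket and folder names here are from my setup; substitute yours):

aws s3 ls s3://aws-s3-snyk-buckett/custom_repo/   # projecta.txt should appear in the listing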

✌️ Wrapping Up

Honestly, once I figured out the right options and permissions, this setup turned out to be pretty smooth. Now I can work with my S3 bucket just like a regular folder in Ubuntu WSL, with no need to keep jumping between CLI commands and the AWS console.

If you're also juggling S3 files often, give this a shot; it's a small setup that makes a big difference.

And if you get stuck anywhere, feel free to drop a comment or reach out. Been there 😅.

#s3bucket #aws #cloudcomputing #Devops #90daysofdevops #happylearning

Follow for more such content:

LinkedIn: linkedin.com/in/dushyant-kumar-dk

Blog: dushyantkumark.hashnode.dev
