How to Set Up AWS SFTP Server with S3 Integration: A Complete Guide

Introduction
In today's cloud-first world, secure file transfer is crucial for businesses. AWS Transfer Family provides a fully managed file transfer service that enables you to set up SFTP servers that directly integrate with Amazon S3. This tutorial will walk you through creating a production-ready SFTP server with authentication, security, and S3 integration.
SFTP vs FTP vs FTPS: Understanding the Differences
Before diving into the setup, let's understand the key differences:
| Protocol | Security | Port | Encryption | Use Case |
| --- | --- | --- | --- | --- |
| FTP | None | 21 | No | Legacy systems (not recommended) |
| FTPS | SSL/TLS | 21/990 | Yes | Secure FTP with certificates |
| SFTP | SSH | 22 | Yes | Most secure, SSH-based |
Why Choose SFTP?
- Strong encryption using the SSH protocol
- Single port (22) - firewall friendly
- Multiple authentication methods (keys, passwords)
- Widely supported across platforms
- Perfect for cloud integration
Architecture Overview
The pieces fit together as follows (diagram in Mermaid syntax):
graph TB
A[Client SFTP] -->|Port 22| B[AWS Transfer Family]
B --> C[Lambda Authentication]
B --> D[Amazon S3 Bucket]
C --> E[IAM Roles & Policies]
F[Route 53] --> B
G[CloudWatch] --> B
Prerequisites
AWS Account with appropriate permissions
Basic understanding of IAM roles and policies
S3 bucket for file storage
Domain name (optional, for custom hostname)
Step 1: Create S3 Bucket
First, create an S3 bucket to store uploaded files:
# Create the S3 bucket
aws s3 mb s3://your-sftp-bucket --region us-east-1

# Enable versioning (optional)
aws s3api put-bucket-versioning \
  --bucket your-sftp-bucket \
  --versioning-configuration Status=Enabled
Step 2: Create IAM Roles and Policies
2.1 Create S3 Access Policy
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:ListBucket",
        "s3:GetBucketLocation"
      ],
      "Resource": "arn:aws:s3:::your-sftp-bucket"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ],
      "Resource": "arn:aws:s3:::your-sftp-bucket/*"
    }
  ]
}
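As a sanity check on the bucket-level vs object-level split above, the same policy document can be generated programmatically. A minimal Python sketch (the helper name `make_s3_access_policy` is just for illustration):

```python
import json

def make_s3_access_policy(bucket: str) -> dict:
    """Build the two-statement S3 policy: bucket-level actions go on the
    bucket ARN, object-level actions on the bucket's objects (/*)."""
    bucket_arn = f"arn:aws:s3:::{bucket}"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": bucket_arn,  # bucket-level actions
            },
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject"],
                "Resource": f"{bucket_arn}/*",  # object-level actions
            },
        ],
    }

policy = make_s3_access_policy("your-sftp-bucket")
print(json.dumps(policy, indent=2))
```

Mixing the two resource forms in one statement is a common source of "Access Denied" errors: `s3:ListBucket` only works against the bucket ARN, while `s3:GetObject`/`s3:PutObject` only work against `bucket/*`.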
2.2 Create SFTP Access Role
# Create the role that Transfer Family assumes on behalf of SFTP users
aws iam create-role \
  --role-name SFTP-S3-Access-Role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Principal": {
          "Service": "transfer.amazonaws.com"
        },
        "Action": "sts:AssumeRole"
      }
    ]
  }'

# Create the managed policy
# (the JSON from section 2.1, saved locally as sftp-s3-policy.json)
aws iam create-policy \
  --policy-name SFTP-S3-Access-Policy \
  --policy-document file://sftp-s3-policy.json

# Attach the policy to the role
aws iam attach-role-policy \
  --role-name SFTP-S3-Access-Role \
  --policy-arn arn:aws:iam::YOUR-ACCOUNT-ID:policy/SFTP-S3-Access-Policy
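Rather than baking the bucket name into every user's policy, Transfer Family also supports session-policy variables such as `${transfer:UserName}` and `${transfer:HomeBucket}`, which the service resolves per session (IAM does not interpolate them). A hedged sketch of one such scope-down policy that confines each user to their own prefix (the `Sid` values are illustrative):

```python
import json

# Per-user session policy template; ${transfer:...} placeholders are
# Transfer Family request variables, resolved when the session starts.
SESSION_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListHomePrefixOnly",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::${transfer:HomeBucket}",
            "Condition": {
                "StringLike": {
                    "s3:prefix": ["${transfer:UserName}/*", "${transfer:UserName}"]
                }
            },
        },
        {
            "Sid": "HomePrefixObjects",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            "Resource": "arn:aws:s3:::${transfer:HomeBucket}/${transfer:UserName}/*",
        },
    ],
}

print(json.dumps(SESSION_POLICY, indent=2))
```

Returning this as the `Policy` field from the authentication Lambda keeps one shared IAM role while still isolating users from each other.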
Step 3: Create Lambda Authentication Function
For custom username/password authentication, create a Lambda function:
import json

def lambda_handler(event, context):
    # Extract credentials and source IP from the Transfer Family event
    username = event.get('username', '')
    password = event.get('password', '')
    source_ip = event.get('sourceIp', '')

    # Define allowed IPs (optional security layer)
    allowed_ips = [
        '203.0.113.0',   # Example IP
        '198.51.100.0',  # Example IP
        # Add your client IPs here
    ]

    # IP filtering (optional)
    if allowed_ips and source_ip not in allowed_ips:
        print(f"Access denied for IP: {source_ip}")
        return {}

    # User database (in production, store credentials in AWS Secrets Manager)
    users = {
        'sftp-user': {
            'password': 'SecurePassword123!',
            'role': 'arn:aws:iam::YOUR-ACCOUNT-ID:role/SFTP-S3-Access-Role',
            'home_directory': '/your-sftp-bucket',
            'policy': json.dumps({
                "Version": "2012-10-17",
                "Statement": [
                    {
                        "Effect": "Allow",
                        "Action": ["s3:ListBucket"],
                        "Resource": "arn:aws:s3:::your-sftp-bucket"
                    },
                    {
                        "Effect": "Allow",
                        "Action": ["s3:GetObject", "s3:PutObject"],
                        "Resource": "arn:aws:s3:::your-sftp-bucket/*"
                    }
                ]
            })
        }
    }

    # Authenticate; an empty response tells Transfer Family to deny access
    if username in users and users[username]['password'] == password:
        print(f"Authentication successful for {username} from {source_ip}")
        return {
            'Role': users[username]['role'],
            'HomeDirectory': users[username]['home_directory'],
            'Policy': users[username]['policy']
        }
    else:
        print(f"Authentication failed for {username}")
        return {}
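One caveat with the handler above: comparing passwords with `==` leaks timing information. Python's standard `hmac.compare_digest` performs a constant-time comparison; a small drop-in sketch (the helper name is illustrative):

```python
import hmac

def passwords_match(supplied: str, stored: str) -> bool:
    """Constant-time password comparison, avoiding the timing side
    channel of a plain == check."""
    return hmac.compare_digest(supplied.encode(), stored.encode())

# Would replace `users[username]['password'] == password` in the handler
print(passwords_match("SecurePassword123!", "SecurePassword123!"))  # True
print(passwords_match("wrong", "SecurePassword123!"))               # False
```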
Grant Transfer Family Permission to Invoke Lambda
aws lambda add-permission \
  --function-name sftp-auth-function \
  --statement-id "transfer-family-invoke" \
  --action "lambda:InvokeFunction" \
  --principal "transfer.amazonaws.com" \
  --source-arn "arn:aws:transfer:us-east-1:YOUR-ACCOUNT-ID:server/SERVER-ID"
Step 4: Create Transfer Family SFTP Server
4.1 Using AWS Console
1. Navigate to the AWS Transfer Family Console
2. Click "Create server"
3. Configure settings:
   - Protocols: SFTP
   - Identity provider: Custom (AWS Lambda)
   - Lambda function: Select your authentication function
   - Endpoint: Publicly accessible
   - Domain: Amazon S3
4.2 Using AWS CLI
aws transfer create-server \
  --identity-provider-type AWS_LAMBDA \
  --identity-provider-details Function=arn:aws:lambda:us-east-1:YOUR-ACCOUNT-ID:function:sftp-auth-function \
  --endpoint-type PUBLIC \
  --domain S3 \
  --protocols SFTP \
  --tags Key=Name,Value=Production-SFTP-Server
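If you prefer Python over the CLI, the same server can be described with boto3's `transfer` client. A sketch with the API call left commented out so the snippet runs without AWS credentials (parameter names follow boto3's `create_server`; the account ID is a placeholder):

```python
# import boto3  # uncomment to actually create the server

ACCOUNT_ID = "YOUR-ACCOUNT-ID"  # placeholder, as in the CLI example

server_config = {
    "IdentityProviderType": "AWS_LAMBDA",
    "IdentityProviderDetails": {
        "Function": f"arn:aws:lambda:us-east-1:{ACCOUNT_ID}:function:sftp-auth-function"
    },
    "EndpointType": "PUBLIC",
    "Domain": "S3",
    "Protocols": ["SFTP"],
    "Tags": [{"Key": "Name", "Value": "Production-SFTP-Server"}],
}

# response = boto3.client("transfer").create_server(**server_config)
# print(response["ServerId"])
print(server_config)
```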
Step 5: Configure Custom Domain (Optional)
5.1 Update Route 53 DNS Record
# Create a CNAME record pointing to your Transfer Family endpoint
aws route53 change-resource-record-sets \
  --hosted-zone-id YOUR-ZONE-ID \
  --change-batch '{
    "Changes": [{
      "Action": "CREATE",
      "ResourceRecordSet": {
        "Name": "sftp.yourdomain.com",
        "Type": "CNAME",
        "TTL": 300,
        "ResourceRecords": [{"Value": "s-serverid.server.transfer.us-east-1.amazonaws.com"}]
      }
    }]
  }'
Step 6: Test Your SFTP Server
6.1 Command Line Testing
# Test connection (Linux/Mac)
sftp sftp-user@s-serverid.server.transfer.us-east-1.amazonaws.com
# Or with custom domain
sftp sftp-user@sftp.yourdomain.com
# Upload a file
sftp> put localfile.txt
sftp> ls
sftp> exit
6.2 Using SFTP Clients
FileZilla Configuration:
- Host: sftp.yourdomain.com
- Protocol: SFTP - SSH File Transfer Protocol
- Port: 22
- Username: sftp-user
- Password: SecurePassword123!
Step 7: Security Best Practices
7.1 IP Whitelisting
Implement IP restrictions in your Lambda function:
# Add to your Lambda function
allowed_ips = [
    '203.0.113.0/24',  # Office network
    '198.51.100.5',    # Specific client IP
]
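Note that plain list membership (`source_ip not in allowed_ips`, as in the Step 3 Lambda) will never match a CIDR entry like `203.0.113.0/24` against a concrete client address; range checks need the standard `ipaddress` module. A possible helper (the name `is_ip_allowed` is illustrative):

```python
import ipaddress

def is_ip_allowed(source_ip: str, allowed: list[str]) -> bool:
    """Return True if source_ip falls within any allowed entry.
    Entries may be single addresses ('198.51.100.5') or CIDR
    ranges ('203.0.113.0/24')."""
    try:
        ip = ipaddress.ip_address(source_ip)
    except ValueError:
        return False  # malformed address: deny
    return any(ip in ipaddress.ip_network(entry, strict=False)
               for entry in allowed)

allowed_ips = ['203.0.113.0/24', '198.51.100.5']
print(is_ip_allowed('203.0.113.42', allowed_ips))  # True (inside the /24)
print(is_ip_allowed('192.0.2.1', allowed_ips))     # False
```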
7.2 Strong Authentication
Use complex passwords or SSH keys
Implement multi-factor authentication where possible
Rotate credentials regularly
7.3 Monitoring and Logging
Enable CloudWatch logging:
# Create a log group
aws logs create-log-group --log-group-name /aws/transfer/s-serverid

# Enable structured logging on your server
aws transfer update-server \
  --server-id s-serverid \
  --structured-log-destinations arn:aws:logs:us-east-1:account:log-group:/aws/transfer/s-serverid
Step 8: Cost Optimization
8.1 S3 Lifecycle Policies
Configure automatic archiving for cost savings:
{
  "Rules": [{
    "ID": "ArchiveUploads",
    "Status": "Enabled",
    "Filter": {},
    "Transitions": [{
      "Days": 30,
      "StorageClass": "STANDARD_IA"
    }, {
      "Days": 90,
      "StorageClass": "GLACIER"
    }]
  }]
}
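To sanity-check how the rule above behaves, here is a tiny sketch mapping an object's age to the storage class it has reached (assumes objects start in STANDARD; the function name is illustrative):

```python
# Transitions from the lifecycle rule above, ordered by day threshold
TRANSITIONS = [(30, "STANDARD_IA"), (90, "GLACIER")]

def storage_class_after(days: int) -> str:
    """Return the storage class an object has reached `days` after
    upload, per the lifecycle rule above."""
    current = "STANDARD"
    for threshold, storage_class in TRANSITIONS:
        if days >= threshold:
            current = storage_class
    return current

print(storage_class_after(10))   # STANDARD
print(storage_class_after(45))   # STANDARD_IA
print(storage_class_after(120))  # GLACIER
```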
8.2 Server Management
Stop servers when not in use
Monitor data transfer costs
Use VPC endpoints for internal traffic
Complete SFTP Details Template
Here's what you'll provide to your clients:
SFTP Connection Details:
----------------------------
SFTP URL: sftp.yourdomain.com
Username: sftp-user
Password: SecurePassword123!
Port: 22 (default)
Protocol: SFTP
Path: / (root directory)
----------------------------
Client Instructions:
1. Use any SFTP client (FileZilla, WinSCP, command line)
2. Connect using the credentials above
3. Upload files to root directory
4. Files will automatically appear in S3 bucket
Troubleshooting Common Issues
Connection Timeouts
Check security groups (for VPC endpoints)
Verify DNS resolution
Confirm server is in "Online" status
Authentication Failures
Check Lambda function logs in CloudWatch
Verify IAM permissions
Test credentials manually
File Upload Issues
Confirm S3 bucket permissions
Check IAM role policies
Verify home directory configuration
Monitoring and Maintenance
CloudWatch Metrics
Monitor these key metrics:
ConnectionCount: Active connections
BytesIn/BytesOut: Data transfer
FilesIn/FilesOut: File transfer count
Regular Tasks
Review access logs monthly
Update credentials quarterly
Monitor S3 storage costs
Test disaster recovery procedures
Conclusion
AWS Transfer Family provides a robust, scalable solution for SFTP file transfers with direct S3 integration. This setup offers:
- Enterprise-grade security with encryption and authentication
- Seamless S3 integration for cloud-native workflows
- Cost-effective storage with lifecycle policies
- High availability with AWS managed infrastructure
- Custom domain support for professional branding
The combination of Lambda authentication, IP restrictions, and S3 lifecycle policies creates a production-ready file transfer solution that scales with your business needs.
Next Steps:
Implement monitoring dashboards
Set up automated backups
Configure disaster recovery procedures
Scale to multiple regions if needed
Have you successfully set up your AWS SFTP server? Share your experience and any additional tips in the comments below!
Tags: #AWS #SFTP #CloudComputing #FileTransfer #S3 #DevOps #CloudSecurity
Written by Harshit Paneri