Mastering HTTP, JWT, bcrypt, Mongoose Aggregate Pagination, File Uploads with Multer and Cloudinary in Node.js

Introduction
When building a modern web application with Node.js, MongoDB, and Mongoose, you often need to:
- Secure user authentication with hashed passwords.
- Issue JWT tokens for authorization.
- Efficiently paginate large datasets using Mongoose aggregation.
This article deep-dives into these topics, explaining JWT, bcrypt, and mongoose-aggregate-paginate-v2—their use cases, implementation, and best practices.
1. Securing User Authentication with JWT & bcrypt
Why Use bcrypt for Password Hashing?
When users register, we store their passwords in a database. However, storing plaintext passwords is a security disaster waiting to happen. If your database is ever leaked, attackers can see user passwords in plain text.
To prevent this, we use bcrypt, a hashing algorithm designed specifically for securing passwords.
How bcrypt Works
Salting – A random value (salt) is added to the password before hashing to prevent rainbow table attacks.
Hashing – bcrypt applies multiple rounds of hashing, making brute-force attacks impractical.
Implementing bcrypt in Mongoose Schema
```javascript
import mongoose, { Schema } from 'mongoose';
import bcrypt from 'bcrypt';

const userSchema = new Schema(
  {
    username: { type: String, required: true, unique: true },
    email: { type: String, required: true, unique: true },
    password: { type: String, required: true }
  },
  { timestamps: true }
);

// Hash password before saving
userSchema.pre('save', async function (next) {
  if (!this.isModified('password')) return next();
  const salt = await bcrypt.genSalt(10); // Generate salt
  this.password = await bcrypt.hash(this.password, salt); // Hash password
  next();
});

// Method to verify password
userSchema.methods.isPasswordCorrect = async function (enteredPassword) {
  return await bcrypt.compare(enteredPassword, this.password);
};

const User = mongoose.model('User', userSchema);
export default User;
```
Best Practices for bcrypt
✔ Always use a salt when hashing passwords.
✔ Use a work factor (cost factor) of at least 10 for security.
✔ Never store plaintext passwords—store only the hashed version.
2. Authentication with JWT (JSON Web Token)
Why Use JWT?
JWT allows stateless authentication—once a user logs in, the server doesn’t need to store their session. Instead, the client holds a token, which is sent with every request for authentication.
Structure of a JWT Token
A JWT consists of three parts:
Header.Payload.Signature
Example JWT token:
```
eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9
.eyJ1c2VySWQiOiIxMjM0NTY3ODkwIiwiaWF0IjoxNjA3NTk2MjAwLCJleHAiOjE2MDc1OTk4MDB9
.4ZZJZ6Z8Gx8zW_o9BbMsoMNY_yZsUAGU9nS11l7NpXw
```
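You can inspect those three parts yourself: the header and payload are just base64url-encoded JSON. A sketch (the token below is hand-made for illustration, and decoding without verifying the signature must never be used for authentication):

```javascript
// Split a JWT into its parts and decode the two JSON segments.
const token =
  'eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9.' +   // header
  'eyJ1c2VySWQiOiIxMjM0NTY3ODkwIn0.' +         // payload
  'fake-signature';                             // signature (not checked here)

const [header, payload] = token.split('.');
const decode = (part) => JSON.parse(Buffer.from(part, 'base64url').toString());

console.log(decode(header));  // { alg: 'HS256', typ: 'JWT' }
console.log(decode(payload)); // { userId: '1234567890' }
```

Only the signature (verified with the server's secret) proves the payload wasn't tampered with.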
Generating Access and Refresh Tokens
We need two types of tokens:
- Access Token – Short-lived, used for API requests.
- Refresh Token – Long-lived, used to generate new access tokens.
Implementing JWT in User Model
```javascript
import jwt from 'jsonwebtoken';

userSchema.methods.generateAccessToken = function () {
  return jwt.sign(
    {
      _id: this._id,
      email: this.email,
      username: this.username
    },
    process.env.ACCESS_TOKEN_SECRET,
    { expiresIn: process.env.ACCESS_TOKEN_EXPIRY } // Typically 15 min
  );
};

userSchema.methods.generateRefreshToken = function () {
  return jwt.sign(
    { _id: this._id },
    process.env.REFRESH_TOKEN_SECRET,
    { expiresIn: process.env.REFRESH_TOKEN_EXPIRY } // Typically 7 days
  );
};
```
Best Practices for JWT
✔ Store refresh tokens securely (HTTP-only cookies, Redis, or DB).
✔ Use short-lived access tokens (15 min) and long-lived refresh tokens.
✔ Secure your JWT secret keys with `.env` files.
3. Efficient Pagination with Mongoose Aggregate Paginate
The Problem: Querying Large Datasets
When dealing with large datasets (e.g., millions of videos in a video-sharing app), fetching all records at once is inefficient. Instead, we use pagination to fetch results in chunks.
What is `mongoose-aggregate-paginate-v2`?
`mongoose-aggregate-paginate-v2` helps paginate MongoDB aggregation queries efficiently.
Why Aggregation?
Unlike simple `.find()` queries, MongoDB aggregation pipelines allow advanced data processing like:
✅ Filtering results (`$match`).
✅ Sorting (`$sort`).
✅ Grouping and counting (`$group`).
✅ Projecting fields (`$project`).
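These stages map naturally onto familiar array operations. An in-memory analogy in plain JavaScript (no database involved, sample data is made up):

```javascript
const videos = [
  { title: 'a', views: 10, isPublished: true },
  { title: 'b', views: 50, isPublished: true },
  { title: 'c', views: 99, isPublished: false }
];

const result = videos
  .filter((v) => v.isPublished)                   // ≈ $match
  .sort((x, y) => y.views - x.views)              // ≈ $sort (descending views)
  .map(({ title, views }) => ({ title, views })); // ≈ $project

console.log(result);
// → [ { title: 'b', views: 50 }, { title: 'a', views: 10 } ]
```

The real pipeline runs server-side in MongoDB, which is what makes it viable for millions of documents.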
Implementing Pagination in a Video Model
```javascript
import mongoose, { Schema } from 'mongoose';
import mongooseAggregatePaginate from 'mongoose-aggregate-paginate-v2';

const videoSchema = new Schema(
  {
    title: { type: String, required: true },
    owner: { type: Schema.Types.ObjectId, ref: 'User', required: true },
    views: { type: Number, default: 0 },
    isPublished: { type: Boolean, default: true }
  },
  { timestamps: true }
);

// Add pagination plugin
videoSchema.plugin(mongooseAggregatePaginate);

const Video = mongoose.model('Video', videoSchema);
export default Video;
```
Paginating Video Data
```javascript
const options = {
  page: 1,
  limit: 10,
  sort: { createdAt: -1 }
};

const aggregateQuery = Video.aggregate([
  { $match: { isPublished: true } },
  { $sort: { views: -1 } }
]);

const result = await Video.aggregatePaginate(aggregateQuery, options);
console.log(result);
```
Output Example
```javascript
{
  "docs": [ /* paginated video results */ ],
  "totalDocs": 5000,
  "limit": 10,
  "totalPages": 500,
  "page": 1,
  "hasPrevPage": false,
  "hasNextPage": true,
  "prevPage": null,
  "nextPage": 2
}
```
Best Practices for Pagination
✔ Always sort results (`$sort`) for consistency.
✔ Use indexed fields (`views`, `isPublished`, `createdAt`) for faster queries.
✔ Choose an optimal page size (e.g., 10, 20, 50).
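For intuition, the bookkeeping fields in the output above are simple arithmetic over `totalDocs`, `page`, and `limit`. A rough sketch of what the plugin computes (the real plugin also derives `totalDocs` by counting results of the aggregation itself):

```javascript
// Hypothetical helper mirroring the plugin's pagination metadata.
function paginate(totalDocs, page, limit) {
  const totalPages = Math.ceil(totalDocs / limit);
  return {
    skip: (page - 1) * limit, // fed into a $skip stage before $limit
    totalPages,
    hasPrevPage: page > 1,
    hasNextPage: page < totalPages,
    prevPage: page > 1 ? page - 1 : null,
    nextPage: page < totalPages ? page + 1 : null
  };
}

console.log(paginate(5000, 1, 10));
// → { skip: 0, totalPages: 500, hasPrevPage: false,
//     hasNextPage: true, prevPage: null, nextPage: 2 }
```

This matches the sample output shown earlier (5000 documents, page 1, limit 10).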
Introduction
Handling file uploads is a crucial part of many web applications, from uploading profile pictures to handling video uploads. However, storing files efficiently and securely requires a robust system.
In this guide, we will cover:
✅ Multer – Handling file uploads in Node.js.
✅ Cloudinary – Storing files in the cloud instead of your local system.
✅ Best practices for efficient and secure file uploads.
Let’s deep dive into the implementation and see how we can upload, process, and manage files effectively.
1. Why Use Multer for File Uploads?
Multer is a middleware for handling `multipart/form-data`, which is primarily used for file uploads in Express.js applications. Unlike raw body-parsing middleware like `express.json()`, Multer processes files efficiently and stores them on the disk or in memory before further processing.
Installing Multer
```shell
npm install multer
```
2. Setting Up Multer for File Storage
We need a storage engine for storing uploaded files before processing them. Multer provides two storage options:
- `diskStorage` (storing files on the server's disk)
- `memoryStorage` (storing files in RAM for temporary processing)
Here’s how we configure Multer to store files locally:
```javascript
import multer from 'multer';

// Define storage location and filename strategy
const storage = multer.diskStorage({
  destination: function (req, file, cb) {
    cb(null, '../../public/temp'); // Files will be temporarily stored here
  },
  filename: function (req, file, cb) {
    cb(null, file.originalname); // Keep the original file name
  }
});

// Initialize multer with storage configuration
export const upload = multer({ storage });
```
How It Works
✅ `destination`: Defines where the file should be stored.
✅ `filename`: Ensures that the file retains its original name.
⚠️ Why Not Store Files on a Local Server?
Storing files locally can overload your server, especially for large applications.
Instead, files should be offloaded to a cloud storage service like Cloudinary, AWS S3, or Firebase Storage.
3. Cloudinary: Cloud Storage for Images & Videos
Why Use Cloudinary?
Cloudinary is a powerful cloud storage solution that provides:
✅ Secure file hosting (no manual server management).
✅ Automatic optimization (resizing, compression, and CDN delivery).
✅ Fast global access (delivered via CDN).
Installing Cloudinary SDK
```shell
npm install cloudinary
```
4. Configuring Cloudinary in Node.js
To interact with Cloudinary, we must first configure it with API credentials.
```javascript
import { v2 as cloudinary } from 'cloudinary';
import fs from 'fs';

// Cloudinary configuration
cloudinary.config({
  cloud_name: process.env.CLOUDINARY_CLOUD_NAME,
  api_key: process.env.CLOUDINARY_API_KEY,
  api_secret: process.env.CLOUDINARY_API_SECRET
});
```
Environment Variables (`.env` file)
```
CLOUDINARY_CLOUD_NAME=your_cloud_name
CLOUDINARY_API_KEY=your_api_key
CLOUDINARY_API_SECRET=your_api_secret
```
5. Uploading Files to Cloudinary
Once a file is uploaded using Multer, we move it from the local storage to Cloudinary and delete the local copy.
```javascript
const uploadOnCloudinary = async (localFilePath) => {
  try {
    if (!localFilePath) return null; // Ensure a file is provided

    const response = await cloudinary.uploader.upload(localFilePath, {
      resource_type: 'auto' // Auto-detect file type (image, video, etc.)
    });
    console.log(`File uploaded to Cloudinary: ${response.url}`);

    // Delete the local file after successful upload
    fs.unlinkSync(localFilePath);
    return response;
  } catch (error) {
    console.error('Cloudinary upload failed:', error);
    // Delete the file even if the upload fails (prevent temp file accumulation);
    // guard the unlink so a missing file doesn't throw a second error
    if (fs.existsSync(localFilePath)) fs.unlinkSync(localFilePath);
    return null;
  }
};

export { uploadOnCloudinary };
```
Breaking It Down
✅ `cloudinary.uploader.upload()` – Uploads the file.
✅ `resource_type: 'auto'` – Automatically detects if the file is an image, video, or document.
✅ `fs.unlinkSync(localFilePath)` – Deletes the local file to free up server space.
6. Integrating Multer with Cloudinary for File Upload
Here’s how we integrate everything in an Express.js route:
```javascript
import express from 'express';
import { upload } from './multerConfig.js';
import { uploadOnCloudinary } from './cloudinaryConfig.js';

const router = express.Router();

router.post('/upload', upload.single('file'), async (req, res) => {
  try {
    if (!req.file) return res.status(400).json({ message: "No file uploaded!" });

    // Upload file to Cloudinary
    const cloudinaryResponse = await uploadOnCloudinary(req.file.path);
    if (!cloudinaryResponse) {
      return res.status(500).json({ message: "Cloudinary upload failed!" });
    }

    res.status(200).json({
      message: "File uploaded successfully!",
      fileUrl: cloudinaryResponse.secure_url
    });
  } catch (error) {
    res.status(500).json({ message: "Server error", error });
  }
});

export default router;
```
Key Features
✅ `upload.single('file')` – Handles single-file uploads.
✅ `req.file.path` – Retrieves the file path stored by Multer.
✅ `uploadOnCloudinary(req.file.path)` – Moves the file to Cloudinary.
✅ Returns `secure_url` – The Cloudinary URL of the uploaded file.
7. Best Practices for File Uploads
1️⃣ Secure File Uploads
✔ Restrict file types (e.g., only allow `.png`, `.jpg`, `.mp4`).
✔ Limit file size to prevent denial-of-service attacks and storage abuse.
```javascript
const upload = multer({
  storage,
  limits: { fileSize: 5 * 1024 * 1024 }, // Limit 5MB
  fileFilter: (req, file, cb) => {
    const allowedTypes = ["image/png", "image/jpeg", "video/mp4"];
    if (!allowedTypes.includes(file.mimetype)) {
      return cb(new Error("Only .png, .jpg and .mp4 files are allowed!"), false);
    }
    cb(null, true);
  }
});
```
2️⃣ Use Cloud Storage Instead of Local Storage
✔ Cloudinary, AWS S3, or Firebase Storage are better than local storage.
✔ Prevents server overload and ensures scalability.
3️⃣ Delete Temporary Files After Upload
✔ Prevents disk storage from filling up.
✔ Use `fs.unlinkSync()` after every upload.
The HyperText Transfer Protocol (HTTP) is the foundation of web communication. Whether you're a backend developer, frontend engineer, or API architect, understanding HTTP deeply is critical for building performant, secure, and scalable applications.
This guide covers:
✅ Headers & Metadata (request, response, caching, authentication).
✅ State Management (cookies, sessions, tokens).
✅ Security Best Practices (CORS, COOP, COEP, CSP, XSS protection).
✅ HTTP Methods & Status Codes in detail.
1. HTTP Headers & Metadata
HTTP headers allow clients (browsers, APIs) and servers to exchange metadata. Headers define:
✔ Content type (JSON, HTML, XML).
✔ Authentication & authorization.
✔ Caching strategies.
✔ Security policies.
Types of HTTP Headers
1️⃣ Request Headers – Sent by the client (browser, API).
2️⃣ Response Headers – Sent by the server in response.
3️⃣ Representation Headers – Describe the body content (e.g., JSON, HTML).
4️⃣ Payload Headers – Handle encoding, compression, and size.
2. Most Common HTTP Headers
✔ Accept
📌 Tells the server what format the client expects.
`Accept: application/json`
💡 The server will return JSON. Other possible values:
- `Accept: text/html` (for web pages)
- `Accept: image/png` (for images)
✔ User-Agent
📌 Identifies the client (browser, API, or bot).
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64)
💡 Useful for analytics, debugging, and bot detection.
✔ Cookie
📌 Sends session data from the client to the server.
Cookie: session_id=abc123; theme=dark
💡 Used for stateful authentication & session tracking.
✔ Cache-Control
📌 Defines caching rules for browsers and proxies.
`Cache-Control: no-cache, no-store, must-revalidate`
💡 Common values:
- `no-cache` → Revalidate with the server before using a cached copy.
- `max-age=3600` → Cache for 1 hour.
- `private` → Only cache in the user's browser, not shared proxies.
✔ Authorization
📌 Sends authentication credentials (Bearer Tokens, Basic Auth, API Keys).
Authorization: Bearer eyJhbGciOiJIUzI1NiIsInR5cCI6...
💡 Used for JWT authentication in APIs.
3. CORS (Cross-Origin Resource Sharing)
By default, browsers block requests from different origins for security. CORS headers allow or restrict access.
✔ Allowing Cross-Origin Requests
```
Access-Control-Allow-Origin: https://example.com
Access-Control-Allow-Methods: GET, POST, PUT
Access-Control-Allow-Credentials: true
```
💡 Ensures security while enabling controlled cross-origin access.
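Conceptually, the server just decides which CORS headers to attach based on the request's `Origin`. A minimal sketch (the allowlist and origins here are hypothetical; in Express you would typically reach for the `cors` middleware instead):

```javascript
// Compute CORS response headers from an origin allowlist.
function corsHeaders(requestOrigin, allowlist) {
  if (!allowlist.includes(requestOrigin)) {
    return {}; // origin not allowed: send no CORS headers, browser blocks it
  }
  return {
    'Access-Control-Allow-Origin': requestOrigin,
    'Access-Control-Allow-Methods': 'GET, POST, PUT',
    'Access-Control-Allow-Credentials': 'true'
  };
}

console.log(corsHeaders('https://example.com', ['https://example.com']));
console.log(corsHeaders('https://evil.example', ['https://example.com'])); // → {}
```

Echoing back the specific allowed origin (rather than `*`) is required whenever `Allow-Credentials: true` is set.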
4. HTTP Security Headers
Protect your web app against attacks (XSS, clickjacking, injection) with security headers.
✔ COOP (Cross-Origin Opener Policy)
📌 Prevents cross-origin access to `window.opener`.
Cross-Origin-Opener-Policy: same-origin
💡 Prevents cross-origin attacks like tab hijacking.
✔ COEP (Cross-Origin Embedder Policy)
📌 Blocks insecure cross-origin resources.
Cross-Origin-Embedder-Policy: require-corp
💡 Prevents loading resources from untrusted origins.
✔ CSP (Content Security Policy)
📌 Blocks XSS attacks by restricting scripts, styles, and resources.
Content-Security-Policy: default-src 'self'; script-src 'self' https://trusted.com
💡 Ensures scripts & styles only load from trusted sources.
✔ XSS Protection
📌 Prevents Cross-Site Scripting (XSS).
X-XSS-Protection: 1; mode=block
💡 Browsers block malicious scripts injected into the page. Note that modern browsers have largely retired this header in favor of CSP.
5. Managing State in HTTP
HTTP is stateless, meaning each request is independent. To persist user state, we use:
✔ Cookies – Store small amounts of data in the browser.
✔ Sessions – Store session data on the server.
✔ JWT Tokens – Secure authentication without server-side session storage.
✔ Cookies Example
Set-Cookie: session_id=abc123; HttpOnly; Secure; SameSite=Strict
💡 `HttpOnly` prevents JavaScript access (security against XSS).
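For intuition, the `Cookie` request header is just `key=value` pairs separated by semicolons. A sketch of server-side parsing (frameworks like Express do this for you via `cookie-parser`):

```javascript
// Parse a raw Cookie header string into a plain object.
function parseCookies(header) {
  return Object.fromEntries(
    header.split(';').map((pair) => {
      const [key, ...rest] = pair.trim().split('=');
      return [key, rest.join('=')]; // rejoin so '=' inside values survives
    })
  );
}

console.log(parseCookies('session_id=abc123; theme=dark'));
// → { session_id: 'abc123', theme: 'dark' }
```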
6. HTTP Methods (In Depth)
HTTP defines methods (verbs) for different types of requests.
✔ GET (Retrieve Data)
GET /users HTTP/1.1
💡 Never modify data using GET.
✔ POST (Create Data)
```
POST /users HTTP/1.1
Content-Type: application/json

{
  "name": "John Doe",
  "email": "john@example.com"
}
```
💡 Used for form submissions, API requests, user registration.
✔ PUT (Update Data)
```
PUT /users HTTP/1.1
Content-Type: application/json

{
  "name": "Updated Name"
}
```
💡 Replaces the entire resource.
✔ PATCH (Modify Data)
```
PATCH /users HTTP/1.1
Content-Type: application/json

{
  "email": "new@example.com"
}
```
💡 Partially updates a resource.
✔ DELETE (Remove Data)
DELETE /users/123 HTTP/1.1
💡 Deletes a user from the system.
7. HTTP Status Codes (In Depth)
1XX – Informational
- `100 Continue` – Request received, continue sending.

2XX – Success
- `200 OK` – Successful request.
- `201 Created` – Resource successfully created.
- `204 No Content` – Request succeeded, but no response body.

3XX – Redirection
- `301 Moved Permanently` – Use the new URL for future requests.
- `302 Found` – Temporary redirection.
- `304 Not Modified` – Use cached response instead.

4XX – Client Errors
- `400 Bad Request` – Invalid request format.
- `401 Unauthorized` – Missing or invalid authentication.
- `403 Forbidden` – Access denied.
- `404 Not Found` – Resource doesn’t exist.

5XX – Server Errors
- `500 Internal Server Error` – Something broke on the server.
- `502 Bad Gateway` – Server acting as a gateway failed.
- `503 Service Unavailable` – Server is overloaded or down.
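The classes above are defined purely by the first digit of the code, which is easy to express in code (a small sketch):

```javascript
// Map an HTTP status code to its class.
function statusClass(code) {
  if (code >= 100 && code < 200) return 'informational';
  if (code >= 200 && code < 300) return 'success';
  if (code >= 300 && code < 400) return 'redirection';
  if (code >= 400 && code < 500) return 'client error';
  if (code >= 500 && code < 600) return 'server error';
  throw new RangeError(`not an HTTP status code: ${code}`);
}

console.log(statusClass(201)); // success
console.log(statusClass(304)); // redirection
console.log(statusClass(404)); // client error
```

Remembering the class is often enough to reason about who is at fault: 4XX means the client sent something wrong, 5XX means the server failed.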
Conclusion
🚀 Key Takeaways:
✔ Headers define metadata (caching, authentication, security).
✔ CORS controls cross-origin access.
✔ Security headers prevent XSS, injection, and clickjacking.
✔ State management (cookies, sessions, JWT) ensures user persistence.
✔ Proper use of HTTP methods & status codes makes APIs intuitive.
🚀 Multer: Handles file uploads in Express.js.
🚀 Cloudinary: Stores images, videos, and documents in the cloud.
🚀 Best Practices: Secure file types, limit size, and clean temp files.
🔥 bcrypt – For secure password hashing.
🔥 JWT – For user authentication.
🔥 mongoose-aggregate-paginate-v2 – For efficient pagination.
These are must-know tools for any Node.js developer building scalable applications. By following best practices, you can ensure your application is secure, performant, and maintainable.
Written by Madhur