Migrating from MongoDB to PostgreSQL: A Developer's Complete Guide

As developers evolve applications from flexible NoSQL setups to structured, scalable relational systems, migrating from MongoDB to PostgreSQL becomes a powerful move. But with great power comes great complexity.
This guide walks through the complete journey of migrating a Node.js app from MongoDB to PostgreSQL, from schema introspection to data normalization, tool selection, and best practices.
Why Migrate from MongoDB to PostgreSQL?
✅ Better support for relational data and constraints
✅ Complex querying with JOINs, views, and indexes
✅ Strict schema enforcement and data integrity
✅ Easier reporting and analytics
✅ Lower cost for some workloads at scale
Challenges in Migrating from MongoDB to PostgreSQL
❌ MongoDB is schemaless; Postgres is schema-strict
❌ Embedded documents and arrays are not 1:1 mappable
❌ No built-in foreign keys in MongoDB means no relation inference
❌ Inconsistent data types across documents (the audit sketch below shows how to measure this)
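Before designing the Postgres schema, it helps to measure how inconsistent each field really is. The sketch below is a minimal audit using the official mongodb driver; the connection string, database, collection, and field names are placeholders for your own:

const { MongoClient } = require('mongodb');

(async () => {
  const client = new MongoClient('mongodb://localhost:27017');
  await client.connect();
  const users = client.db('mydb').collection('users');

  // Count how many documents store each BSON type for the "email" field.
  const typeCounts = await users.aggregate([
    { $group: { _id: { $type: '$email' }, count: { $sum: 1 } } },
    { $sort: { count: -1 } },
  ]).toArray();

  console.log(typeCounts); // e.g. [ { _id: 'string', count: 980 }, { _id: 'missing', count: 20 } ]
  await client.close();
})();

Fields that come back with more than one type will need explicit casting or cleanup before they can map to a single PostgreSQL column.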
High-Level Migration Plan
1. Inventory collections and their structure
2. Introspect the MongoDB schema using Prisma
3. Design the PostgreSQL schema with normalization and relations
4. Migrate the data (via scripts or ETL tools)
5. Update application code
6. Test and deploy
Tools for Migration
| Tool | Purpose | Notes |
| --- | --- | --- |
| Prisma | Introspect MongoDB, generate schema | Requires manual relational fixes |
| Hackolade | Visual modeling & normalization | Paid, great for visual work |
| Airbyte | ETL from MongoDB to PostgreSQL | Good for moving raw data |
| Mongify | Legacy tool for mapping & migration | Not actively maintained |
Using Prisma for MongoDB Introspection
Step 1: Set Up the Project
mkdir mongo-to-postgres && cd mongo-to-postgres
npm init -y
npm install prisma --save-dev
npx prisma init
Step 2: Configure MongoDB as Data Source
generator client {
  provider = "prisma-client-js"
}

datasource db {
  provider = "mongodb"
  url      = env("DATABASE_URL")
}
Update .env:
DATABASE_URL="mongodb+srv://user:pass@cluster.mongodb.net/dbname"
Step 3: Introspect MongoDB
npx prisma db pull
Prisma will generate models from MongoDB collections.
Example generated model:
model User {
  id    String  @id @map("_id") @default(auto()) @db.ObjectId
  name  String?
  email String?

  @@map("users")
}
Step 4: Switch to PostgreSQL
Update schema.prisma:
datasource db {
  provider = "postgresql"
  url      = env("DATABASE_URL")
}
Convert ObjectId fields and composite types to relational-friendly formats, and strip the Mongo-specific attributes:
- Replace @db.ObjectId with @default(uuid()), or make the @id an Int with @default(autoincrement())
- Remove @map("_id") unless you're preserving the original naming
- Adjust types like Json, String?, Int?, etc.
Example after conversion:
model User {
  id    Int    @id @default(autoincrement())
  name  String
  email String @unique
}
Designing the Relational Schema
❌ What Prisma Won't Do:
- Won't create @relation mappings
- Won't normalize embedded objects into new tables
- Won't enforce foreign key constraints
✅ What You Should Do:
- Manually split composite types into separate models
- Use @relation to link models via foreign keys
- Convert MongoDB array fields into child tables (see the sketch below)
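As a concrete example, suppose each MongoDB user document embedded an array of posts. A normalized Prisma schema might split that array into its own table linked back by a foreign key. This is only a sketch: the Post model and its fields are illustrative, not taken from any real collection.

model User {
  id    Int    @id @default(autoincrement())
  name  String
  email String @unique
  posts Post[] // the embedded array becomes a one-to-many relation
}

model Post {
  id       Int     @id @default(autoincrement())
  title    String
  body     String?
  author   User    @relation(fields: [authorId], references: [id])
  authorId Int     // foreign key column enforced by PostgreSQL
}

Running prisma migrate dev after this change creates both tables along with the foreign key constraint.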
6. Create the PostgreSQL DB and Update .env
In .env, update your Postgres DB URL:
DATABASE_URL="postgresql://user:password@localhost:5432/mydb"
Regenerate the Prisma client for the new datasource:
npx prisma generate
7. Push the Schema to PostgreSQL
Create the tables in PostgreSQL using:
npx prisma migrate dev --name init
This creates a migration file and syncs the schema to the database.
Migrating the Data
Option A: Custom Node.js Script with Prisma Client
const { PrismaClient } = require('@prisma/client');
const mongoose = require('mongoose');

const prisma = new PrismaClient();

(async () => {
  // Connect to the source MongoDB database
  await mongoose.connect('mongodb://localhost/mydb');

  // A schemaless model lets us read documents in whatever shape they have
  const UserMongo = mongoose.model('User', new mongoose.Schema({}, { strict: false }));
  const users = await UserMongo.find().lean();

  // Copy each document into the new PostgreSQL table via Prisma
  for (const user of users) {
    await prisma.user.create({
      data: {
        name: user.name,
        email: user.email,
      },
    });
  }

  await mongoose.disconnect();
  await prisma.$disconnect();
})();
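If a collection embeds arrays that you normalized into child tables (like the hypothetical posts example earlier), Prisma's nested create lets you insert the parent row and its children in one call. This is a sketch assuming the relational schema sketched above; the field names are placeholders:

for (const user of users) {
  await prisma.user.create({
    data: {
      name: user.name,
      email: user.email,
      // embedded array -> rows in the child table
      posts: {
        create: (user.posts || []).map((p) => ({
          title: p.title,
          body: p.body,
        })),
      },
    },
  });
}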
Option B: Use Airbyte or Another ETL Tool
Set up the source (MongoDB) and destination (Postgres)
Map collections to tables (basic flattening)
Normalize after import using SQL (a sketch follows below)
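For example, if the ETL landed raw documents in a staging table with a JSONB column, you can normalize them into the relational tables with plain SQL run through Prisma. This is a minimal sketch; the users_raw table and its data column are assumptions, so adjust them to whatever your ETL actually produced:

const { PrismaClient } = require('@prisma/client');
const prisma = new PrismaClient();

(async () => {
  // Copy fields out of the raw JSONB documents into the normalized User table.
  await prisma.$executeRawUnsafe(`
    INSERT INTO "User" (name, email)
    SELECT data->>'name', data->>'email'
    FROM users_raw
    WHERE data->>'email' IS NOT NULL
    ON CONFLICT (email) DO NOTHING
  `);
  await prisma.$disconnect();
})();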
Lessons Learned
MongoDB → PostgreSQL is not a direct mapping
Relations must be designed manually
Prisma helps introspect and define the schema, but it won't normalize it
Tools like Hackolade and Airbyte make parts of the process easier, but none of them is perfect
Final Thoughts
MongoDB gives you freedom; PostgreSQL gives you structure. Migrating between them requires discipline, tools, and domain knowledge. Use Prisma to bootstrap your schema, then layer in normalization and relational structure.
With the right plan, you can scale your backend from flexible MVP to production-grade system, the relational way.
Why Automated MongoDB-to-PostgreSQL Migration Falls Short
No tool today can perfectly normalize and convert MongoDB collections into a relational PostgreSQL schema with accurate relationships.
Why? Because MongoDB is schemaless and lacks foreign key constraints, no tool can safely guess your business logic and data relationships without human help.
That said, here are your best options to get as close as possible to what you're asking:
1. MongoDB Compass + Hackolade (Semi-Automatic Normalization Tool)
Hackolade is a visual data modeling tool made exactly for:
Reverse-engineering NoSQL databases (like MongoDB)
Designing normalized relational schemas for SQL databases (like PostgreSQL)
✅ Pros:
Can connect to your MongoDB
Automatically infers data models
Provides suggested relational mapping
Allows drag-and-drop to design foreign keys and normalization
Can export SQL schema for PostgreSQL
Good for large schemas (30–50+ collections)
🚫 Cons:
It's not free (but a free trial is available)
You still need to understand your domain to finalize the schema
This is the closest to what you're asking for.
2. Airbyte + Manual Schema Conversion
Airbyte (https://airbyte.io/)
Open-source ETL tool
Can migrate data from MongoDB to PostgreSQL
You can set up JSON flattening rules (basic normalization)
But:
It won't infer relations
Just copies documents as-is into flat tables or JSON columns
Use case: If you just want to move raw data into PostgreSQL and normalize it later using SQL or scripts.
3. Prisma + AI-Assisted Refactoring (Best DIY Path)
Use Prisma to introspect MongoDB, then:
Use a tool like ChatGPT or scripts to:
Auto-suggest foreign key relationships
Split embedded objects into separate tables
Generate relational Prisma schemas
Then generate and migrate your schema
This gives you total control plus AI assistance, but it is not fully automatic.
4. Mongify (legacy; no longer maintained, but it works)
https://github.com/anlek/mongify
Older tool to migrate MongoDB to SQL databases
Allows you to define a mapping file where you describe relationships manually
Can normalize some data if configured
Not actively maintained
Still usable for simple use cases
Realistic Strategy (Recommended for Production Teams)
| Step | Tool |
| --- | --- |
| Extract schema from MongoDB | Prisma or Hackolade |
| Visualize & normalize the data model | Hackolade |
| Export the SQL schema | Hackolade, or manually with Prisma |
| Migrate the data | Airbyte or a custom script |
| Maintain long term | Prisma ORM with PostgreSQL |
Want Full Automation? Here's the Truth:
There is no tool that will do all of the following:
Automatically extract and normalize MongoDB
Create relational schemas with perfect foreign keys
Migrate and preserve business logic
Because that requires human domain understanding.