Exploring Graph Databases with Neo4j & Relational Memory in Agents

Today’s GenAI class was a big one — we explored Graph Databases, specifically Neo4j, and how they can be used to give AI agents relational memory.
This might sound fancy, but the idea is simple: instead of AI only remembering “text blobs,” it also remembers relationships between people, places, and ideas.
By the end of today, I could see how agents can reason smarter and connect dots almost like a human brain 🤯.
🔗 Why Graph Databases for AI?
Traditional databases (like SQL) store data in tables. They’re great for structured queries like:
“Show me all students with GPA > 9”
“List all orders placed in the last 7 days”
Vector databases (like Pinecone, Qdrant, Weaviate) came later to handle semantic search — they let AI find text that means the same thing.
But there’s a gap. Vector DBs don’t handle relationships well.
Example
Suppose you store facts:
“Aman knows Piyush”
“Piyush works at ChaiCodeHQ”
“ChaiCodeHQ focuses on GenAI”
A vector DB can find these statements if you search for related terms.
But if you ask:
👉 “Where does Aman’s friend work?”
A plain vector DB struggles. It doesn’t “chain” relationships.
This is where Graph Databases like Neo4j shine.
They store data as nodes (entities) and edges (relationships).
They can run queries like:
MATCH (a:Person {name: "Aman"})-[:KNOWS]->(p:Person)-[:WORKS_AT]->(c:Company)
RETURN c.name;
Which returns:
👉 “ChaiCodeHQ”
That’s the magic. Graph DBs let AI connect the dots.
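To see concretely why "chaining" matters, here's a toy in-memory sketch in plain JavaScript (not Neo4j, just an adjacency map): each fact becomes an edge, and a two-hop lookup answers "where does Aman's friend work?"

```javascript
// Toy graph: each node maps relationship types to target nodes.
const graph = {
  Aman: { KNOWS: ['Piyush'] },
  Piyush: { WORKS_AT: ['ChaiCodeHQ'] },
  ChaiCodeHQ: { FOCUSES_ON: ['GenAI'] },
};

// Follow a chain of relationship types from a starting node.
function traverse(start, relations) {
  let frontier = [start];
  for (const rel of relations) {
    frontier = frontier.flatMap((node) => graph[node]?.[rel] ?? []);
  }
  return frontier;
}

// "Where does Aman's friend work?" → follow KNOWS, then WORKS_AT.
console.log(traverse('Aman', ['KNOWS', 'WORKS_AT'])); // prints [ 'ChaiCodeHQ' ]
```

A vector store has no equivalent of this hop-by-hop traversal; it can only return documents similar to the query text.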
🏏 Indian-Style Examples
Let’s make this fun with examples we can relate to:
Cricket:
“Virat Kohli plays for RCB”
“RCB plays in IPL”
“IPL is held in India”
Query: “Where does Virat play and in which league?”
Graph DB easily answers → RCB → IPL → India.
Bollywood:
“Shah Rukh Khan acted in Pathaan”
“Pathaan was directed by Siddharth Anand”
“Siddharth Anand also directed War”
Query: “Which other movies has SRK’s director made?”
Answer → War.
Daily life:
“Ramesh is friends with Suresh”
“Suresh works at Infosys”
Query: “Where does Ramesh’s friend work?”
Answer → Infosys.
See how natural that feels? This is the human way of thinking.
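The Bollywood question above is a nice two-hop-plus-branch pattern. In Cypher it might look like this (the `Person`/`Movie` labels and `ACTED_IN`/`DIRECTED` relationship names are my assumed schema, not anything fixed by Neo4j):

```cypher
// Which other movies were made by the director of SRK's film?
MATCH (:Person {name: "Shah Rukh Khan"})-[:ACTED_IN]->(m:Movie)
      <-[:DIRECTED]-(d:Person)-[:DIRECTED]->(other:Movie)
WHERE other <> m
RETURN DISTINCT other.title;
```

The `WHERE other <> m` clause excludes Pathaan itself, leaving War.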
🛠️ Setting Up Neo4j with Relational Memory
In class, we integrated Neo4j into a memory system using mem0.
We combined two powerful stores:
Neo4j (Graph Store) → Stores relationships.
Qdrant (Vector Store) → Stores embeddings for semantic recall.
Here’s the setup:
import 'dotenv/config';
import { Memory } from 'mem0ai/oss';
import { OpenAI } from 'openai';

const client = new OpenAI();

const mem = new Memory({
  version: 'v1.1',
  enableGraph: true,
  graphStore: {
    provider: 'neo4j',
    config: {
      url: 'neo4j://localhost:7687',
      username: 'neo4j',
      password: 'your-password-here',
    },
  },
  vectorStore: {
    provider: 'qdrant',
    config: {
      collectionName: 'memories',
      embeddingModelDims: 1536,
      host: 'localhost',
      port: 6333,
    },
  },
});
🧑‍💻 Breaking Down the Code
Memory Setup
We initialize Memory from mem0 with enableGraph: true. This tells the system: store memories in both the graph and vector stores.
Graph Store Config
Provider: Neo4j.
Connection string: neo4j://localhost:7687.
Username & password for login.
Vector Store Config
Provider: Qdrant.
Stores embeddings of each message.
Used for semantic recall.
💬 Chat Function Example
Now let’s see how we handle chat with relational memory:
async function chat(query = '') {
  const memories = await mem.search(query, { userId: 'piyush' });
  const memStr = memories.results.map((e) => e.memory).join('\n');

  const SYSTEM_PROMPT = `
    Context About User:
    ${memStr}
  `;

  const response = await client.chat.completions.create({
    model: 'gpt-4.1-mini',
    messages: [
      { role: 'system', content: SYSTEM_PROMPT },
      { role: 'user', content: query },
    ],
  });

  console.log(`Bot:`, response.choices[0].message.content);

  await mem.add(
    [
      { role: 'user', content: query },
      { role: 'assistant', content: response.choices[0].message.content },
    ],
    { userId: 'piyush' }
  );
}
Step by Step:
Search: Looks up past memories related to the query.
System Prompt: Injects relevant context into GPT.
Generate: GPT responds with enriched context.
Add: Stores the new conversation into both vector + graph DB.
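The search-to-prompt step is just string assembly, so it's easy to isolate and test. Here's a minimal standalone sketch, mocking the shape of a mem0 search result (the `results`/`memory` fields mirror the code above; the memory texts are invented for illustration):

```javascript
// Mocked mem0 search result (shape matches memories.results.map((e) => e.memory) above).
const memories = {
  results: [
    { memory: 'User is friends with Piyush' },
    { memory: 'Piyush works at ChaiCodeHQ' },
  ],
};

// Flatten the recalled memories into a context block for the system prompt.
function buildSystemPrompt(memories) {
  const memStr = memories.results.map((e) => e.memory).join('\n');
  return `Context About User:\n${memStr}`;
}

console.log(buildSystemPrompt(memories));
```

Keeping this logic in a small pure function makes it trivial to verify what context the model actually sees.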
🧠 Why Relational Memory Matters
Think about how humans recall:
Not just “facts” but connections.
If I say “Tell me about your college friend who joined Google,” you instantly trace: Friend → College → Google.
That’s what relational memory does for AI.
Example Query in Hinglish
User: “Mujhe Ramesh ke dost ki company ke baare mein batao.” (“Tell me about the company where Ramesh’s friend works.”)
Graph DB traces: Ramesh → Dost (friend) → Suresh → Works at Infosys.
Answer: “Ramesh ka dost Suresh, Infosys mein kaam karta hai.” (“Ramesh’s friend Suresh works at Infosys.”)
Without a graph DB, this query would likely fail.
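The Hinglish query maps to a single Cypher pattern (the `FRIENDS_WITH`/`WORKS_AT` relationship names and labels are my assumed schema):

```cypher
// "Where does Ramesh's friend work?"
MATCH (r:Person {name: "Ramesh"})-[:FRIENDS_WITH]->(f:Person)-[:WORKS_AT]->(c:Company)
RETURN f.name AS friend, c.name AS company;
```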
📊 Neo4j Basics — Cypher Queries
A quick primer:
Nodes → Entities (Person, Company, Movie).
Edges → Relationships (KNOWS, WORKS_AT, ACTED_IN).
Cypher → Query language for Neo4j.
Examples:
// Find all friends of Aman
MATCH (a:Person {name: "Aman"})-[:KNOWS]->(friends)
RETURN friends;
// Find companies where Aman’s friends work
MATCH (a:Person {name: "Aman"})-[:KNOWS]->(p:Person)-[:WORKS_AT]->(c:Company)
RETURN c.name;
Cypher makes querying intuitive and visual, compared to writing complex SQL joins.
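For reference, the data behind those queries could be loaded with MERGE, which creates a node or relationship only if it doesn't already exist (a sketch; the property and relationship names follow the examples above):

```cypher
// Create (or reuse) the nodes, then connect them.
MERGE (a:Person {name: "Aman"})
MERGE (p:Person {name: "Piyush"})
MERGE (c:Company {name: "ChaiCodeHQ"})
MERGE (a)-[:KNOWS]->(p)
MERGE (p)-[:WORKS_AT]->(c);
```

Using MERGE instead of CREATE keeps the graph free of duplicate nodes when the same fact is stored twice.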
📚 Case Study: AI Travel Assistant
Imagine building an AI travel agent with relational memory:
User says: “Plan a trip for me with my cousin Rahul and his wife.”
Stored relationships:
You → Cousin → Rahul → Married → Neha.
Graph DB helps trace → Rahul + Neha.
AI fetches preferences of Rahul & Neha from vector DB.
Output: A customized family trip plan.
This is where graph + vector memory shine together.
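The family-tracing step could be a single Cypher query (relationship names like `COUSIN_OF` and `MARRIED_TO` are my assumptions for this sketch):

```cypher
// Find the cousin and their spouse for trip planning.
MATCH (u:Person {name: "You"})-[:COUSIN_OF]->(r:Person)-[:MARRIED_TO]->(spouse:Person)
RETURN r.name, spouse.name;
```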
🌍 Future of Graph-Powered AI
Looking ahead, I see graph databases becoming the backbone of AI memory.
Personal AI agents: Know not just your chats, but your relationships, habits, and context.
Enterprise agents: Trace org charts, project dependencies, customer interactions.
Education: AI tutors can recall not just topics you studied, but how topics relate.
It’s like giving AI a mental map of the world, not just a bag of facts.
🚀 Reflection
Day 10 opened my eyes to the next layer of AI reasoning.
Vector DBs are great for recall.
Graph DBs are great for relationships.
Combined, they make agents that think more like humans.
Every class feels like adding another neuron to the AI brain 🧠.
Graph memory might just be the missing link to truly context-aware AI.
Written by Aman Kumar