Mastering LRU Cache in Python – With Real-Life Analogies & Clean Code!

📚 Introduction
Imagine you’re a librarian with limited shelf space. Every time a new book arrives and the shelf is full, you remove the least recently read book to make room.
That’s exactly what an LRU (Least Recently Used) Cache does — and in this blog, you’ll learn how to implement one in Python using just a few lines of code.
🌀 What is an LRU Cache?
An LRU Cache is a data structure that stores a limited number of items. When the cache is full and a new item needs to be added:
- It removes the least recently used item (the one with the oldest access).
- This keeps memory usage bounded while the most relevant data stays ready.
🛠️ Use Case in Real Life
Example: Browser Tabs
Think of browser tabs as a cache:
- Your browser keeps a few websites active in memory.
- If memory is full, the browser discards the tab you haven’t visited in the longest time.
- This is LRU in action.
🔧 Let’s Build It in Python
We’ll use Python’s `OrderedDict`, which keeps track of insertion order and makes the LRU logic super clean.
✅ Code:
```python
from collections import OrderedDict

class LRUCache:
    def __init__(self, capacity: int):
        self.cache = OrderedDict()
        self.capacity = capacity

    def get(self, key: int) -> int:
        if key not in self.cache:
            return -1
        # Move to end to mark as recently used
        self.cache.move_to_end(key)
        return self.cache[key]

    def put(self, key: int, value: int) -> None:
        if key in self.cache:
            # Update existing key and mark it as recently used
            self.cache.move_to_end(key)
        self.cache[key] = value
        if len(self.cache) > self.capacity:
            # Remove the least recently used item (the first key)
            self.cache.popitem(last=False)
```
👨‍💻 Example Walkthrough
```python
lru = LRUCache(2)

lru.put(1, 'A')      # Cache: {1: 'A'}
lru.put(2, 'B')      # Cache: {1: 'A', 2: 'B'}
print(lru.get(1))    # Access 1 → Cache: {2: 'B', 1: 'A'}, Output: A
lru.put(3, 'C')      # Evicts 2 → Cache: {1: 'A', 3: 'C'}
print(lru.get(2))    # 2 was evicted → Output: -1
```
🧾 Output:
```
A
-1
```
🎯 How It Works
| Step | Action | Cache State |
| --- | --- | --- |
| `put(1, 'A')` | Add key 1 | `{1: 'A'}` |
| `put(2, 'B')` | Add key 2 | `{1: 'A', 2: 'B'}` |
| `get(1)` | Use key 1 | `{2: 'B', 1: 'A'}` |
| `put(3, 'C')` | Evict LRU (2) | `{1: 'A', 3: 'C'}` |
| `get(2)` | Not in cache | `-1` |
✅ Why `OrderedDict`?
- `move_to_end(key)`: Marks the key as most recently used.
- `popitem(last=False)`: Removes the first inserted key (i.e., the least recently used).
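If you haven’t used these two methods before, here’s a tiny standalone demo (separate from the `LRUCache` class above) showing exactly what each one does:

```python
from collections import OrderedDict

d = OrderedDict()
d['a'] = 1
d['b'] = 2
d['c'] = 3

d.move_to_end('a')       # 'a' is now the most recently used (last) entry
print(list(d.keys()))    # ['b', 'c', 'a']

d.popitem(last=False)    # removes the oldest (first) entry: 'b'
print(list(d.keys()))    # ['c', 'a']
```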
🧠 Where is LRU Cache Used?
- Web browsers (tab/page caching)
- Database query caches
- File systems
- Image and video processing apps
- Mobile app memory management
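In day-to-day Python you often don’t even need to hand-roll the cache: the standard library ships one as the `functools.lru_cache` decorator. Here’s a quick illustration; `slow_square` is just a made-up stand-in for any expensive call:

```python
from functools import lru_cache
import time

@lru_cache(maxsize=2)      # remember results for at most 2 distinct arguments
def slow_square(n: int) -> int:
    time.sleep(1)          # simulate an expensive computation
    return n * n

slow_square(3)             # slow: computed and cached
slow_square(3)             # fast: served straight from the cache
slow_square(4)             # computed and cached; cache now holds 3 and 4
slow_square(5)             # evicts the least recently used entry (3)
print(slow_square.cache_info())  # CacheInfo(hits=..., misses=..., maxsize=2, currsize=2)
```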
🏁 Conclusion
Building an LRU Cache in Python is surprisingly easy using `OrderedDict`. It’s a must-know for coding interviews and useful in real-world applications like caching and memory optimization.
Start with simple examples, then try extending the class: add a `delete()` method, or expire entries after a time limit.
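If you want to try the first of those extensions, here is one possible shape for a `delete()` method on the `LRUCache` class above (a minimal sketch; the name and the True/False return convention are my own choices, not part of the original class):

```python
# Add this method to the LRUCache class defined earlier:
def delete(self, key: int) -> bool:
    """Remove a key from the cache; return True if it was present."""
    if key in self.cache:
        del self.cache[key]
        return True
    return False
```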