Time and Space Complexity of Recursion (Part 2)

In Part 1, we talked about loops, Big-O notation, and how to analyze the performance of basic code. Now it’s time to talk about something that makes many DSA beginners sweat a little in interviews: recursion. It’s one of the favourite topics of the tech giants, so obviously, it’s IMPORTANT.
We’ve all been stuck on questions like these:
How do I calculate the time complexity of a recursive function?
Why is recursive Fibonacci considered so inefficient?
How does Merge Sort stay so fast even though it’s recursive?
You’re in the right place. This post is all about understanding recursion, not just how it works, but how to break down its time and space complexity.
What is Recursion?
Many of us already know what recursion is, but here’s a quick recap for beginners. Recursion is when a function calls itself to solve a smaller version of the same problem.
To make recursion work properly, we always need two key parts: a base case, which tells the function when to stop, and a recursive case, which describes how the function keeps going on a smaller input.
Think of it like zooming in on a problem repeatedly until you finally reach a point where the solution is so simple, it’s obvious. Once that base case is hit, the function can start unwinding and combining all the smaller answers into the final solution.
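For example, here’s a minimal sketch in Java (the countDown function is a made-up name, just for illustration) showing both parts in action:

void countDown(int n) {
    if (n == 0) {                  // base case: stop recursing
        System.out.println("Done!");
        return;
    }
    System.out.println(n);         // do a little work...
    countDown(n - 1);              // ...then recurse on a smaller problem
}

Calling countDown(3) prints 3, 2, 1, and then "Done!" once the base case is hit.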
Now, let’s move on to the main topic. We’ll start by computing the time and space complexity for factorial code, which is a basic and good example of recursion.
Example 1: Factorial
The logic for factorial is as follows:
n! = n × (n − 1)!
Jumping into the code, it can be implemented as:
int factorial(int n) {
    if (n == 0)                    // base case: 0! = 1
        return 1;
    return n * factorial(n - 1);   // recursive case: n × (n − 1)!
}
Now, let’s say we call factorial(5). It will lead to other factorial calls:
→ factorial(5)
→ factorial(4)
→ factorial(3)
→ …and so on until factorial(0)
We’re making n + 1 calls, each doing one multiplication, so,
Time Complexity = O(n)
Moving on further, each recursive call sits on the call stack until the one below it finishes. In our example, for factorial(5), it’s six calls sitting on the stack, so,
Space Complexity = O(n)
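If you want to see this stack growth yourself, here’s a hedged sketch: a variant of factorial that carries an extra depth parameter (the depth argument, the printout, and the StackDemo class are additions for illustration, not part of the algorithm itself):

class StackDemo {
    static int factorial(int n, int depth) {
        // indent the trace by call depth so the stack growth is visible
        // (String.repeat needs Java 11+)
        System.out.println("  ".repeat(depth) + "factorial(" + n + ")");
        if (n == 0)
            return 1;
        return n * factorial(n - 1, depth + 1);
    }

    public static void main(String[] args) {
        factorial(5, 0);  // prints six indented lines, one per stack frame
    }
}

Each extra level of indentation in the output is one more frame sitting on the call stack, exactly the O(n) space we just counted.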
Hopefully the time and space calculation for factorial is clear; it’s a good template for how we compute complexities in recursive code.
Things will get even clearer in the sections ahead. Let’s move to our next example, the Fibonacci series, another classic use of recursion.
Example 2: Fibonacci
This question is a classic interview trap, and a go-to example for good reason: it looks innocent, but hides an exponential surprise.
The logic for Fibonacci is as follows:
F(n) = F(n − 1) + F(n − 2)
with base cases: F(0) = 0, F(1) = 1
The code implementation is quite simple:
int fib(int n) {
    if (n <= 1)                          // base cases: fib(0) = 0, fib(1) = 1
        return n;
    return fib(n - 1) + fib(n - 2);      // two recursive calls per invocation
}
Let’s break it down for fib(4):
                  fib(4)
                 /      \
           fib(3)        fib(2)
          /      \       /     \
     fib(2)   fib(1) fib(1)  fib(0)
    /      \
fib(1)  fib(0)
Notice the repeated work: fib(2) is computed twice and fib(1) three times, just for n = 4. Each call branches into two more, so the tree grows fast.
In fact, the total number of calls is roughly 2ⁿ, so,
Time Complexity = O(2ⁿ)
This time complexity is painfully slow for larger values of n; the number of calls roughly doubles every time n increases by one.
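To see the blow-up concretely, here’s a small sketch that counts calls (the calls counter and the FibCount class are instrumentation added for illustration):

class FibCount {
    static long calls = 0;              // total number of fib invocations

    static int fib(int n) {
        calls++;
        if (n <= 1)
            return n;
        return fib(n - 1) + fib(n - 2);
    }

    public static void main(String[] args) {
        fib(30);
        System.out.println(calls);      // prints 2692537: ~2.7 million calls for n = 30
    }
}

Bump n to 50 and the count climbs into the tens of billions, which is why this version never finishes for large inputs.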
That’s why big tech companies look for more efficient solutions that can reduce the time complexity of problems like this, even when the logic starts with recursion.
Talking about space complexity: the call stack only ever holds one root-to-leaf path of the tree at a time, and the deepest path is n calls deep, so,
Space Complexity = O(n)
Just like time complexity, we can also optimize space complexity using Dynamic Programming. The complexity analysis for DP problems will be discussed in Part 3. Stay tuned!
We’ve seen simple recursion and even a tricky one that eats up time like crazy. But what if I told you there’s a recursive algorithm that’s both elegant and efficient? That’s our final stop today: Merge Sort.
Example 3: Merge Sort
Merge Sort is a classic example of the divide and conquer strategy. Instead of tackling the whole problem at once, it:
Divides the array into two halves
Conquers each half by sorting them recursively
Combines the sorted halves into a single sorted array
This method ensures that no matter how messy the input is, each merge step only ever combines already-sorted chunks, making the whole process smooth and efficient.
Let’s see a visual using an example: [5, 3, 8, 4, 2]
        [5, 3, 8, 4, 2]
         /           \
   [5, 3, 8]        [4, 2]
    /      \         /    \
 [5, 3]    [8]     [4]    [2]
  /    \
[5]    [3]
After sorting and merging:
[5] + [3] → [3, 5]
[3, 5] + [8] → [3, 5, 8]
[4] + [2] → [2, 4]
[3, 5, 8] + [2, 4] → [2, 3, 4, 5, 8]
We’ve broken down the concept and walked through the visual. Let’s turn that logic into clean, working code.
void mergeSort(int[] arr, int left, int right) {
    if (left < right) {
        int mid = left + (right - left) / 2;  // midpoint, written this way to avoid overflow
        mergeSort(arr, left, mid);            // sort the left half
        mergeSort(arr, mid + 1, right);       // sort the right half
        merge(arr, left, mid, right);         // combine the two sorted halves
    }
}
Here’s how it works:
Split the array in half (recursively)
Sort the left and right halves
Merge them back together
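The code above leans on a merge helper that isn’t shown. Here’s one common way to write it, as a sketch; it assumes arr[left..mid] and arr[mid+1..right] are already sorted, and its temporary array is the O(n) extra space we’ll talk about below:

void merge(int[] arr, int left, int mid, int right) {
    int[] temp = new int[right - left + 1];   // scratch space for the merged run
    int i = left, j = mid + 1, k = 0;
    while (i <= mid && j <= right)            // take the smaller head of the two halves
        temp[k++] = (arr[i] <= arr[j]) ? arr[i++] : arr[j++];
    while (i <= mid)                          // copy leftovers from the left half
        temp[k++] = arr[i++];
    while (j <= right)                        // ...and from the right half
        temp[k++] = arr[j++];
    for (k = 0; k < temp.length; k++)         // write the merged run back in place
        arr[left + k] = temp[k];
}

With this in place, calling mergeSort(arr, 0, arr.length - 1) on [5, 3, 8, 4, 2] reproduces exactly the merges traced above.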
Think of Merge Sort as a recursion tree.
First, it keeps splitting the array into halves; each split adds a new level, so you end up with about log₂(n) levels of splitting.
At each level, we still need to touch all n elements to merge them back together.
Put that together, and we get the time complexity. For instance, with n = 1,000,000 there are about 20 levels, each touching a million elements, i.e., roughly 20 million operations in total.
Time Complexity = O(n log n)
Let’s discuss space. The merge step needs a temporary array to hold elements, so we require O(n) space for this step. Additionally, the recursive calls stack up about log₂(n) levels deep, so we need O(log n) extra stack space.
When we combine them, the big spender here is the temporary array, so,
Space Complexity = O(n)
How to Analyse Recursive Code
We have gone through the basic patterns of recursive code, but it can still be unclear which approach to follow. Here is a checklist for analyzing recursive code; ask these questions to reach an answer, then try them on the small example after the list.
How many times does the function call itself?
- One call = linear, two calls = exponential, etc.
What work happens in each call?
- A loop? A simple operation?
How deep do the calls go?
- That’s your stack space.
Any extra memory being used?
- Like arrays or hash maps? Add that to space complexity.
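To practice, try running the checklist on this small function, a recursive array sum (a made-up example, not tied to any particular problem):

int sum(int[] a, int i) {
    if (i == a.length)               // base case: walked past the last element
        return 0;
    return a[i] + sum(a, i + 1);     // one recursive call, one addition per call
}

One call per invocation, constant work in each call, and the recursion goes a.length levels deep: O(n) time, O(n) stack space, and no extra memory beyond the stack.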
What’s Coming in Part 3?
In Part 3, we’ll explore how to turn slow recursive solutions into fast and optimized ones using Dynamic Programming.
We’ll discuss some real-world problems like Climbing Stairs, House Robber, and more.
If this post helped you understand recursion better, share it further and subscribe to my newsletter.