Demystifying Recursion: A Simple Guide

Mehar Mayank
4 min read

Recursion is a fundamental programming concept, and a fascinating one: it often seems a bit magical at first.

What Exactly is Recursion?

Recursion is a programming technique where a function calls itself. Think of it like a set of Russian nesting dolls, where each doll contains a smaller version of itself. In programming, a recursive function solves a problem by breaking it down into smaller, similar sub-problems until it reaches a basic case that can be solved directly.

Consider the simple code given below:

#include <iostream>
using namespace std;

int fun() {
    return 0;
}

int main() {
    cout << "Hello";
    fun(); // fun() is a function call.
    return 0;
}

Here, main calls fun. In recursion, fun would call fun itself.
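
As a minimal sketch of what that looks like (countdown is just an illustrative name, not from any library), here is a function that calls itself with a smaller input until a base condition stops it:

#include <iostream>
using namespace std;

// Prints n, n-1, ..., 1 by calling itself with a smaller value each time.
void countdown(int n) {
    if (n == 0) {
        return; // base condition: stop when we reach 0
    }
    cout << n << " ";
    countdown(n - 1); // the function calls itself
}

int main() {
    countdown(5); // prints: 5 4 3 2 1
    return 0;
}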

Understanding Factorial with Recursion

Let's take the classic example of calculating the factorial of a number. The factorial of a non-negative integer 'n', denoted by n!, is the product of all positive integers less than or equal to 'n'. For example, 5! = 5 × 4 × 3 × 2 × 1 = 120.

We can observe a pattern:

  • 5! = 5 × 4!

  • 4! = 4 × 3!

  • 3! = 3 × 2!

  • 2! = 2 × 1!

  • 1! = 1

Notice how each factorial is defined in terms of a smaller factorial. This is the perfect scenario for recursion!

Iterative Approach (for comparison)

Before diving into recursive factorial, let's briefly look at how you might calculate factorial iteratively using a loop:

int fact(int n) {
    int f = 1;
    for (int i = 2; i <= n; i++) {
        f = f * i; // This multiplies f by i in each iteration
    }
    return f;
}

This iterative solution calculates factorial in O(n) time complexity.

Recursive Approach for Factorial

Now, let's translate the factorial pattern into a recursive function:

int fact(int n) {
    // Base Condition: When does the recursion stop?
    if (n <= 1) {
        return 1; // The factorial of 0 and 1 is 1
    }
    // Recursive Step: How do we break down the problem?
    return n * fact(n - 1);
}

Let's trace fact(4):

  1. fact(4) calls 4 * fact(3)

  2. fact(3) calls 3 * fact(2)

  3. fact(2) calls 2 * fact(1)

  4. fact(1) hits the base condition (n <= 1) and returns 1

  5. Now, the calls unwind:

    • fact(2) gets 1 from fact(1), so it returns 2 * 1 = 2

    • fact(3) gets 2 from fact(2), so it returns 3 * 2 = 6

    • fact(4) gets 6 from fact(3), so it returns 4 * 6 = 24

The time complexity for this recursive factorial is also O(n).
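
To double-check the trace, a tiny driver can print the result (this assumes the fact function above and #include <iostream>):

int main() {
    cout << fact(4) << endl; // prints 24, matching the trace above
    return 0;
}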

The Importance of a Base Condition

Every recursive function must have a Base Condition. This is the stopping point for the recursion. Without it, the function would call itself indefinitely, leading to a stack overflow error. Think of it as the smallest doll in the nesting doll set – it doesn't contain another doll.
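
To see why, here is a deliberately broken sketch (the name broken is just for illustration): with no base condition, the calls never stop and the program eventually crashes.

int broken(int n) {
    // No base condition: broken(5) calls broken(4), broken(3), ...
    // Each call adds a stack frame until the call stack runs out
    // of space and the program crashes with a stack overflow.
    return n * broken(n - 1);
}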

Drawbacks of Recursion

While elegant, recursion isn't always the most efficient solution:

  1. More Memory: Each recursive call adds a new stack frame to the call stack. This can consume a significant amount of memory, especially for deep recursions.

  2. More Time: The overhead of function calls (pushing and popping stack frames) can make recursive solutions slower than their iterative counterparts in some cases.

Another Application: Fibonacci Series

The Fibonacci series is another classic example where recursion shines. The series starts with 0 and 1, and each subsequent number is the sum of the two preceding ones:

0, 1, 1, 2, 3, 5, …

The mathematical definition is:

  • F(0)=0

  • F(1)=1

  • F(n) = F(n−1) + F(n−2) for n > 1

Let's look at F(5):

F(5) = F(4) + F(3), and F(4) = F(3) + F(2), so F(3) is already needed twice.

The recursive function for Fibonacci would look like this:

int fib(int n) {
    if (n == 0) {
        return 0; // Base condition 1
    }
    if (n == 1) {
        return 1; // Base condition 2
    }
    return fib(n - 1) + fib(n - 2);
}

However, the recursive Fibonacci solution has a significant drawback: its time complexity is exponential, O(2^n). This is because many subproblems are recalculated multiple times (e.g., fib(2) is calculated several times while computing fib(5)). This leads to a lot of redundant work and is less efficient than an iterative approach or a recursive approach with memoization, sketched below.
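
As a rough sketch of that memoized approach (the fibMemo helper and the memo vector are just illustrative, not part of the original code), the idea is to store each fib(n) the first time it is computed so it is never recalculated:

#include <vector>
using namespace std;

// memo[n] caches fib(n); -1 means "not computed yet".
int fibMemo(int n, vector<int>& memo) {
    if (n == 0) return 0; // Base condition 1
    if (n == 1) return 1; // Base condition 2
    if (memo[n] != -1) {
        return memo[n]; // reuse the stored result instead of recursing again
    }
    memo[n] = fibMemo(n - 1, memo) + fibMemo(n - 2, memo);
    return memo[n];
}

int fib(int n) {
    vector<int> memo(n + 1, -1);
    return fibMemo(n, memo); // each fib(k) is now computed only once
}

With the cache in place, every value from fib(0) to fib(n) is computed exactly once, bringing the time complexity down to O(n).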

Where Recursion Shines

Despite the drawbacks, recursion is incredibly powerful and elegant for certain problems:

  • Tree and Graph Traversal: Algorithms like Depth-First Search (DFS) are naturally recursive (see the sketch after this list).

  • Divide and Conquer Algorithms: Merge Sort, Quick Sort, etc., are inherently recursive.

  • Backtracking: Solving problems by trying different paths and undoing choices (like solving a maze or Sudoku).

  • Simplifying Complex Logic: Sometimes, a recursive solution can be much cleaner and easier to understand than an iterative one, even if it's less efficient.
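
For example, here is a minimal sketch of a recursive depth-first traversal of a small binary tree (the Node struct and the sample values are just for illustration):

#include <iostream>
using namespace std;

struct Node {
    int value;
    Node* left;
    Node* right;
};

// Visit the current node, then recurse into each subtree (DFS).
void dfs(Node* node) {
    if (node == nullptr) {
        return; // base condition: an empty subtree
    }
    cout << node->value << " ";
    dfs(node->left);
    dfs(node->right);
}

int main() {
    // A tiny tree:  1
    //              / \
    //             2   3
    Node left{2, nullptr, nullptr};
    Node right{3, nullptr, nullptr};
    Node root{1, &left, &right};
    dfs(&root); // prints: 1 2 3
    return 0;
}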

