Recursion (DSA - 9)

Madhav Ganesan
8 min read

Recursion is a fundamental programming concept where a function calls itself until a specified base condition is met. While it is a powerful tool for solving problems, it's essential to understand the various time complexities associated with recursive solutions and how to optimize them.

What is Recursion?

Recursion occurs when a function calls itself directly or indirectly to solve a problem. Each recursive call should bring the problem closer to a base case, which stops the recursion.


Base Condition

The base condition is a crucial part of a recursive function. It defines the scenario under which the function stops calling itself, preventing infinite loops and potential stack overflows.

Trick to find base condition:

  1. Assume the array contains only one element, i.e., the first element

  2. Decide how that single value should be handled

Parameterized Recursion

In parameterized recursion, the running result is carried along as an extra parameter, so the answer is already complete when the base case is reached.

// Sum of first N numbers
#include <iostream>
using namespace std;

int nsum(int n, int sum) {
    if (n < 1) {
        return sum;
    }
    return nsum(n - 1, sum + n);
}

int main() {
    int value = nsum(4, 0);
    cout << value << endl; // Output: 10
    return 0;
}

Functional Recursion

In functional recursion, each call returns the result of its subproblem, and the final answer is assembled as the calls return.

// Sum of first N numbers
#include <iostream>
using namespace std;

int nsum(int n) {
    if (n == 0) {
        return 0;
    }
    return n + nsum(n - 1);
}

int main() {
    int value = nsum(4);
    cout << value << endl; // Output: 10
    return 0;
}

Time Complexity in Recursion

Exponential Time Complexity

Certain recursive algorithms, such as the naive Fibonacci implementation, exhibit exponential time complexity because overlapping subproblems are recomputed many times.

// Exponential Time Complexity: O(2^n)
int fibonacciRecursive(int n) {
    if (n <= 1) return n;
    return fibonacciRecursive(n - 1) + fibonacciRecursive(n - 2);
}

// Time Complexity: O(2^n)
// Space Complexity: O(n) (due to recursion stack)

Polynomial Time Complexity

By using optimization techniques like memoization, we can reduce the time complexity of recursive algorithms to polynomial time.

#include <iostream>
#include <vector>
using namespace std;

// Polynomial Time Complexity: O(n)
int fibonacciMemoized(int n, vector<int>& memo) {
    if (n <= 1) return n;
    if (memo[n] != -1) return memo[n];
    memo[n] = fibonacciMemoized(n - 1, memo) + fibonacciMemoized(n - 2, memo);
    return memo[n];
}

int main() {
    int n = 10;
    vector<int> memo(n + 1, -1);
    cout << "Fibonacci(" << n << ") = " << fibonacciMemoized(n, memo) << endl;
    return 0;
}

Logarithmic Time Complexity

Binary search is an example of an algorithm with logarithmic time complexity.

#include <iostream>
#include <vector>
using namespace std;

// Logarithmic Time Complexity: O(log n)
int binarySearch(vector<int>& arr, int left, int right, int target) {
    if (left > right) return -1;
    int mid = left + (right - left) / 2;
    if (arr[mid] == target) return mid;
    if (arr[mid] > target) return binarySearch(arr, left, mid - 1, target);
    return binarySearch(arr, mid + 1, right, target);
}

// Time Complexity: O(log n)
// Space Complexity: O(log n) (due to recursion stack)

Linear Time Complexity

Depth-First Search (DFS) on a graph is an example of a linear time complexity algorithm.

#include <iostream>
#include <vector>
using namespace std;

// Linear Time Complexity: O(V + E)
void dfs(vector<vector<int>>& graph, int node, vector<bool>& visited) {
    visited[node] = true;
    for (int neighbor : graph[node]) {
        if (!visited[neighbor]) {
            dfs(graph, neighbor, visited);
        }
    }
}

// Time Complexity: O(V + E)
// Space Complexity: O(V) (for the recursion stack and visited array)

Is Recursion Always Exponential?

No, recursion is not always exponential. The time complexity of recursion depends on the problem and how the recursion is implemented. For example, binary search has logarithmic complexity, while naive Fibonacci has exponential complexity. Optimization techniques like memoization, dynamic programming, and divide-and-conquer can significantly improve performance and reduce time complexity.

Recursion and Subsequences

Recursion is often used for subsequence-related questions because it naturally handles the branching required to generate all possible subsequences. However, iterative approaches or dynamic programming might be more efficient in certain cases.

Segmentation Fault and Stack Overflow

Recursion, if not properly managed, can lead to a stack overflow or segmentation fault. These errors occur when the recursion depth exceeds the stack size limit, because every pending recursive call occupies a stack frame until it returns.

Segmentation Fault

A segmentation fault happens when a program tries to access a memory location that it is not allowed to access. In the context of recursion, this can happen when recursive calls exhaust the stack memory allocated to the program.

Stack Overflow

A stack overflow occurs when the call stack pointer exceeds the stack bound. This is common in recursive functions without a proper base condition or with very deep recursion.

How to tell whether a problem can be solved with dynamic programming

  1. The problem statement asks something like "count all possible ways" or "find the best one"

  2. Try to represent the problem in terms of an index

  3. Try all possible choices at that index according to the problem statement

Different Types of Optimizations for Recursion Problems

1. Memoization (Top-Down)

Store results of expensive function calls and reuse them when the same inputs occur again. This works when the problem has overlapping subproblems.

Example: Fibonacci sequence

#include <iostream>
#include <vector>

using namespace std;

int fibonacciMemoized(int n, vector<int>& memo) {
    if (n <= 1) return n;
    // 2) Return the cached value if it has already been computed
    if (memo[n] != -1) return memo[n];
    // 3) Otherwise compute it and store it in the memo table
    memo[n] = fibonacciMemoized(n - 1, memo) + fibonacciMemoized(n - 2, memo);
    return memo[n];
}

int main() {
    int n = 10;
    // 1) Declare a memo vector of size n + 1, initialized to -1
    vector<int> memo(n + 1, -1);
    cout << "Fibonacci(" << n << ") = " << fibonacciMemoized(n, memo) << endl;
    return 0;
}

Time Complexity: O(n) Space Complexity: O(n) for the memo table + O(n) for the recursion stack

2. Dynamic Programming (Bottom-Up)

Convert the recursive solution to an iterative one, building up solutions to subproblems iteratively.

Example: Fibonacci sequence

#include <iostream>
#include <vector>

using namespace std;

int fibonacciDP(int n) {
    if (n <= 1) return n;
    vector<int> dp(n + 1);
    dp[0] = 0;
    dp[1] = 1;
    for (int i = 2; i <= n; ++i) {
        dp[i] = dp[i - 1] + dp[i - 2];
    }
    return dp[n];
}

int main() {
    int n = 10;
    cout << "Fibonacci(" << n << ") = " << fibonacciDP(n) << endl;
    return 0;
}

Time Complexity: O(n) Space Complexity: O(n)

3. Space-Optimized Code

Example: Fibonacci Series

#include <iostream>

using namespace std;

void fibonacciSequence(int n) {
    if (n <= 0) return;
    if (n == 1) {
        cout << "0" << endl;
        return;
    }

    int a = 0, b = 1, c;
    cout << a << " " << b;

    for (int i = 2; i < n; ++i) {
        c = a + b;
        cout << " " << c;
        a = b;
        b = c;
    }

    cout << endl;
}

int main() {
    int n;
    cout << "Enter the number of terms: ";
    cin >> n;
    fibonacciSequence(n);
    return 0;
}

Time Complexity: O(n) Space Complexity: O(1)

4. Divide and Conquer

Break the problem into subproblems, solve each subproblem recursively, and combine their solutions to solve the original problem.

Example: Merge Sort

#include <iostream>
#include <vector>

using namespace std;

void merge(vector<int>& arr, int l, int m, int r) {
    int n1 = m - l + 1;
    int n2 = r - m;
    vector<int> L(n1), R(n2);
    for (int i = 0; i < n1; ++i) L[i] = arr[l + i];
    for (int i = 0; i < n2; ++i) R[i] = arr[m + 1 + i];
    int i = 0, j = 0, k = l;
    while (i < n1 && j < n2) {
        if (L[i] <= R[j]) arr[k++] = L[i++];
        else arr[k++] = R[j++];
    }
    while (i < n1) arr[k++] = L[i++];
    while (j < n2) arr[k++] = R[j++];
}

void mergeSort(vector<int>& arr, int l, int r) {
    if (l >= r) return;
    int m = l + (r - l) / 2;
    mergeSort(arr, l, m);
    mergeSort(arr, m + 1, r);
    merge(arr, l, m, r);
}

int main() {
    vector<int> arr = {38, 27, 43, 3, 9, 82, 10};
    mergeSort(arr, 0, arr.size() - 1);
    for (int x : arr) cout << x << " ";
    cout << endl;
    return 0;
}

Time Complexity: O(n log n) Space Complexity: O(n)

Take/Not Take

At each index, the recursion branches into two calls: one that takes the current element and one that skips it. This is the standard way to generate all subsequences, and it also drives DP problems such as House Robber:

int rec(vector<int>& nums, int n, vector<int>& dp) {
    if (n == 0) return nums[0];
    if (n < 0) return 0;

    if (dp[n] != -1) return dp[n];

    // Take the current house (skip the adjacent one) or skip this house
    int take = nums[n] + rec(nums, n - 2, dp);
    int notTake = rec(nums, n - 1, dp);
    return dp[n] = max(take, notTake);
}

int rob(vector<int>& nums) {
    vector<int> dp(nums.size(), -1);
    return rec(nums, nums.size() - 1, dp);
}

Knapsack

int knapSack(int W, int wt[], int val[], int n)
{

    // Base Case
    if (n == 0 || W == 0)
        return 0;

    // If weight of the nth item is more
    // than Knapsack capacity W, then
    // this item cannot be included
    // in the optimal solution
    if (wt[n - 1] > W)
        return knapSack(W, wt, val, n - 1);

    // Return the maximum of two cases:
    // (1) nth item included
    // (2) not included
    else
        return max(
            val[n - 1]
                + knapSack(W - wt[n - 1], wt, val, n - 1),
            knapSack(W, wt, val, n - 1));
}

Stay Connected! If you enjoyed this post, don’t forget to follow me on social media for more updates and insights:

Twitter: madhavganesan

Instagram: madhavganesan

LinkedIn: madhavganesan
