Time Complexity and Growth Rate: Understanding Equations and Performance

Amit Kesarwani

Introduction

In computer science, understanding how an algorithm performs is crucial. Time complexity helps us measure the efficiency of an algorithm by estimating how much time it takes to run as a function of the input size (n). This article breaks down the common types of time complexity, their mathematical forms, and why worst-case analysis is usually preferred.


What is Time Complexity?

Time Complexity is the amount of time an algorithm takes to run, depending on the input size. It's a way to analyze how the running time increases as the input grows.


Rate of Growth and Types of Equations

Different algorithms grow at different rates. Here are the most common types of time complexities with examples:

| Time Complexity | Name | Example Equation | Growth Rate |
| --- | --- | --- | --- |
| O(1) | Constant Time | T(n) = 5 | Flat |
| O(log n) | Logarithmic Time | T(n) = log n | Very slow growth |
| O(n) | Linear Time | T(n) = n | Straight line |
| O(n log n) | Linearithmic Time | T(n) = n log n | Common in sorting |
| O(n^2) | Quadratic Time | T(n) = n^2 | Nested loops |
| O(n^3) | Cubic Time | T(n) = n^3 | 3-level loops |
| O(2^n) | Exponential Time | T(n) = 2^n | Explodes fast |
| O(n!) | Factorial Time | T(n) = n! | Fastest growth of all |

The Dominating Term

When analyzing equations like:

T(n) = 5n^2 + 3n + 8

The dominating term is the one that grows fastest as n increases. Here, that is n^2.

So:

T(n) = O(n^2)

We ignore constants and lower-order terms in Big-O.
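For example, at n = 100 the dominant term contributes 5 · 100^2 = 50,000 steps, while the lower-order terms contribute only 3 · 100 + 8 = 308, which is under 1% of the total. As n grows, n^2 accounts for essentially all of the running time, which is why the constants and smaller terms are dropped.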


Best Case, Average Case, Worst Case

| Case | Description | Example (Linear Search) |
| --- | --- | --- |
| Best Case | Minimum time required (ideal input) | Element found at index 0 |
| Average Case | Expected time over typical inputs | Element is somewhere in the middle |
| Worst Case | Maximum time taken in the worst scenario | Element not present, or at the last index |
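
A minimal linear-search sketch makes the three cases concrete (the function name and array contents here are just for illustration):

#include <iostream>
using namespace std;

// Linear search: scan left to right until the target is found.
// Best case:  target at index 0      → 1 comparison
// Worst case: target absent / at end → n comparisons
int linearSearch(int arr[], int n, int target) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == target) return i;   // found at index i
    }
    return -1;                            // not present: the worst case
}

int main() {
    int arr[] = {4, 8, 15, 16, 23, 42};
    cout << linearSearch(arr, 6, 4) << endl;    // best case: found at index 0
    cout << linearSearch(arr, 6, 99) << endl;   // worst case: returns -1
    return 0;
}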

Why prefer Worst Case?

  • Guarantees an upper bound on running time, regardless of input.

  • Safe for mission-critical applications, where a slow response can be as harmful as a wrong one.

  • Avoids over-optimistic estimates: an algorithm that is fast on average can still be unacceptably slow on specific inputs.


Time Complexity Notations

  1. Big-O (O) → upper bound: the running time grows no faster than this. Most often quoted for the worst case.

  2. Omega (Ω) → lower bound: the running time grows at least this fast. Often associated with the best case.

  3. Theta (Θ) → tight bound: the upper and lower bounds match, pinning down the exact growth rate.
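
Formally, for functions f and g (where c and n₀ are constants):

f(n) = O(g(n)) if f(n) ≤ c · g(n) for some c > 0 and all n ≥ n₀
f(n) = Ω(g(n)) if f(n) ≥ c · g(n) for some c > 0 and all n ≥ n₀
f(n) = Θ(g(n)) if both of the above hold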


Graph: Time vs Input Size

  • X-axis: Input size n

  • Y-axis: Time taken T(n)

Growth (in order):

O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
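
Plugging in small values of n shows how quickly the curves separate:

| T(n) | n = 10 | n = 20 |
| --- | --- | --- |
| 1 | 1 | 1 |
| log₂ n | ≈ 3.3 | ≈ 4.3 |
| n | 10 | 20 |
| n log₂ n | ≈ 33 | ≈ 86 |
| n^2 | 100 | 400 |
| n^3 | 1,000 | 8,000 |
| 2^n | 1,024 | 1,048,576 |
| n! | 3,628,800 | ≈ 2.4 × 10^18 |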

Code Examples

🔹 Constant Time – O(1)

int getFirstElement(int arr[]) {
    return arr[0];   // one array access, regardless of the array's size
}

🔹 Linear Time – O(n)

#include <iostream>
using namespace std;

int main() {
    int a, b = 1, c = 2, n;   // Declarations → 4 steps
    a = b + c;                // Assignment → 2 steps
    cin >> n;                 // Input → 1 step

    for(int i = 1; i < n; i++) {  // Loop header runs (n - 1) times
        a++;                      // Each iteration → 1 step
    }

    return 0;          // Constant time → 1 step
}

🧠 Step-by-Step Time Complexity Analysis

  1. Constant Operations Before Loop

    • Variable declarations: 4 steps

    • Assignment: 2 steps

    • Input: 1 step
      🔸 Total: O(1)

  2. Loop Analysis

    • for(int i = 1; i < n; i++): loop header has 2 operations per iteration → 2 × (n - 1)

    • a++: runs (n - 1) times → 1 × (n - 1)
      🔸 Total: O(n)

  3. Return Statement

    • Single constant step → O(1)

🧮 Time Equation

f(n) = 4 (decl.) + 2 (assign.) + 1 (input)
     + 2(n - 1) (loop header) + 1(n - 1) (a++)
     + 1 (return)

f(n) = 4 + 2 + 1 + 2(n - 1) + (n - 1) + 1
     = 3n + 5

Here, 3n is the dominant term, and the constant +5 is negligible as n grows large.

🎯 Final Time Complexity

We write:

f(n) = O(n)   → upper bound
f(n) = Ω(n)   → lower bound
f(n) = Θ(n)   → tight bound (the loop always runs n - 1 times, whatever the input)

Hence, this program has linear time complexity: 👉 O(n)


🔹 Quadratic Time – O(n^2)

#include <iostream>
using namespace std;

int main() {
    int a, b = 1, c = 2, n;           // → 4 steps
    cin >> n;                         // → 1 step

    for (int i = 0; i < n; i++)       // → 2n steps
    {
        for (int j = 1; j < n; j++)   // → 2n^2 steps in total
        {
            a = b + c;                // → 2n^2 steps in total
        }
    }

    cout << a;               // → 1 step
    return 0;                // → 1 step
}

🧠 Step-by-Step Time Complexity Analysis

  1. Outer Loop: for(int i = 0; i < n; i++) → runs n times → header cost ≈ 2n

  2. Inner Loop: for(int j = 1; j < n; j++) → runs ≈ n times for each i → header cost ≈ 2n^2

  3. Inside Inner Loop: a = b + c; → 2 steps per execution, executed ≈ n^2 times → ≈ 2n^2

  4. Other Statements:

    • Declarations → 4

    • Input → 1

    • cout << a → 1

    • return 0 → 1

🧮 Total Time Equation

f(n) = 4 + 1 + 2n + 2n^2 + 2n^2 + 1 + 1
     = 4n^2 + 2n + 7

🎯 Final Time Complexity

f(n) = O(n^2)

This shows quadratic growth as input size increases.


🔹 Logarithmic Time – O(log n)

#include <iostream>
using namespace std;

int main() {
    int b = 0, n;
    cin >> n;

    for(int i = 1; i <= n; i = i * 2) {   // i doubles every iteration
        b++;
    }
    return 0;
}

This loop runs for values: 1, 2, 4, 8, 16... up to n.

Each time i doubles, so the loop executes approximately log₂(n) times.

The loop stops after k doublings, when i = 2^k first exceeds n:

2^k > n  ⟹  k ≈ log₂(n)
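
For example, with n = 16 the loop runs for i = 1, 2, 4, 8, 16: five iterations, which is log₂(16) + 1.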

🎯 Final Time Complexity

f(n) = O(log n)

🔹 Linearithmic Time – O(n log n)

#include <iostream>
using namespace std;

int main() {
    int c = 0, n;
    cin >> n;

    for (int i = 0; i < n; i = i + 2) {       // outer: i steps by 2
        for (int j = 1; j < n; j = j * 2) {   // inner: j doubles
            c++;
        }
    }
    return 0;
}

  • Outer loop: runs n/2 times (since i += 2) → O(n)

  • Inner loop: runs log n times (since j *= 2) → O(log n)
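
🧮 Total Time Equation

The inner loop's ≈ log₂(n) iterations run once for each of the outer loop's ≈ n/2 iterations:

f(n) ≈ (n / 2) · log₂(n) → O(n log n)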

🎯 Final Time Complexity

O(n log n)

🔹 Cubic Time – O(n^3)

#include <iostream>
using namespace std;

void printTriplets(int arr[], int n) {
    // Three nested loops, each running n times → the body executes n^3 times
    for(int i = 0; i < n; i++) {
        for(int j = 0; j < n; j++) {
            for(int k = 0; k < n; k++) {
                cout << arr[i] << "," << arr[j] << "," << arr[k] << endl;
            }
        }
    }
}
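
Each of the three loops runs n times, so the print statement executes n × n × n = n^3 times; even for n = 10 that is already 1,000 lines of output.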

🔹 Exponential Time – O(2^n)

int fibonacci(int n) {
    if(n <= 1) return n;                        // base cases: fib(0) = 0, fib(1) = 1
    return fibonacci(n-1) + fibonacci(n-2);     // two recursive calls per invocation
}
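
Every call branches into two more calls, so the call tree roughly doubles at each level, and the tree is up to n levels deep:

T(n) = T(n - 1) + T(n - 2) + 1 ≤ 2 · T(n - 1) + 1 ⟹ T(n) = O(2^n)

(The tight bound is actually O(φ^n) with φ ≈ 1.618, but O(2^n) is the standard upper bound quoted for the naive recursive Fibonacci.)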

Conclusion

Time complexity lets you analyze how efficiently your code performs. Short code can still be slow: what matters is how the running time behaves as inputs grow large. Prefer worst-case analysis to guarantee reliability, and always be aware of your algorithm's growth rate.
