Time Complexity and Growth Rate: Understanding Equations and Performance


Introduction
In computer science, understanding how an algorithm performs is crucial. Time complexity helps us measure the efficiency of an algorithm by estimating how much time it takes to run as a function of the input size (n). This article breaks down types of time complexities, their mathematical forms, and why worst-case analysis is usually preferred.
What is Time Complexity?
Time Complexity is the amount of time an algorithm takes to run, depending on the input size. It's a way to analyze how the running time increases as the input grows.
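As a quick illustration, consider summing an array (a minimal sketch; `sumArray` is a name introduced here for the example): the number of additions grows in direct proportion to the input size, so doubling n roughly doubles the running time.

```cpp
#include <iostream>
using namespace std;

// One addition per element → the work grows linearly with n.
int sumArray(int arr[], int n) {
    int sum = 0;
    for (int i = 0; i < n; i++) { // runs n times
        sum += arr[i];            // 1 step per iteration
    }
    return sum;
}

int main() {
    int arr[] = {2, 4, 6, 8};
    cout << sumArray(arr, 4) << endl; // prints 20
    return 0;
}
```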
Rate of Growth and Types of Equations
Different algorithms grow at different rates. Here are the most common types of time complexities with examples:
| Time Complexity | Name | Example Equation | Growth Rate |
| --- | --- | --- | --- |
| O(1) | Constant Time | T(n) = 5 | Flat |
| O(log n) | Logarithmic Time | T(n) = log n | Very slow growth |
| O(n) | Linear Time | T(n) = n | Straight line |
| O(n log n) | Linearithmic Time | T(n) = n log n | Common in sorting |
| O(n^2) | Quadratic Time | T(n) = n^2 | Nested loops |
| O(n^3) | Cubic Time | T(n) = n^3 | 3-level loops |
| O(2^n) | Exponential Time | T(n) = 2^n | Explodes fast |
| O(n!) | Factorial Time | T(n) = n! | Grows fastest of all |
Most Dominating Factor
When analyzing equations like:
T(n) = 5n^2 + 3n + 8
the most dominating term is the one that grows the fastest as n increases. Here, n^2 dominates. So:
T(n) = O(n^2)
We ignore constants and lower-order terms in Big-O.
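To see why the lower-order terms stop mattering, here is a small sketch (using the T(n) above) that compares the full equation against the n^2 term alone; the ratio settles toward the constant factor 5 as n grows:

```cpp
#include <iostream>
using namespace std;

// Compare T(n) = 5n^2 + 3n + 8 with its dominant term n^2.
int main() {
    for (long long n = 10; n <= 1000000; n *= 100) {
        long long full = 5 * n * n + 3 * n + 8;
        long long dominant = n * n;
        cout << "n = " << n
             << "  T(n) / n^2 = " << (double)full / dominant << endl;
    }
    return 0;
}
```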
Best Case, Average Case, Worst Case
| Case | Description | Example (Linear Search) |
| --- | --- | --- |
| Best Case | Minimum time required (ideal input) | Element found at index 0 |
| Average Case | Expected time over typical inputs | Element somewhere in the middle |
| Worst Case | Maximum time taken in the worst scenario | Element at the end or not present |
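Since the table uses linear search as its running example, here is a minimal linear search sketch so the three cases are concrete; the target's position decides how many iterations run:

```cpp
#include <iostream>
using namespace std;

// Linear search: scan left to right until the target is found.
// Best case: target at index 0 (1 comparison).
// Worst case: target at the end or absent (n comparisons).
int linearSearch(int arr[], int n, int target) {
    for (int i = 0; i < n; i++) {
        if (arr[i] == target) return i; // found
    }
    return -1; // not present
}

int main() {
    int arr[] = {4, 7, 1, 9, 3};
    cout << linearSearch(arr, 5, 4) << endl; // best case: prints 0
    cout << linearSearch(arr, 5, 8) << endl; // worst case: prints -1
    return 0;
}
```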
Why prefer Worst Case?
- Guarantees performance regardless of input.
- Safe for mission-critical applications.
- Helps ensure the algorithm doesn't fail under pressure.
Time Complexity Notations
Big-O (O) → Upper bound; usually quoted for the worst case.
Omega (Ω) → Lower bound; usually quoted for the best case.
Theta (Θ) → Tight bound, used when the upper and lower bounds match.
Graph: Time vs Input Size
Plotting the time taken T(n) on the Y-axis against the input size n on the X-axis, the curves rank as follows.
Growth (in order):
O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n)
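Without the plot itself, a quick sketch that tabulates these functions for a few values of n still makes the ordering visible:

```cpp
#include <cmath>
#include <iostream>
using namespace std;

// Evaluate each growth function at a few input sizes.
// The jump between rows shows how quickly each one explodes.
int main() {
    for (double n : {10.0, 20.0, 40.0}) {
        cout << "n=" << n
             << "  log n=" << log2(n)
             << "  n log n=" << n * log2(n)
             << "  n^2=" << n * n
             << "  n^3=" << n * n * n
             << "  2^n=" << pow(2.0, n) << "\n";
    }
    return 0;
}
```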
Code Examples
🔹 Constant Time – O(1)
```cpp
// Always a single array access, regardless of the array's size.
int getFirstElement(int arr[]) {
    return arr[0];
}
```
🔹 Linear Time – O(n)
```cpp
#include <iostream>
using namespace std;

int main() {
    int a = 0, b = 1, c = 2, n;   // Constant time → 4 steps
    a = b + c;                    // Constant time → 2 steps
    cin >> n;                     // Constant time → 1 step
    for (int i = 1; i < n; i++) { // Loop runs (n - 1) times
        a++;                      // Each iteration → 1 step
    }
    return 0;                     // Constant time → 1 step
}
```
🧠 Step-by-Step Time Complexity Analysis
Constant Operations Before Loop
- Variable declarations: 4 steps
- Assignment: 2 steps
- Input: 1 step
🔸 Total: O(1)
Loop Analysis
- `for(int i = 1; i < n; i++)`: the loop header performs 2 operations per iteration → 2 × (n - 1)
- `a++`: runs (n - 1) times → 1 × (n - 1)
🔸 Total: O(n)
Return Statement
- Single constant step → O(1)
🧮 Time Equation
f(n) = 4 (decl.) + 2 (assign.) + 1 (input)
     + 2(n - 1) (loop header) + 1(n - 1) (a++)
     + 1 (return)
f(n) = 4 + 2 + 1 + 2(n - 1) + (n - 1) + 1
     = 3n + 5
Here, 3n is the dominant term, and the constant +5 is negligible as n grows large.
🎯 Final Time Complexity
We write:
f(n) = O(n) → upper bound
f(n) = Ω(n) → lower bound
Because the loop runs (n - 1) times for every input of size n, the two bounds match, giving f(n) = Θ(n).
Hence, this program has linear time complexity: 👉 O(n)
🔹 Quadratic Time – O(n^2)
```cpp
#include <iostream>
using namespace std;

int main() {
    int a = 0, b = 1, c = 2, n;        // → 4 steps
    cin >> n;                          // → 1 step
    for (int i = 0; i < n; i++) {      // → 2n steps
        for (int j = 1; j < n; j++) {  // → ≈2n^2 steps in total
            a = b + c;                 // → ≈2n^2 steps in total
        }
    }
    cout << a;                         // → 1 step
    return 0;                          // → 1 step
}
```
🧠 Step-by-Step Time Complexity Analysis
Outer Loop:
- `for(int i = 0; i < n; i++)` → runs n times → header cost ≈ 2n
Inner Loop:
- `for(int j = 1; j < n; j++)` → runs ≈ n times for each i → total ≈ 2n^2
Inside Inner Loop:
- `a = b + c;` → basic operation executed ≈ n^2 times → ≈ 2n^2
Other Statements:
- Declarations → 4
- `cin >> n` → 1
- `cout << a` → 1
- `return 0` → 1
🧮 Total Time Equation
f(n) = 4 + 1 + 2n + 2n^2 + 2n^2 + 1 + 1
     = 4n^2 + 2n + 7
The n^2 term dominates, so the constants and the 2n term are dropped.
🎯 Final Time Complexity
f(n) = O(n^2)
This shows quadratic growth as input size increases.
🔹 Logarithmic Time – O(log n)
```cpp
#include <iostream>
using namespace std;

int main() {
    int a = 0, b = 0, c = 1, n;
    cin >> n;                            // read the input size
    a = b + c;
    for (int i = 1; i <= n; i = i * 2) { // i doubles each iteration
        b++;
    }
    return 0;
}
```
This loop runs for i = 1, 2, 4, 8, 16, ... up to n. Each iteration i doubles, so the loop executes approximately log₂(n) times.
From the math: the loop stops at the first k for which 2^k > n, so the number of iterations is k ≈ log₂(n).
🎯 Final Time Complexity
f(n) = O(log n)
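The doubling loop above is synthetic; the classic real algorithm with the same O(log n) behaviour is binary search (a minimal sketch, assuming a sorted array):

```cpp
#include <iostream>
using namespace std;

// Binary search: the search range halves on every step,
// so at most ≈ log₂(n) comparisons are needed.
int binarySearch(int arr[], int n, int target) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;        // midpoint without overflow
        if (arr[mid] == target) return mid;  // found
        if (arr[mid] < target) lo = mid + 1; // discard left half
        else hi = mid - 1;                   // discard right half
    }
    return -1; // not found
}

int main() {
    int arr[] = {1, 3, 4, 7, 9, 11};
    cout << binarySearch(arr, 6, 7) << endl; // prints 3
    return 0;
}
```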
🔹 Linearithmic Time – O(n log n)
```cpp
#include <iostream>
using namespace std;

int main() {
    int a = 0, b = 1, c = 0, n;
    cin >> n;
    a = b + c;
    for (int i = 0; i < n; i = i + 2) {     // outer: ≈ n/2 iterations
        for (int j = 1; j < n; j = j * 2) { // inner: ≈ log₂(n) iterations
            c++;
        }
    }
    return 0;
}
```
- Outer loop: runs n/2 times (since `i += 2`) → O(n)
- Inner loop: runs log₂(n) times per outer pass (since `j *= 2`) → O(log n)
Multiplying the two gives (n/2) × log₂(n) iterations in total.
🎯 Final Time Complexity
O(n log n)
🔹 Cubic Time – O(n^3)
```cpp
#include <iostream>
using namespace std;

// Three nested loops, each running n times → n × n × n = n^3 prints.
void printTriplets(int arr[], int n) {
    for (int i = 0; i < n; i++) {
        for (int j = 0; j < n; j++) {
            for (int k = 0; k < n; k++) {
                cout << arr[i] << "," << arr[j] << "," << arr[k] << endl;
            }
        }
    }
}
```
🔹 Exponential Time – O(2^n)
```cpp
// Each call spawns two further calls, so the recursion tree has
// on the order of 2^n nodes → exponential running time.
int fibonacci(int n) {
    if (n <= 1) return n;
    return fibonacci(n - 1) + fibonacci(n - 2);
}
```
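The running time satisfies the recurrence T(n) = T(n - 1) + T(n - 2) + O(1). Its solution grows exponentially; O(2^n) is the standard loose upper bound (the true growth rate is about 1.618^n, the golden ratio), which is why the naive recursive Fibonacci becomes unusable even for moderate inputs such as n = 50.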
Conclusion
Time complexity helps you analyze how efficiently your code performs. While short code may look good, understanding its performance for large inputs is critical. Prefer worst-case analysis to ensure reliability, and always be aware of the growth rate of your algorithm.