Algorithmic Complexity
The "complexity" of an algorithm describes how much time or memory (space) it needs to complete its task as the size of the input grows.
Two main types of complexity:
Time complexity
Measures how the execution time of an algorithm grows as the input size increases. It is usually expressed in "Big O" notation, which describes the worst-case scenario.
Time complexity is commonly estimated by counting the number of elementary operations performed by the algorithm, supposing that each elementary operation takes a fixed amount of time to perform.
O(1): Constant time – the algorithm takes the same amount of time no matter how big the input is.
- Example: Accessing an array element by its index.
O(log n): Logarithmic time – the time grows slowly even as the input increases. It happens when you can "cut the problem in half" each step.
- Example: Binary search in a sorted list.
O(n): Linear time – the time grows proportionally to the input size.
- Example: Looping through a list to find an item.
O(n²): Quadratic time – the time grows with the square of the input size. It happens when you need to compare all items with each other.
- Example: Bubble sort or selection sort.
O(2^n): Exponential time – the time doubles with each additional input element. This is extremely slow for large inputs.
- Example: Solving the Towers of Hanoi puzzle.
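To make the O(log n) case above concrete, here is a minimal binary search sketch in JavaScript (the function name and sample values are just for illustration):

```javascript
// Binary search: halves the remaining range each step, so it runs in O(log n).
function binarySearch(sorted, target) {
  let low = 0;
  let high = sorted.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (sorted[mid] === target) return mid;   // found it
    if (sorted[mid] < target) low = mid + 1;  // discard the left half
    else high = mid - 1;                      // discard the right half
  }
  return -1; // not found
}
```

For example, `binarySearch([1, 3, 5, 7, 9], 7)` returns index 3 after inspecting only two elements, rather than scanning the whole list.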
Space Complexity
Space complexity measures how much memory an algorithm uses while running. Like time complexity, it is also measured using "Big O" notation.
For example:
O(1): The algorithm uses a constant amount of memory, regardless of input size.
O(n): The memory usage grows linearly with the input size.
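As a rough illustration of the difference (the function names here are made up for this sketch), compare a sum that keeps only a single accumulator with a function that builds a whole new array:

```javascript
// O(1) space: one accumulator variable, no matter how long the array is.
function sumInPlace(numbers) {
  let total = 0;
  for (const x of numbers) total += x;
  return total;
}

// O(n) space: builds a reversed copy whose size grows with the input.
function reversedCopy(numbers) {
  const out = [];
  for (let i = numbers.length - 1; i >= 0; i--) out.push(numbers[i]);
  return out;
}
```

Both functions take O(n) time, but only the second one needs O(n) extra memory.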
How to Calculate Complexity
Step 1: Identify basic operations (like comparisons or loops).
Step 2: Count how many times those operations run as the input grows.
Step 3: Use Big O notation to generalize the time or space complexity.
for (let i = 0; i < n; i++) {
// some operation
}
This loop runs n times, so its time complexity is O(n).
If you have nested loops (a loop inside a loop):
for (let i = 0; i < n; i++) {
for (let j = 0; j < n; j++) {
// some operation
}
}
Here, the outer loop runs n times, and for each of those iterations the inner loop also runs n times. So, the time complexity is O(n²).
How to Optimize Complexity
Use more efficient algorithms: Sometimes, there are better algorithms for the same task.
For example, instead of using bubble sort (O(n²)), you can use merge sort (O(n log n)) or quicksort (O(n log n) on average).
Avoid unnecessary computations: Cache results of expensive computations so you don't have to recompute them. This is called memoization.
Reduce the size of the input: If possible, reduce the input size using techniques like divide and conquer (splitting a big problem into smaller ones) or binary search.
Avoid deep nesting of loops: Try to simplify code with multiple nested loops (which can result in O(n²) or higher complexity). Rewriting the logic to have fewer loops can make it more efficient.
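As a sketch of the memoization idea mentioned above (the cache and helper names are illustrative), here is a Fibonacci function that stores results it has already computed instead of recomputing them:

```javascript
// Naive recursive fib(n) is O(2^n); caching each result drops it to O(n).
function makeMemoFib() {
  const cache = new Map();
  function fib(n) {
    if (n <= 1) return n;
    if (cache.has(n)) return cache.get(n); // reuse a cached result
    const result = fib(n - 1) + fib(n - 2);
    cache.set(n, result); // remember it for next time
    return result;
  }
  return fib;
}
```

With `const fib = makeMemoFib();`, a call like `fib(40)` finishes almost instantly, while the uncached version would make hundreds of millions of recursive calls.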
In Summary
Complexity describes how long an algorithm takes (time complexity) and how much space it uses (space complexity).
The goal is to make sure that as your input grows, your algorithm doesn't become too slow or use too much memory.
Big O notation is used to describe the "worst-case scenario" for time or space as the input grows.
Optimization involves choosing better algorithms, reducing input size, and avoiding unnecessary operations.