ALGORITHMS: Implementing the Quick Sort Algorithm in JavaScript


Introduction
Sorting algorithms are foundational in computer science because they enable efficient data organization and retrieval. Quick sort is one of the most widely used of these algorithms thanks to its ability to handle large datasets effectively. It leverages the divide-and-conquer strategy: it chooses a pivot element from the array and partitions the remaining elements into two subarrays according to whether they are less than or greater than the pivot. The pivot ends up in its final sorted position, and the same process is applied recursively to both subarrays. This approach gives an average time complexity of O(n log n), faster than simpler sorting algorithms like Selection Sort in most real-world scenarios. However, its performance can degrade to O(n²) in the worst case.
Prerequisites
To follow this article, you need a solid grasp of fundamental programming concepts such as functions, loops, conditionals, and basic array operations. Familiarity with recursion, including the concepts of base and recursive cases, is essential since Quick Sort relies heavily on recursive calls. Understanding the Divide and Conquer paradigm is also important, as it involves breaking problems into smaller, easier-to-solve subproblems, solving them independently, and combining the results. Lastly, a good understanding of data structures, particularly arrays, will help you follow the partitioning and recursive sorting operations.
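If recursion is still new to you, here is a minimal sketch of a recursive function with a clear base case and recursive case (the sumArray name is purely illustrative), written in JavaScript since that is the language used throughout this article:

// Illustrative example: recursively sum an array.
const sumArray = (array) => {
  // Base case: an empty array sums to 0.
  if (array.length === 0) {
    return 0;
  }
  // Recursive case: first element + sum of the rest of the array.
  return array[0] + sumArray(array.slice(1));
};

console.log(sumArray([2, 4, 6])); // 12

Quick Sort follows the same pattern: a base case that stops the recursion and a recursive case that works on smaller pieces of the input.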
How the Quick Sort Algorithm Works
Pick a Pivot: The pivot is the value used to partition the array. You can pick the first element, the last element, or a random element of the array. Picking a random element is advisable because your choice of pivot affects the time complexity of the algorithm (the common pivot choices are sketched in the code example after this list).
Partitioning: In this step, the algorithm rearranges the array so that all elements smaller than the pivot end up to its left and all larger elements to its right. This produces two subarrays, one holding the elements smaller than or equal to the pivot and the other holding the elements greater than it, and leaves the pivot in its final sorted position.
Recursive Sorting: In this step, the partition process is applied recursively to the left subarray (elements smaller than the pivot) and the right subarray (elements larger than the pivot). The subarrays keep getting broken down until the base case is reached (a subarray of length 1 or 0), at which point they are already sorted.
Combining: Quick Sort doesn't explicitly combine subarrays; as the recursive calls return, the pivot positions and subarrays are already in the correct order. Each level of the recursion produces [...smaller, pivot, ...larger], so the array becomes sorted as the recursion unwinds.
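As a quick illustration of the pivot choices described in step 1, here is a small sketch of the three common strategies (the helper names are purely illustrative):

// Illustrative pivot-selection strategies.
const firstElementPivot = (array) => array[0];
const lastElementPivot = (array) => array[array.length - 1];
const randomPivot = (array) => array[Math.floor(Math.random() * array.length)];

const example = [8, 3, 1, 7, 0, 10, 2];
console.log(firstElementPivot(example)); // 8
console.log(lastElementPivot(example));  // 2
console.log(randomPivot(example));       // a random element, e.g. 7

The implementation later in this article uses the last element as the pivot for simplicity.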
Algorithm Implementation
Pseudocode for the Quick Sort Algorithm
FUNCTION quick_sort(array):
    // Base case: arrays with 0 or 1 element are already sorted
    IF length(array) <= 1:
        RETURN array

    // Choose the pivot (last element)
    pivot = LAST element of array
    left = EMPTY ARRAY
    right = EMPTY ARRAY

    FOR each element in array EXCEPT pivot:
        IF element <= pivot:
            ADD element TO left
        ELSE:
            ADD element TO right

    RETURN quick_sort(left) + [pivot] + quick_sort(right)
Explanation
Base Case: If the array has no elements or just one element, it is already sorted.
Partitioning: Choose the last element as the pivot, then separate the elements into two arrays:
Left array: This will store elements smaller than or equal to the pivot.
Right array: This will store elements greater than the pivot.
Recursive Sorting: Recursively apply the same partitioning process to the left and right arrays.
Combine: Concatenate the sorted left array, pivot, and sorted right array to form the final sorted array.
Quick Sort Algorithm Implementation in JavaScript
const quicksort = (array) => {
  // Base case: arrays with 0 or 1 element are already sorted
  if (array.length <= 1) {
    return array;
  }

  // Choose the pivot (last element)
  const pivot = array[array.length - 1];
  const left = [];
  const right = [];

  // Partition every element except the pivot into left or right
  for (let i = 0; i < array.length - 1; i++) {
    if (array[i] <= pivot) {
      left.push(array[i]);
    } else {
      right.push(array[i]);
    }
  }

  // Recursively sort both partitions and place the pivot between them
  return [...quicksort(left), pivot, ...quicksort(right)];
};
const data = [8, 3, 1, 7, 0, 10, 2];
console.log("Sorted Array:", quicksort(data)); // Sorted Array: [0, 1, 2, 3, 7, 8, 10]
Visualizing Quick Sort
Example Array:
[8, 3, 1, 7, 0, 10, 2]
Step 1: Choose the Pivot
- Select the last element as the pivot:
array[array.length - 1] = 2
Step 2: Partition the Array:
Elements less than or equal to the pivot will be stored in the left array while the elements greater than the pivot will go to the right.
Left: [1, 0]
Right: [8, 3, 7, 10]
Pivot: 2
Now the array becomes: [1, 0] [2] [8, 3, 7, 10]
Step 3: Recursively Sort the Left and Right Subarrays
Sorting the left subarray [1, 0]
Pivot: 0
Partitioning results in:
Left: [] | Right: [1]
Sorted left subarray:
[...left, pivot, ...right] = [0, 1]
Sorting the right subarray [8, 3, 7, 10]
Pivot: 10
Partitioning results in:
Left: [8, 3, 7] | Right: []
Recursing into [8, 3, 7]:
Pivot: 7
Partitioning results in:
Left: [3] | Right: [8]
This sorts to [3, 7, 8]. Combined with the pivot 10, the sorted right subarray is:
[3, 7, 8, 10]
Final Combination
When we combine all sorted subarrays, we get:
[0, 1] [2] [3, 7, 8, 10]
Sorted Result:
[0, 1, 2, 3, 7, 8, 10]
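If you want to reproduce this trace yourself, here is the same implementation with a console.log added after each partition step (the depth parameter exists only to indent the output):

// Quick sort with logging so you can follow each partition step.
const quicksortTrace = (array, depth = 0) => {
  if (array.length <= 1) {
    return array;
  }
  const pivot = array[array.length - 1];
  const left = [];
  const right = [];
  for (let i = 0; i < array.length - 1; i++) {
    if (array[i] <= pivot) {
      left.push(array[i]);
    } else {
      right.push(array[i]);
    }
  }
  // Log the partition at this level, indented by recursion depth.
  console.log(`${" ".repeat(depth * 2)}pivot=${pivot} left=[${left}] right=[${right}]`);
  return [...quicksortTrace(left, depth + 1), pivot, ...quicksortTrace(right, depth + 1)];
};

quicksortTrace([8, 3, 1, 7, 0, 10, 2]);
// pivot=2 left=[1,0] right=[8,3,7,10]
//   pivot=0 left=[] right=[1]
//   pivot=10 left=[8,3,7] right=[]
//     pivot=7 left=[3] right=[8]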
Time and Space Complexity of Quick Sort
Time Complexity
When it comes to the time and space complexity of the quick sort algorithm, your choice of pivot plays a major role (the instrumented sketch after this section shows both cases in action).
Best and Average Cases
O(n log n): The best and average cases occur when the pivot divides the array into two balanced partitions of equal or nearly equal size. Balanced partitions mean the recursion goes about log n levels deep, and at each level the partitioning work across all subarrays takes O(n) time. Hence, the overall time complexity becomes O(n log n).
Worst Case
O(n²): The worst case happens when the pivot choice is poor (for example, always selecting the largest or smallest element), resulting in highly unbalanced partitions. Instead of log n levels, the algorithm now needs n levels of recursion, and the partitioning at each level still takes O(n) time. Hence, the overall time complexity becomes O(n²).
Space Complexity
Best and Average Cases: O(log n) for the recursion stack when the partitions are balanced.
Worst Case: O(n) for the recursion stack when the partitions are unbalanced.
(These figures describe the recursion stack of the classic in-place variant; the simple implementation above also allocates new left and right arrays, so it uses more memory than the in-place variant.)
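To see these differences in practice, here is an instrumented sketch of the same implementation that counts comparisons and tracks the maximum recursion depth (the stats object and its names are just for illustration). With the last-element pivot used above, an already sorted array triggers the unbalanced worst case:

// Instrumented quick sort: counts comparisons and tracks recursion depth.
const stats = { comparisons: 0, maxDepth: 0 };

const quicksortStats = (array, depth = 0) => {
  stats.maxDepth = Math.max(stats.maxDepth, depth);
  if (array.length <= 1) {
    return array;
  }
  const pivot = array[array.length - 1];
  const left = [];
  const right = [];
  for (let i = 0; i < array.length - 1; i++) {
    stats.comparisons++; // One comparison per element inspected.
    if (array[i] <= pivot) {
      left.push(array[i]);
    } else {
      right.push(array[i]);
    }
  }
  return [...quicksortStats(left, depth + 1), pivot, ...quicksortStats(right, depth + 1)];
};

const random = Array.from({ length: 1000 }, () => Math.floor(Math.random() * 1000));
const alreadySorted = [...random].sort((a, b) => a - b);

quicksortStats(random);
console.log("Random input:", stats);   // roughly n log n comparisons, shallow recursion

stats.comparisons = 0;
stats.maxDepth = 0;
quicksortStats(alreadySorted);
console.log("Sorted input:", stats);   // close to n²/2 comparisons, recursion depth close to n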
Advantages of Quick Sort
In-Place Sorting: The classic partition-based variant of quick sort swaps elements within the original array, so unlike merge sort it needs no extra space for merging and requires only minimal additional memory (the simple implementation above trades this away for readability by building new arrays; an in-place sketch follows this list).
Practical Performance: Because it sorts within the original array, the in-place variant minimizes memory usage and benefits from good cache performance, since it often processes elements stored close together in memory.
Fast and Efficient: Quick Sort performs well on large datasets due to its average-case time complexity of O(n log n).
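As noted above, the earlier implementation builds new left and right arrays for readability, so it is not strictly in place. For reference, here is a sketch of the classic in-place variant using the Lomuto partition scheme, which swaps elements within the original array instead of allocating new ones:

// In-place quick sort using the Lomuto partition scheme.
const quicksortInPlace = (array, low = 0, high = array.length - 1) => {
  if (low >= high) {
    return array; // Base case: zero or one element in this range.
  }
  const pivot = array[high]; // Last element of the current range.
  let i = low;               // Next position for an element <= pivot.
  for (let j = low; j < high; j++) {
    if (array[j] <= pivot) {
      [array[i], array[j]] = [array[j], array[i]]; // Swap into the "smaller" region.
      i++;
    }
  }
  // Put the pivot into its final sorted position.
  [array[i], array[high]] = [array[high], array[i]];
  quicksortInPlace(array, low, i - 1);
  quicksortInPlace(array, i + 1, high);
  return array;
};

console.log(quicksortInPlace([8, 3, 1, 7, 0, 10, 2])); // [0, 1, 2, 3, 7, 8, 10]

Because it only swaps elements and never copies the array, this is the variant people usually mean when they call quick sort an in-place algorithm.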
Disadvantages of Quick Sort
Poor Pivot Choice: As mentioned earlier, the pivot choice affects the overall efficiency of the quick sort algorithm. A poor pivot choice (selecting the smallest or largest element as the pivot) results in O(n²) time complexity.
Risk of Stack Overflow: Because quick sort is recursive, it may hit a stack overflow error on large datasets if the recursion depth is not kept in check.
Optimizing Quick Sort Algorithm
The quick sort algorithm can be optimized by:
Choosing a Good Pivot: This can be achieved by:
Selecting a random pivot: This reduces the likelihood of having the worst-case performance, because it avoids predictable patterns.
Selecting the median of three elements: This is done by selecting the median of the first, middle, and last elements of the array. It helps avoid the worst-case scenario when the array is already sorted or nearly sorted (see the sketch after this list).
Tail Recursion Optimization: The idea here is to always recurse into the smaller partition and iterate over the larger one. This keeps the recursion depth small, reducing the risk of stack overflow and improving the efficiency of the algorithm.
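As a concrete example of the pivot optimization, here is a small sketch of median-of-three selection (the medianOfThree helper is purely illustrative); it returns the index of the chosen pivot, which could then be swapped into the pivot position of either implementation above:

// Median-of-three: pick the median of the first, middle, and last elements.
const medianOfThree = (array, low, high) => {
  const mid = Math.floor((low + high) / 2);
  const candidates = [
    { index: low, value: array[low] },
    { index: mid, value: array[mid] },
    { index: high, value: array[high] },
  ];
  // Sort the three candidates by value and take the middle one.
  candidates.sort((a, b) => a.value - b.value);
  return candidates[1].index;
};

const sample = [8, 3, 1, 7, 0, 10, 2];
console.log(medianOfThree(sample, 0, sample.length - 1)); // 3 (value 7 is the median of 8, 7, 2)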
Real-world Applications of Quick Sort
Database Query Optimization: Quick Sort is often used in databases to efficiently sort large datasets, particularly for queries requiring ordered results.
Game Development: Quick Sort is useful in games for sorting leaderboards by high scores or other performance metrics in real-time due to its speed and efficiency.
Networking: Quick Sort can be used in routing algorithms or network data analysis to efficiently sort packets or prioritize traffic based on criteria like response time or packet size.
Libraries and Frameworks: Quick sort is widely used in many libraries and frameworks due to its efficiency and speed in practice.
Conclusion
Quick sort is an efficient sorting algorithm that uses the Divide and Conquer paradigm to solve sorting problems effectively. Its ability to handle large datasets efficiently and sort data without needing extra memory makes it a popular choice in practical applications. Despite its potential worst-case time complexity, optimizations like selecting a better pivot can enhance its performance.
To understand how Quick Sort works, try implementing it yourself and experiment with how it performs with different pivot choices. Understanding how quick sort breaks down problems and sorts recursively will help you better appreciate this algorithm.