What is Rate of Growth?


The rate of growth in time complexity refers to how the running time of an algorithm increases relative to the size of the input (usually denoted as n). It gives a way to compare the efficiency of different algorithms, especially for large inputs.


Some of the common types are constant, linear, quadratic, cubic, bi-quadratic (quartic), exponential, logarithmic, and many more.

What is the role of these equations in the rate of growth?

When we talk about any software, we focus on its input and output. So we need to find the program behind the software, express the work it does as an equation, and then find its time complexity from that equation.


Constant Time Complexity

Independent of the input size, it always takes the same fixed amount of time to execute, and it is denoted by 1 or O(1).

Example :

Assuming that 1 unit of work takes 1 second to execute.

#include <iostream>
using namespace std;

int main(){
    int x = 2, y = 3, z; // takes 3 seconds in total for x, y and z
    z = x + y;           // takes 2 seconds for + and =
    cout << z;           // takes 1 second for cout
    return 0;            // takes 1 second
}
// This whole program takes a total of 7 seconds.

Now, no matter what the inputs are, the work stays the same ==> adding x and y, assigning the result to z, and everything else. Seeing this, I can say the time is fixed. So this is constant time complexity, and it is denoted as O(1).

And in this case the best, average and worst cases are all equal, because nothing changes with the input. But by default we always denote the worst case (the sketch below shows a case where the three differ).
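For contrast, here is a minimal sketch (a hypothetical linear search, not one of the examples from this post) where the three cases differ, which is why the worst case is the one we quote:

#include <iostream>
using namespace std;

// Hypothetical linear search over an array of size n.
// Best case: the key sits at index 0       -> 1 step  -> O(1)
// Worst case: the key is not in the array  -> n steps -> O(n), quoted by default
int search(int arr[], int n, int key){
    for (int i = 0; i < n; i++){
        if (arr[i] == key) return i; // found, stop early
    }
    return -1; // not found after n comparisons
}

int main(){
    int arr[5] = {4, 8, 15, 16, 23};
    cout << search(arr, 5, 4) << "\n";  // best case: found immediately
    cout << search(arr, 5, 99) << "\n"; // worst case: scans all 5 elements
    return 0;
}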


Linear Time Complexity

By the name you can understand that it is linear in nature. Let's jump into an example to understand it better.

Example:

Assuming that 1 unit of work takes 1 second to execute.

#include <iostream>
using namespace std;

int main(){
    int a, b = 1, c = 2, d, n; // takes 5 seconds for a, b, c, d and n
    a = b + c;                 // takes 2 seconds for + and =
    cin >> n;                  // takes 1 second for taking the input

    for(int i = 0; i < n; i++){ /* i = 0 runs 1 time, i < n runs n times,
                                   and the same for i++ */
        a++;                    // runs n times
    }
    return 0;                   // takes 1 second
}

/* Total time equals 3n + 10, which is O(n) ==> as n increases,
the constant terms get ignored */

// A single loop creates linear time complexity.

From the above example we can say this is linear time complexity. As a rule of thumb, wherever you see a single loop over the input, it creates linear time complexity, which is denoted as O(n).


Quadratic Time Complexity

This is where n² comes into the picture. Let's jump into the example and see.

Example:

Assuming that 1 unit of work takes 1 second to execute.

#include <iostream>
using namespace std;

int main(){
    int a, b = 1, c = 2;             // takes 3 seconds
    int n;
    cin >> n;                        // takes 1 second for taking the input
    for (int i = 0; i < n; i++){     // takes n, as we discussed in linear time complexity
        for (int j = 0; j < n; j++){ // takes n * n
            a = b + c;               // takes n * n
        }
    }
    cout << a;                       // takes 1 second
    return 0;                        // takes 1 second
}

// total time = O(1) + O(1) + O(n) + O(n^2) + O(n^2) + O(1) + O(1) ==> O(n^2)

/* now, where does n * n come from?
Start with the inner loop: for i = 0 it runs n times. Then control returns to
the outer loop, i becomes 1, the inner loop runs n times again, and this
process goes on until the outer loop has run n times. Hence, the inner loop
runs n times for every single value of the outer loop, which gives n * n.
*/

Basically, it is in nested-loop situations that we see quadratic time complexity, and it is denoted by O(n²).


Logarithmic Time Complexity

Denoted as O(log n)

Example:

Assuming that 1 unit of work takes 1 second to execute.

#include <iostream>
using namespace std;

int main(){
    int a, b = 1, c = 2, n;            // takes 3 seconds for a, b and c
    cin >> n;                          // takes 1 second for taking the input
    a = b + c;                         // takes 2 seconds
    for (int i = 1; i < n; i = i * 2){ /* we have to go till n, but i is
                                          multiplied by 2 each turn, so after
                                          how many turns does it exceed n?
                                          Take k as the number of turns, so
                                          2^k = n; taking log on both sides
                                          gives k = log n (base 2). */
        b++;                           // runs log n (base 2) times
    }
    return 0;                          // takes 1 second
}

// Total = 7 + log n + log n ==> O(log n)

For your knowledge, exponentials and logarithms are the inverse of each other. Also, if instead of multiplication there is division (for example i = i / 2), the time complexity is still O(log n), as the sketch below shows.
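Here is a minimal sketch of that claim (the value n = 64 is picked just for illustration, assuming as before that one unit of work takes one second):

#include <iostream>
using namespace std;

int main(){
    int n = 64, count = 0;
    for (int i = n; i > 1; i = i / 2){ // i is divided by 2 on every turn
        count++;                       // runs about log n (base 2) times
    }
    cout << count; // prints 6, since 2^6 = 64
    return 0;      // so this loop is also O(log n)
}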

NOTE:

1 < log n < n < n*log n < n² < n³ < a^n < n!

This is the order of growth of these time complexities, from the slowest-growing to the fastest-growing.
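To see that ordering concretely, here is a small sketch (the values of n are picked just for illustration) that prints how fast each term grows:

#include <iostream>
#include <cmath>
using namespace std;

int main(){
    // Compare the growth of log n, n, n*log n, n^2, n^3 and 2^n for a few values of n
    for (int n = 2; n <= 16; n *= 2){
        cout << "n=" << n
             << "  log n=" << log2(n)
             << "  n*log n=" << n * log2(n)
             << "  n^2=" << n * n
             << "  n^3=" << n * n * n
             << "  2^n=" << pow(2.0, n) << "\n";
    }
    return 0;
}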


Conclusion:

The most common types of time complexity are discussed here with examples, and I loved making this blog. Learn and grow together.

