DSA-Introduction: Part 6

What is Linear Time Complexity – O(n)?

Linear time complexity, denoted as O(n), means that the execution time of an algorithm increases in direct proportion to the size of the input.

If the input doubles, the time taken by the algorithm will also roughly double.


Why is it Called "Linear"?

In linear time, every element in the input must be processed individually, one after the other — like reading a list from start to finish.

  • One item = one operation

  • Ten items = ten operations

  • A thousand items = a thousand operations

This is called linear growth, and it's one of the most common time complexities you'll encounter when dealing with arrays, lists, or loops.


When Does O(n) Happen?

You typically get linear time complexity in situations like:

  • Scanning through all elements

  • Searching linearly

  • Performing an action once per input item

In all of these cases, the amount of work grows directly with the size of the input.
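
As a quick illustration of the first pattern, here is a minimal sketch (the function name sumAll and the sample data are illustrative choices, not from the original post) that touches every element exactly once, so its work grows in step with the input size:

#include <iostream>
#include <vector>
using namespace std;

// Adds up every element of the vector.
// The loop body runs once per element, so for n elements
// the total work is proportional to n, i.e. O(n).
int sumAll(const vector<int>& values) {
    int total = 0;
    for (int v : values) {   // visits each element exactly once
        total += v;
    }
    return total;
}

int main() {
    vector<int> data = {4, 8, 15, 16, 23, 42};
    cout << sumAll(data) << endl;   // prints 108
    return 0;
}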


Big-O Focus

  • Best Case (Ω): Can be less than n when the algorithm is able to stop early (for example, a search that finds its target at the very first position)

  • Average Case (Θ): Typically proportional to n

  • Worst Case (O): Proportional to the input size n, because every element may have to be processed
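
To make the best-case versus worst-case distinction concrete, here is a small linear-search sketch (the function name findIndex and the sample data are illustrative): it can stop after one comparison if the target is at the front, but it must look at all n elements when the target is last or missing.

#include <iostream>
#include <vector>
using namespace std;

// Returns the index of target, or -1 if it is not present.
// Best case:  target is at position 0  -> 1 comparison (constant work).
// Worst case: target is last or absent -> n comparisons (linear work).
int findIndex(const vector<int>& values, int target) {
    for (int i = 0; i < (int)values.size(); i++) {
        if (values[i] == target) {
            return i;      // early exit in the lucky case
        }
    }
    return -1;             // scanned all n elements
}

int main() {
    vector<int> data = {7, 3, 9, 1};
    cout << findIndex(data, 7) << endl;   // 0  -> best case, one comparison
    cout << findIndex(data, 5) << endl;   // -1 -> worst case, four comparisons
    return 0;
}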


Key Takeaways:

  • Linear = One step per item

  • Grows proportionally with input size

  • Common in loops and array processing

  • Easy to understand and predictable


Code Summary:

#include <iostream>
using namespace std;

int main()
{
    int a, b = 1, c = 2, d, n; // Variable declarations
    a = b + c;                 // Simple addition (constant time)
    cin >> n;                  // User inputs a number
    for (int i = 0; i < n; i++) {
        a++;                   // Loop runs n times
    }
    return 0;                  // End of program
}

Step-by-Step Explanation:

  1. Variable Declarations

     int a, b = 1, c = 2, d, n;
    

    ➤ This is a constant-time operation. Takes the same time, no matter what.

  2. Simple Addition

     a = b + c;
    

    ➤ Also a constant-time step (O(1)).

  3. Input from User

     cin >> n;
    

    ➤ Still a constant-time action. Just reads one number.

  4. Loop Starts Here

     for (int i = 0; i < n; i++) {
         a++;
     }
    

    ➤ This is the most important part:

    • The loop runs n times, depending on the value of n entered by the user.

    • Each iteration performs a simple operation: a++ (which is O(1))

    • So doing a++ n times means the total time becomes O(n)

  5. Return Statement

     return 0;
    

    ➤ Just ends the program. Constant time.


Final Time Complexity Analysis:

| Line of Code | Description | Time Taken (assuming 1 sec per operation) |
| --- | --- | --- |
| int a, b = 1, c = 2, d, n; | 5 variable declarations | 5 sec |
| a = b + c; | 1 addition + 1 assignment | 2 sec |
| cin >> n; | User input | 1 sec |
| for (int i = 0; i < n; i++) | Loop: init + n comparisons + n increments | 2n + 1 sec |
| { a++; } (runs inside the loop) | Runs n times | n sec |
| return 0; | Exit the program | 1 sec |
| Total Time | | 3n + 10 sec |

Conclusion:

Total time = 3n + 10 seconds
As n grows large, the constant term and the coefficient are dropped ⇒ O(n)

This program has linear time complexity, because the for loop runs exactly n times, and each iteration does a constant-time operation (a++).

So overall, time increases linearly with n, which is the input size.
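
If you want to see this proportionality on a real machine, here is a rough timing sketch of the same loop (the loop size, the use of volatile to keep the compiler from deleting the loop, and the microsecond units are all illustrative choices); the time reported for 2n should come out close to double the time for n:

#include <chrono>
#include <iostream>
using namespace std;

// Runs the same O(n) counting loop and reports how long it took.
long long timeLoop(long long n) {
    volatile long long a = 0;                 // volatile so the loop isn't optimized away
    auto start = chrono::steady_clock::now();
    for (long long i = 0; i < n; i++) {
        a = a + 1;                            // constant-time work, repeated n times
    }
    auto end = chrono::steady_clock::now();
    return chrono::duration_cast<chrono::microseconds>(end - start).count();
}

int main() {
    long long n = 10000000;                   // 10 million iterations
    cout << "n  iterations took " << timeLoop(n)     << " us" << endl;
    cout << "2n iterations took " << timeLoop(2 * n) << " us" << endl;   // roughly double
    return 0;
}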

Example 2:

C++ Code Example

#include <iostream>
using namespace std;

int main() {
    int n;
    cin >> n;                  // Read the input size so the loop bounds are defined

    // First loop runs n/3 times
    for (int i = 0; i < n / 3; i++) {
        cout << "chai code";
    }

    // Second loop runs n/2 times
    for (int j = 0; j < n; j = j + 2) {
        cout << "Hello";
    }

    return 0;
}

Time Complexity Table

| Line of Code | Description | Operation Count Estimate | Time Taken (assuming 1 sec per op) |
| --- | --- | --- | --- |
| int n; | Declaration of variable | 1 operation | 1 sec |
| cin >> n; | User input | 1 operation | 1 sec |
| for (int i = 0; i < n / 3; i++) | First loop: init (1), compare + increment per iteration | ~n/3 iterations | n/3 comparisons + n/3 increments = 2n/3 sec |
| cout << "chai code"; | Inside first loop | Executed n/3 times | n/3 sec |
| for (int j = 0; j < n; j = j + 2) | Second loop: init (1), compare + increment per iteration | ~n/2 iterations | n/2 comparisons + n/2 increments = n sec |
| cout << "Hello"; | Inside second loop | Executed n/2 times | n/2 sec |
| return 0; | Exit statement | 1 operation | 1 sec |

Total Execution Time:

Summing up all operations:

= Variable declaration     → 1
+ Input of n (cin >> n)    → 1
+ First loop control       → 2n/3
+ First loop body          → n/3
+ Second loop control      → n
+ Second loop body         → n/2
+ Return statement         → 1
----------------------------------------
Total = (2n/3 + n/3 + n + n/2) + 3
      = n + n + n/2 + 3
      = 2.5n + 3

Final Time Complexity

In Big-O, we ignore constants and coefficients:

Time Complexity = O(n)
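
One way to check this without a stopwatch is to count the loop-body executions directly. In the sketch below (the counter ops is an illustrative addition, not part of the original example), the count works out to roughly n/3 + n/2, and doubling n doubles the count, which is exactly the linear behaviour Big-O is describing:

#include <iostream>
using namespace std;

// Counts how many times the two loop bodies from the example run.
long long countOps(int n) {
    long long ops = 0;
    for (int i = 0; i < n / 3; i++) ops++;     // first loop body: ~n/3 times
    for (int j = 0; j < n; j = j + 2) ops++;   // second loop body: ~n/2 times
    return ops;                                // ~n/3 + n/2, still linear in n
}

int main() {
    cout << countOps(600)  << endl;   // 500  (200 + 300)
    cout << countOps(1200) << endl;   // 1000 (400 + 600): doubles when n doubles
    return 0;
}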


Here’s a zoomed-out linear graph showing how time grows up to n = 100.

What It Shows:

  • The line continues to rise at a steady, straight rate.

  • This means: the bigger the input, the more time it takes — but the growth is predictable and not sudden.

  • It perfectly represents O(n) behavior.
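
If you'd like to reproduce the numbers behind such a graph, this small sketch simply prints n against the 3n + 10 cost formula derived in Example 1 (the range and step size are arbitrary choices); plotting the pairs gives the same straight line:

#include <iostream>
using namespace std;

int main() {
    // Print (n, estimated time) pairs for Example 1's cost formula 3n + 10.
    for (int n = 10; n <= 100; n += 10) {
        cout << n << "\t" << 3 * n + 10 << endl;   // rises by a fixed 30 each step: a straight line
    }
    return 0;
}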

Missed the earlier parts?
Make sure to check out the previous posts in this series for a complete understanding of time complexity and algorithm analysis:
