Understanding Time and Space Complexity (Part 1)

Imagine you are interviewing at one of the top tech companies. You have been asked a DSA problem, and you have somehow reached a perfect solution. Now the interviewer asks, 'Could you tell me the time and space complexity of this code?'

From my experience conducting interviews, I have noticed a moment of confusion among DSA beginners at this question: they usually struggle to explain the time and space complexity of their own solution.

That has been my experience; do share yours in the comments. For now, let's make this topic easy for DSA beginners.

If you’re starting your DSA journey, these aren’t just buzzwords — they are your best friends when it comes to writing efficient code.

In this series, I’ll walk you through how to calculate time and space complexity for any piece of code, from the simplest loops to more advanced recursive algorithms.

Whether you’re prepping for coding interviews or want to write cleaner, faster programs, this is the place to begin.

What Is Time and Space Complexity?

Before jumping into calculations on your code, let's understand what these terms actually mean.

⏱ Time Complexity

Time complexity tells you how the runtime of your code increases as the input size grows. It’s not about how fast your computer runs the code — it’s about how the number of operations changes with input size.

A small motivation before we move ahead: for some problems, traversing an array from right to left can give you better time complexity than traversing it from left to right. It's a reminder that a slight change in perspective can lead to significant improvements in time complexity.
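For instance (an illustrative example of my own, not a universal rule): suppose you need, for every index, the largest element strictly to its right. A left-to-right approach would re-scan the remaining suffix for each index, which is O(n²); a single right-to-left pass with a running maximum does the same job in O(n).

int[] maxToRight(int[] arr) {
    int n = arr.length;
    int[] result = new int[n];
    int maxSoFar = Integer.MIN_VALUE;      // nothing lies to the right of the last index yet
    for (int i = n - 1; i >= 0; i--) {     // right-to-left: one pass, O(n)
        result[i] = maxSoFar;              // largest element strictly to the right of i
        maxSoFar = Math.max(maxSoFar, arr[i]);
    }
    return result;
}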

Think of it like this:

“If I double the input size, will the number of operations double, quadruple, or barely change?”

🧠 Space Complexity

Space complexity, on the other hand, is about how much extra memory your algorithm needs to run, apart from the input. It's pretty simple to calculate. It includes things like temporary variables, data structures (arrays, maps, stacks, etc.), and function call stacks (especially in recursion).
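A quick sketch of my own to make this concrete (the helper names are hypothetical): reverseCopy allocates a new array as large as the input, so it needs O(n) extra space, while reverseInPlace only uses a couple of temporary variables, so it needs O(1) extra space.

int[] reverseCopy(int[] arr) {
    int[] copy = new int[arr.length];        // extra array grows with the input → O(n) space
    for (int i = 0; i < arr.length; i++) {
        copy[i] = arr[arr.length - 1 - i];
    }
    return copy;
}

void reverseInPlace(int[] arr) {
    int left = 0, right = arr.length - 1;
    while (left < right) {
        int tmp = arr[left];                 // a fixed number of temporaries → O(1) space
        arr[left++] = arr[right];
        arr[right--] = tmp;
    }
}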

Why It Even Matters

Naturally, the question arises: why does this matter? What happens if I don't pay attention to it?

As a beginner, you might wonder after getting rejected by your dream company, “Where did I go wrong? I gave the correct solution.” But here’s the catch — you missed what they were genuinely looking for. They weren’t just testing correctness; they wanted to see if you could optimize. Can your solution handle large inputs efficiently? That’s what sets great problem-solvers apart.

If your solution runs slowly or crashes due to memory limits, it’s likely because the time or space complexity is too high.

Knowing how to analyze and optimize complexity can help you pass all test cases in coding contests, crack FAANG interviews, and write scalable software.

Let’s Break It Down With Examples

I suppose I have answered the What and Why of this topic.

Now, let’s examine a few simple code snippets in Java and analyze them step by step.

📌 Example 1: Constant Time — O(1)

int getFirstElement(int[] arr) {
    return arr[0];
}
  • Time Complexity: O(1) — Just one operation regardless of input size.

  • Space Complexity: O(1) — No extra memory used.

No matter how big the array is, it just returns the first element.

📌 Example 2: Linear Time — O(n)

int sum(int[] arr) {
    int total = 0;
    for (int num : arr) {
        total += num;
    }
    return total;
}
  • Time Complexity: O(n) — You visit each element once.

  • Space Complexity: O(1) — You only store a single total variable.

As the input size increases, the number of steps increases proportionally.

📌 Example 3: Quadratic Time — O(n²)

void printPairs(int[] arr) {
    for (int i = 0; i < arr.length; i++) {
        for (int j = 0; j < arr.length; j++) {
            System.out.println(arr[i] + ", " + arr[j]);
        }
    }
}
  • Time Complexity: O(n²) — Two nested loops, each running n times.

  • Space Complexity: O(1)

For an array of 10 elements, this prints 100 pairs. For 100 elements, it prints 10,000.

📌 Example 4: Logarithmic Time — O(log n)

int binarySearch(int[] arr, int key) {
    int left = 0, right = arr.length - 1;
    while (left <= right) {
        int mid = left + (right - left) / 2;
        if (arr[mid] == key) return mid;
        if (arr[mid] < key) left = mid + 1;
        else right = mid - 1;
    }
    return -1;
}
  • Time Complexity: O(log n) — You cut the search space in half each time.

  • Space Complexity: O(1)

Very efficient for searching in sorted arrays.
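A quick usage sketch (my own example, assuming the input array is already sorted in ascending order):

int[] sorted = {2, 5, 8, 12, 16, 23, 38};
int idx = binarySearch(sorted, 23);   // returns 5 after only a few halvings (about log2 of 7 steps)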

Nested vs Sequential Loops: Know the Difference

🌀 Nested Loops (O(n²))

for (int i = 0; i < n; i++) {
    for (int j = 0; j < n; j++) {
        System.out.println(i + " " + j);
    }
}

🔄 Sequential Loops (O(n))

for (int i = 0; i < n; i++) {
    System.out.println(i);
}
for (int j = 0; j < n; j++) {
    System.out.println(j);
}

Nested loops multiply time, while sequential loops add time.
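A small sketch of my own to illustrate: when both patterns appear in the same method, their costs add up and the dominant term wins, so O(n + n²) simplifies to O(n²).

void printSinglesAndPairs(int[] arr) {
    int n = arr.length;
    for (int i = 0; i < n; i++) {            // sequential loop: O(n)
        System.out.println(arr[i]);
    }
    for (int i = 0; i < n; i++) {            // nested loops: O(n^2)
        for (int j = 0; j < n; j++) {
            System.out.println(arr[i] + " " + arr[j]);
        }
    }
}
// Total work: O(n + n^2) = O(n^2), since the quadratic part dominates for large n.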

A Quick Checklist to Analyze Complexity

Whenever you come across a piece of code, ask yourself these questions:

  • Are there loops?
    - One loop → O(n)
    - Two nested loops → O(n²)

  • Is recursion used?
    - Analyze the number and depth of recursive calls (we'll cover this in Part 2).

  • Are data structures used?
    - Look for the space used by arrays, hash maps, etc.

  • Ignore constants
    - O(2n) is still O(n).

  • Check for hidden recursion or nested calls (see the sketch below).
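Here's the sketch mentioned above (my own example): the single visible loop looks like O(n), but List.contains() itself scans the list, so the duplicate check is actually O(n²). Swapping the list for a HashSet would bring it back to roughly O(n).

import java.util.ArrayList;
import java.util.List;

boolean hasDuplicate(int[] arr) {
    List<Integer> seen = new ArrayList<>();
    for (int value : arr) {
        if (seen.contains(value)) {   // hidden O(n) scan inside an O(n) loop → O(n^2) overall
            return true;
        }
        seen.add(value);
    }
    return false;
}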

What’s Coming in Part 2?

In Part 2, we’ll dive into:

  • How to analyze recursive code (like factorial and Fibonacci)

  • Visualizing recursion trees

  • Space complexity in recursion

  • Divide and conquer examples like Merge Sort

In further parts, I will cover more examples so that you get a clear understanding of time and space complexity and no longer stumble at that point in the interview.

Final Thoughts

Time and space complexity aren’t just abstract concepts. They’re a practical way to judge how scalable and efficient your code is.

Start small. Practice spotting time and space costs in your daily coding. And remember:

Writing efficient code is a habit. Mastering time and space complexity is the first step toward it.

Subscribe to my newsletter to stay updated with the next part in this DSA series!
