Time and Space Complexity

Overview

When analyzing an algorithm, we care about how efficient it is. Two main measures are:

| Type | What It Measures | Goal |
| --- | --- | --- |
| Time Complexity | How long the algorithm takes (number of operations as input grows). | Make it as fast as possible. |
| Space Complexity | How much memory it uses (extra variables, data structures, recursion stack, etc.). | Make it as memory-efficient as possible. |

Big-O Notation

Big-O notation describes how fast the runtime grows as the input size grows. It doesn't care about exact counts, only the growth trend.

Suppose algorithm 1 takes 2n + 2 steps and algorithm 2 takes 5n + 100 steps. As n becomes very large, both grow roughly in proportion to n, so both are O(n) (linear).

In Big-O, multiplicative and additive constants are dropped because they have no impact on how fast the runtime scales with n; Big-O describes only the rate of growth.
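
To make this concrete, here is a tiny sketch (the step formulas are the ones from the example above) showing that the steps-per-element settle to a constant for both algorithms, so both scale linearly:

```python
for n in (10, 1_000, 1_000_000):
    steps1 = 2 * n + 2    # algorithm 1
    steps2 = 5 * n + 100  # algorithm 2
    # both ratios approach a constant (2 and 5), so both algorithms are O(n)
    print(n, steps1 / n, steps2 / n)
```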

Time Complexity

We measure how the runtime grows as the input size n increases. This ignores constant factors and machine speed and focuses only on the growth rate.

Common complexities:

| Complexity | Example | Description |
| --- | --- | --- |
| O(1) | Accessing an array element | Constant time; doesn't depend on n. |
| O(log n) | Binary search | Each step cuts the problem size in half. |
| O(n) | Linear search, traversing a list | Grows directly with input size. |
| O(n log n) | Merge sort, quicksort | Common for efficient sorting. |
| O(n²) | Nested loops | Grows quadratically (e.g., comparing all pairs). |
| O(2ⁿ) | Recursive subset generation | Exponential; grows very fast. |
| O(n!) | Permutation generation | Factorial growth; impractical for large n. |
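
As a rough illustration (a minimal sketch; these helper functions are made up for this table, not taken from any library), a few of these classes look like this in code:

```python
def constant(arr):    # O(1): one index access, independent of len(arr)
    return arr[0]

def logarithmic(n):   # O(log n): the problem size halves each step
    count = 0
    while n > 1:
        n //= 2
        count += 1
    return count

def linear(arr):      # O(n): touches every element once
    return sum(1 for _ in arr)

def quadratic(arr):   # O(n²): touches every pair of elements
    return sum(1 for _ in arr for _ in arr)
```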

Space Complexity

Space complexity counts extra memory used by the algorithm, not including the input itself.

Example

```python
def sum_array(arr):
    total = 0  # one variable
    for num in arr:
        total += num
    return total
```
- Time complexity: O(n); the loop runs once over `arr`.
- Space complexity: O(1); only `total` is used, constant extra space.

Example Comparison:

| Algorithm | Time Complexity | Space Complexity |
| --- | --- | --- |
| Binary Search | O(log n) | O(1) |
| Merge Sort | O(n log n) | O(n) |
| Quick Sort | O(n log n) average, O(n²) worst | O(log n) |
| BFS / DFS on Graph | O(V + E) | O(V) |
| Dynamic Programming | O(n) to O(n²) | O(n) or more |

Examples

1. Sum of Array

```python
def sum_array(arr):
    total = 0
    for num in arr:
        total += num
    return total
```

Time complexity: we count how many basic operations happen as the input size grows.

```python
def sum_array(arr):   # Operation count
    total = 0         # 1
    for num in arr:   # n
        total += num  # n
    return total      # 1
```

Total = 2n + 2; dropping the constants gives $O(n)$, linear time complexity.

Space complexity: We count extra variables created by the algorithm.

- `total` is one variable and `num` is one variable; both are constant size, independent of n.

Space complexity = $O(1)$, constant space.

2. Print All Pairs in a List

```python
def print_all_pairs(arr):          # Operation count
    for i in range(len(arr)):      # n
        for j in range(len(arr)):  # n
            print(arr[i], arr[j])
```

Time complexity: the loops perform $n \times n = n^2$ operations in total, so the time complexity is $O(n^2)$ (quadratic time).

Space complexity: it only uses the loop variables `i` and `j`, with no new data structures, so the space complexity is $O(1)$, constant.
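
A common variant starts the inner loop at `i + 1` so each unordered pair is printed once. That is n(n−1)/2 iterations, which is still $O(n^2)$ once constants are dropped (a sketch, not part of the original example):

```python
def print_unique_pairs(arr):
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):  # n(n-1)/2 iterations in total
            print(arr[i], arr[j])         # still grows quadratically
```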

3. Build a Prefix Sum Array

```python
def prefix_sums(arr):               # Operation count    Extra space used
    prefix = []                     # 1                   n
    current_sum = 0                 # 1                   1
    for num in arr:                 # n
        current_sum += num          # n
        prefix.append(current_sum)  # n
    return prefix                   # 1
```

Time complexity is $O(n)$.

Space complexity is $O(n)$.

When an algorithm creates new data proportional to the input size, both time and space often scale linearly, O(n).
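
If the prefix sums only need to be consumed once, a generator can trade the O(n) output list for O(1) extra space (a sketch under that assumption; `prefix_sums_iter` is our name, not from the original). Python's built-in `itertools.accumulate` provides the same behavior:

```python
def prefix_sums_iter(arr):
    # yields each running total instead of storing them all:
    # O(n) time, O(1) extra space
    current_sum = 0
    for num in arr:
        current_sum += num
        yield current_sum

print(list(prefix_sums_iter([1, 2, 3])))  # [1, 3, 6]
```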

4. Merge Sort

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr

    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])

    return merge(left, right)

def merge(left, right):
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1

    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```
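
A quick check of the two functions above:

```python
print(merge_sort([5, 2, 9, 1, 5, 6]))  # [1, 2, 5, 5, 6, 9]
```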

Time complexity:

- Splits the array in half ($\log_2 n$ levels of recursion)
- Merges all n elements once per level

$T(n) = O(n \log n)$
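
The same bound falls out of the recurrence: each call does a linear merge plus two half-size calls. With $c$ the per-element merge cost, unrolling gives

$$
T(n) = 2\,T(n/2) + cn = 2^k\,T(n/2^k) + k\,cn,
$$

and setting $k = \log_2 n$ yields $T(n) = n\,T(1) + cn\log_2 n = O(n \log n)$.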

Space complexity:

- Each call to merge creates a temporary list `merged` of size up to n. Although the recursion creates many sublists, they exist at different recursion levels rather than all at once, so the extra space is $O(n)$.

Divide-and-conquer algorithms like merge sort spend $\log n$ levels of recursion, each processing n elements, so the total is $O(n \log n)$.

5. Binary Search

```python
def binary_search(arr, target):  # Operation count
    low, high = 0, len(arr) - 1  # 1
    while low <= high:           # O(log n): halves the range each loop
        mid = (low + high) // 2  # 1
        if arr[mid] == target:   # 1
            return mid           # 1
        elif arr[mid] < target:  # 1
            low = mid + 1
        else:
            high = mid - 1
    return -1
```

Time complexity: $O(\log n)$. Space complexity: $O(1)$; this iterative version only uses a few variables (low, high, mid).

Halving the search range repeatedly gives $\log n$ growth. Binary search is one of the best examples of logarithmic time: extremely efficient on large sorted data.
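
For example, on a sorted list:

```python
data = [1, 3, 5, 7, 9, 11]
print(binary_search(data, 7))  # 3 (index of 7)
print(binary_search(data, 4))  # -1 (not found)
```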

6. Recursive Fibonacci

```python
def fibonacci(n):
    if n <= 1:
        return n
    return fibonacci(n - 1) + fibonacci(n - 2)
```

Time complexity: each call spawns two more recursive calls, giving exponential growth, $O(2^n)$.

Space complexity: $O(n)$, since the recursion depth reaches n.
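
To see the blow-up concretely, one can count the calls (a small instrumentation sketch, not part of the original code):

```python
calls = 0

def fib_counted(n):
    global calls
    calls += 1
    if n <= 1:
        return n
    return fib_counted(n - 1) + fib_counted(n - 2)

fib_counted(20)
print(calls)  # 21891 calls for n = 20; the count roughly doubles as n grows by 1
```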

7. Fibonacci with Dynamic Programming (Memoization)

```python
def fibonacci_memo(n, memo={}):
    if n in memo:
        return memo[n]
    if n <= 1:
        return n
    memo[n] = fibonacci_memo(n - 1, memo) + fibonacci_memo(n - 2, memo)
    return memo[n]
```

Time complexity: $O(n)$; each number is computed once, then cached.

Space complexity: $O(n)$; memo stores up to n results and the recursion depth reaches n.
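
A bottom-up version avoids recursion entirely and keeps only the last two values, dropping the space to $O(1)$ (a sketch; `fibonacci_iter` is our name for it):

```python
def fibonacci_iter(n):
    a, b = 0, 1          # F(0), F(1)
    for _ in range(n):
        a, b = b, a + b  # slide the two-value window forward
    return a             # O(n) time, O(1) space
```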

8. Matrix Path (Minimum Path Sum)

Given a grid of non-negative numbers, find the path from top-left to bottom-right with the minimum sum, moving only right or down.

```python
def min_path_sum(grid):
    m, n = len(grid), len(grid[0])
    dp = [[0] * n for _ in range(m)]
    dp[0][0] = grid[0][0]

    for i in range(1, m):
        dp[i][0] = dp[i-1][0] + grid[i][0]
    for j in range(1, n):
        dp[0][j] = dp[0][j-1] + grid[0][j]

    for i in range(1, m):
        for j in range(1, n):
            dp[i][j] = grid[i][j] + min(dp[i-1][j], dp[i][j-1])

    return dp[m-1][n-1]
```

| Measure | Complexity | Why |
| --- | --- | --- |
| Time | O(m × n) | Fills every cell once |
| Space | O(m × n) | DP table of the same size (can be optimized to O(n)) |
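
Because each cell only depends on the cell above and the cell to its left, the table can be collapsed to a single row; this is the O(n) optimization the table mentions (a sketch of that idea; `min_path_sum_1d` is our name):

```python
def min_path_sum_1d(grid):
    m, n = len(grid), len(grid[0])
    dp = [0] * n
    dp[0] = grid[0][0]
    for j in range(1, n):        # first row: can only come from the left
        dp[j] = dp[j - 1] + grid[0][j]
    for i in range(1, m):
        dp[0] += grid[i][0]      # first column: can only come from above
        for j in range(1, n):
            # dp[j] still holds the row above; dp[j-1] is the left neighbor
            dp[j] = grid[i][j] + min(dp[j], dp[j - 1])
    return dp[-1]
```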

9. Sorting Algorithms

| Category | Algorithm | Average Time | Space | Stable | Notes |
| --- | --- | --- | --- | --- | --- |
| Simple (O(n²)) | Bubble Sort | O(n²) | O(1) | Yes | Repeatedly swaps adjacent elements |
| Simple (O(n²)) | Selection Sort | O(n²) | O(1) | No | Finds the min each pass |
| Simple (O(n²)) | Insertion Sort | O(n²) | O(1) | Yes | Good for small or nearly sorted data |
| Efficient (O(n log n)) | Merge Sort | O(n log n) | O(n) | Yes | Divide & conquer, always stable |
| Efficient (O(n log n)) | Quick Sort | O(n log n) avg, O(n²) worst | O(log n) | No | Fast in practice; in-place |
| Efficient (O(n log n)) | Heap Sort | O(n log n) | O(1) | No | Uses a binary heap |
| Linear (O(n)) special cases | Counting Sort | O(n + k) | O(k) | Yes | Works only for integers in a range |
| Linear (O(n)) special cases | Radix Sort | O(d·(n + k)) | O(n + k) | Yes | For integers/strings, digit by digit |
| Linear (O(n)) special cases | Bucket Sort | O(n + k) | O(n + k) | Yes | For uniformly distributed numbers |
| Other / Hybrid | Shell Sort | ~O(n¹·³) | O(1) | No | Improved insertion sort |
| Other / Hybrid | Timsort | O(n log n) | O(n) | Yes | Python's built-in sort (hybrid of merge + insertion) |
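
In day-to-day Python you rarely implement these yourself; the built-in `sorted()` and `list.sort()` are Timsort:

```python
data = [5, 2, 9, 1]
print(sorted(data))  # [1, 2, 5, 9]; O(n log n) worst case, near O(n) on nearly sorted input
data.sort()          # in-place version of the same algorithm
```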