Best data structures course

Notes from a data structures course. Use as a supplement to the main topic files in data-structures/ and algorithms/.


S01E01: Algorithms, Time Complexity, Merge Sort

What is an algorithm, and how do we measure its efficiency?

  • Algorithm: A well-defined procedure that, given input data, produces the intended output.

  • Efficiency: Measured by time complexity — how the number of operations grows with input size, not wall-clock time (which depends on hardware).

  • Big O: Upper bound on growth. For example, O(g(n)) means the number of operations is at most C · g(n) for some constant C and all large n (e.g. g(n) = n gives O(n), linear growth).
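The O(n) bound can be made concrete by counting operations directly. A minimal sketch (the `linear_search` function and its `ops` counter are illustrative, not from the course): a linear search over n items performs at most n comparisons, so its operation count is bounded by C · n with C = 1.

```python
def linear_search(items, target):
    """Return (index, ops): index of target (or -1), plus the comparison count."""
    ops = 0
    for i, x in enumerate(items):
        ops += 1                      # one comparison per element examined
        if x == target:
            return i, ops
    return -1, ops

# Worst case (target absent): exactly n comparisons -> O(n).
idx, ops = linear_search(list(range(1000)), -1)
```

The count grows linearly with the input size regardless of hardware, which is exactly what the time-complexity view measures.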

Time complexity for recursive algorithms

A recurrence relation describes the work. Example: a function that just recurses n times does O(n) work.

def f(n):
    if n == 0:      # base case: constant work
        return
    f(n - 1)        # recursive step: one call with n - 1

  • Base case: n = 0 → constant work.

  • Recursive step: one call with n - 1. Total calls: n + 1 (for n, n - 1, …, 0), so time is O(n).
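The call count above can be checked by instrumenting the function with a counter (the `calls` variable is an illustrative addition, not part of the original code):

```python
calls = 0

def f(n):
    global calls
    calls += 1      # count every invocation, including the base case
    if n == 0:
        return      # base case: constant work
    f(n - 1)        # recursive step: one call with n - 1

f(10)
# f is invoked for n, n-1, ..., 0: n + 1 calls in total, i.e. O(n).
```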

For more complex recurrences (e.g. merge sort, T(n) = 2T(n/2) + O(n)), use the Master Theorem or expansion. See algorithms/divide-and-conquer.md.
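To make the recurrence concrete, here is a minimal merge sort sketch with each term of T(n) = 2T(n/2) + O(n) marked in comments (an illustrative implementation, not the course's code; the recurrence solves to O(n log n)):

```python
def merge_sort(a):
    """Sort a list. Work satisfies T(n) = 2T(n/2) + O(n) -> O(n log n)."""
    if len(a) <= 1:                 # base case: O(1)
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])      # T(n/2)
    right = merge_sort(a[mid:])     # T(n/2)
    # Merge step: O(n) comparisons and copies in total.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```

The two recursive calls give the 2T(n/2) term; the merge loop touches each element once, giving the + O(n) term.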

