Mathematical Foundations for Algorithm Analysis

Asymptotic Notation

Big O Definition

Definition: A function f(n) is in O(g(n)) if \textcolor{green}{f(n)\text{'s order of growth}} \le \textcolor{green}{g(n)\text{'s order of growth}} (within a constant multiple).

\boxed{ \text{Big O: } f(n) \le c\, g(n) \quad \forall\, n \ge n_0 } \\ \small \textit{where $c$ is a positive constant, and $n_0$ is a non-negative integer}

Example: Order of growth, formally

10n is O(n^2) (take c = 10 and n_0 = 1: 10n \le 10 n^2 for all n \ge 1)

5n+20 is O(n) (take c = 25 and n_0 = 1: 5n+20 \le 25n for all n \ge 1)
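These claims can be sanity-checked numerically. The sketch below tests f(n) \le c g(n) over a finite range using one possible choice of witnesses (c = 10, n_0 = 1 for the first claim; c = 25, n_0 = 1 for the second; the helper name `holds` is illustrative, not from the notes):

```python
# Check f(n) <= c * g(n) for every integer n in [n0, n_max].
# A finite scan is only evidence, not a proof, but it catches wrong witnesses.

def holds(f, g, c, n0, n_max=1000):
    """True if f(n) <= c*g(n) for all n in [n0, n_max]."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# 10n is O(n^2): witnesses c = 10, n0 = 1.
print(holds(lambda n: 10 * n, lambda n: n * n, c=10, n0=1))       # True

# 5n + 20 is O(n): witnesses c = 25, n0 = 1.
print(holds(lambda n: 5 * n + 20, lambda n: n, c=25, n0=1))       # True
```

Note that many witness pairs work: once f(n) \le c g(n) holds for some (c, n_0), any larger c or n_0 also works.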

Key Properties

1. Every function bounds itself

f(n) \in O(f(n))

2. Upper and lower bounds are flipsides

f(n) \in O(g(n)) \text{ iff } g(n) \in \Omega ( f(n))

3. Transitivity

\text{If $f(n) \in O(g(n))$ and $g(n) \in O(h(n))$,} \\ \text{then $f(n) \in O(h(n))$}

Layman’s: “If A \le B and B \le C, then A \le C.”

4. Addition Rule

\text{If $f_1(n) \in O(g_1(n))$ and $f_2(n) \in O(g_2(n))$, then} \\ \text{$f_1 (n) + f_2 (n) \in O(\max\{ g_1(n), g_2(n) \})$}

Layman’s: When adding functions, only the fastest-growing one matters.

Limit Method

Alternatively, instead of finding c and n_0, we can use limits to find the order of growth.

\lim_{n \to \infty} \frac{T(n)}{g(n)} = \begin{cases} 0 &: \text{$T(n)$ grows slower than $g(n)$} \\ c > 0 &: \text{$T(n)$ grows at the same rate as $g(n)$} \\ \infty &: \text{$T(n)$ grows faster than $g(n)$} \end{cases}
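The limit can be estimated numerically by evaluating the ratio T(n)/g(n) at increasingly large n. This is a heuristic sketch (the helper name `ratio_trend` is illustrative): a ratio drifting toward 0, settling at a positive constant, or blowing up suggests slower, same-rate, or faster growth respectively.

```python
# Evaluate T(n)/g(n) at a few large n to see where the ratio is heading.
# Numeric evidence only; the formal argument still needs the actual limit.

def ratio_trend(T, g, ns=(10, 100, 1_000, 10_000, 100_000)):
    return [T(n) / g(n) for n in ns]

print(ratio_trend(lambda n: 10 * n, lambda n: n * n))              # tends to 0
print(ratio_trend(lambda n: n * (n + 1) / 2, lambda n: n * n))     # tends to 1/2
```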

Example: Comparing Orders of Growth

Q: Which functions grow faster?

  1. 10n vs. n^2
  2. n(n+1)/2 vs. n^2

A:

  1. n^2 grows faster, since \lim_{n \to \infty} 10n / n^2 = 0
  2. They’re the same speed, since \lim_{n \to \infty} \frac{n(n+1)/2}{n^2} = \frac{1}{2} > 0

Standard Complexity Cases

Some fundamental rules to remember.

1. Logarithm Base Doesn’t Matter

All logarithmic functions belong to the same class \Theta (\log n) no matter what the logarithm’s base is.

Why?: Change of base formula. (\log_a n = \frac{ \log n }{ \log a})
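The change-of-base formula means the ratio between logarithms of two different bases is a constant, independent of n. A quick check with Python's standard `math` module:

```python
import math

# log_2(n) / log_10(n) equals the constant log(10)/log(2) for every n,
# which is why all logarithm bases fall in the same Theta(log n) class.
for n in (10, 1_000, 10**6):
    print(math.log(n, 2) / math.log(n, 10))   # always log2(10) ~= 3.3219
```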

2. Polynomial Degree Rules

All polynomials of the same degree k belong to the same class:

a_k n^k + a_{k-1} n^{k-1} + \dots + a_0 \in \Theta (n^k)

3. Exponential Bases Matter

Exponential functions a^n have different orders of growth for different bases a > 1; e.g. 3^n grows strictly faster than 2^n, since \lim_{n \to \infty} 2^n / 3^n = 0.

4. The Hierarchy

You should know the fundamental ordering of complexity classes.

Complexity            Name
\Theta( 1 )           constant
\Theta( \log n )      logarithmic
\Theta( n )           linear
\Theta( n \log n )    n-\log-n or linearithmic
\Theta( n^2 )         quadratic
\Theta( n^3 )         cubic
\Theta( 2^n )         exponential
\Theta( n! )          factorial

Essential Summation Formulas

Notation Reference: in the sums below, $i$ is the summation index, $l$ and $u$ are its lower and upper limits, and $c$ is a constant.

1. Counting

For simple loops that run n times.

\sum_{l \le i \le u} 1 = 1 + 1 + \dots + 1 = u - l + 1

Example: Simple loop
for i \leftarrow 1 to n
  do something constant
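The loop above can be instrumented to confirm the count; the sketch below (helper name `count_ops` is illustrative) tallies one operation per iteration and returns u - l + 1:

```python
# Count the constant-time operations the loop performs:
# i running from l to u gives exactly u - l + 1 iterations.
def count_ops(l, u):
    ops = 0
    for i in range(l, u + 1):
        ops += 1          # "do something constant"
    return ops

print(count_ops(1, 10))   # 10, i.e. u - l + 1
```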

2. Arithmetic Series

For nested loops where the inner loop depends on the outer loop.

\sum_{1 \le i \le n} i = 1 + 2 + \dots + n = n(n+1)/2 \approx n^2 / 2 \in \Theta(n^2)

Example: Nested loop with dependency
for i \leftarrow 1 to n
  for j \leftarrow 1 to i
      do something constant
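Instrumenting this nested loop shows the total operation count matching the arithmetic-series formula exactly (the helper name `nested_count` is illustrative):

```python
# The inner loop runs i times on the i-th outer iteration,
# so the total count is 1 + 2 + ... + n = n(n+1)/2.
def nested_count(n):
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):
            ops += 1      # "do something constant"
    return ops

n = 100
print(nested_count(n) == n * (n + 1) // 2)   # True
```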

3. Sum of Squares

For loops where the inner work grows quadratically with the outer variable.

\sum_{1 \le i \le n} i^2 = 1^2 + 2^2 + \dots + n^2 = n(n+1)(2n+1)/6 \approx n^3 / 3 \in \Theta(n^3)

Example: Quadratic dependency
for i \leftarrow 1 to n
  for j \leftarrow 1 to i^2
      do something constant
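The same instrumentation idea applies here; the inner loop runs i^2 times, so the total matches the sum-of-squares formula (helper name `quadratic_count` is illustrative):

```python
# The inner loop runs i^2 times on the i-th outer iteration,
# so the total count is 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6.
def quadratic_count(n):
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i * i + 1):
            ops += 1      # "do something constant"
    return ops

n = 50
print(quadratic_count(n) == n * (n + 1) * (2 * n + 1) // 6)   # True
```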

4. Geometric Series

For algorithms that double/halve the work at each step.

\sum_{0 \le i \le n} a^i = 1 + a + \dots + a^n = (a^{n+1} - 1) / (a - 1) \text{ for any } a \ne 1

Example: Exponentially growing work
operations \leftarrow 1
for i \leftarrow 1 to n
  do 'operations' amount of work
  operations \leftarrow operations * 2
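The loop above performs 1 + 2 + \dots + 2^{n-1} operations in total, which the geometric-series formula with a = 2 (summed over the loop's n terms) gives as 2^n - 1. A sketch of the count (helper name `doubling_count` is illustrative):

```python
# Work doubles each iteration: total operations are
# 1 + 2 + 4 + ... + 2^(n-1) = 2^n - 1 (geometric series, a = 2).
def doubling_count(n):
    ops, work = 0, 1
    for i in range(1, n + 1):
        ops += work       # do 'work' amount of work
        work *= 2
    return ops

n = 10
print(doubling_count(n) == 2 ** n - 1)   # True
```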

5. Summation Properties

Assorted properties for simplifying complex expressions.

\sum ( a_i \pm b_i ) = \sum a_i \pm \sum b_i

\sum c\, a_i = c \sum a_i

\sum_{l \le i \le u} a_i = \sum_{l \le i \le m} a_i + \sum_{m + 1 \le i \le u} a_i
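These three properties can be spot-checked on concrete sequences; the lists and constant below are arbitrary examples:

```python
# Spot-check the summation properties on small concrete data.
a = [1, 2, 3, 4, 5, 6]
b = [10, 20, 30, 40, 50, 60]
c = 7
m = 3

assert sum(x + y for x, y in zip(a, b)) == sum(a) + sum(b)   # sum splits over +/-
assert sum(c * x for x in a) == c * sum(a)                   # constants factor out
assert sum(a) == sum(a[:m]) + sum(a[m:])                     # range splits at m
print("all three properties hold")
```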