Mathematical Foundations for Algorithm Analysis
Definition: Function f(n) is in order of growth O(g(n)) if \textcolor{green}{f(n)\text{'s order of growth}} \le \textcolor{green}{g(n)\text{'s order of growth}} (within a constant multiple).
\boxed{
\text{Order of Growth: } f(n) \le c g(n) \quad \forall \quad n \ge n_0
} \\~\\
\small
\textit{where $c$ is a positive constant, and} \\
\textit{$n_0$ is a non-negative integer}

Example: Order of growth, formally
- 10n \in O(n^2)
- 5n + 20 \in O(n)
f(n) \in O(f(n))
f(n) \in O(g(n)) \text{ iff } g(n) \in \Omega ( f(n))
\text{If $f(n) \in O(g(n))$ and $g(n) \in O(h(n))$,} \\ \text{then $f(n) \in O(h(n))$}
Layman’s: “If A \le B and B \le C, then A \le C”
\text{If $f_1(n) \in O(g_1(n))$ and $f_2(n) \in O(g_2(n))$, then} \\ \text{$f_1(n) + f_2(n) \in O(\max\{ g_1(n), g_2(n) \})$}
Layman’s: When adding functions, only the fastest-growing one matters.
- e.g., If you have an algorithm that does O(n^2) work and then O(n) work, the total is O(n^2) because the smaller term gets absorbed.
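The absorption of the smaller term can be seen by counting basic operations directly. A minimal sketch, using a hypothetical two-phase routine (the function and its loop bounds are illustrative, not from the notes):

```python
# Hypothetical two-phase routine: a quadratic pass followed by a linear pass.
# Counting its basic operations shows the O(n) term is absorbed by the O(n^2) term.
def two_phase_ops(n):
    ops = 0
    for i in range(n):          # quadratic phase: n * n operations
        for j in range(n):
            ops += 1
    for i in range(n):          # linear phase: n operations
        ops += 1
    return ops

# For n = 1000 this is 1_000_000 + 1_000 = 1_001_000,
# and ops / n^2 -> 1 as n grows, so the total is Theta(n^2).
print(two_phase_ops(1000))  # 1001000
```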
Alternatively, instead of finding c and n_0, we can use limits to find the order of growth.
\lim_{n \to \infty} \frac{T(n)}{g(n)} =
\begin{cases}
0&:
\text{$T(n)$ grows slower than $g(n)$} \\
c > 0&:
\text{$T(n)$ grows at the same rate as $g(n)$} \\
\infty&:
\text{$T(n)$ grows faster than $g(n)$}
\end{cases}

Example: Comparing Orders of Growth
Some fundamental rules to remember.
All logarithmic functions belong to the same class \Theta (\log n) no matter what the logarithm’s base is.
Why? The change-of-base formula ($\log_a n = \frac{ \log n }{ \log a }$) shows that logs of different bases differ only by the constant factor $\frac{1}{\log a}$.
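This constant-factor relationship is easy to verify: the ratio of two logs of the same n is independent of n. A quick check (bases 2 and 10 are arbitrary picks):

```python
import math

# log_2(n) / log_10(n) collapses to the constant log_2(10),
# no matter what n is -- so all log bases share the class Theta(log n).
for n in (10, 1000, 10**9):
    print(math.log(n, 2) / math.log(n, 10))  # ~3.3219 every time
```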
All polynomials of the same degree k belong to the same class:
a_k n^k + a_{k-1} n^{k-1} + ... + a_0 \in \Theta (n^k)
Exponential functions a^n have different orders of growth for different a’s.
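Contrasting with the log rule, the base of an exponential does change the class: $(2/3)^n \to 0$, so $2^n$ grows strictly slower than $3^n$. A small numeric check (bases 2 and 3 chosen for illustration):

```python
# Unlike logarithms, exponential bases matter:
# 2^n / 3^n = (2/3)^n shrinks toward 0 as n grows,
# so 2^n and 3^n are in different growth classes.
for n in (10, 50, 100):
    print(2**n / 3**n)  # rapidly approaching 0
```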
You should know the fundamental ordering of complexity classes.
Complexity | Name |
---|---|
\Theta( 1 ) | constant |
\Theta( \log n ) | logarithmic |
\Theta( n ) | linear |
\Theta( n \log n ) | n-\log-n or linearithmic |
\Theta( n^2 ) | quadratic |
\Theta( n^3 ) | cubic |
\Theta( 2^n ) | exponential |
\Theta( n! ) | factorial |
Notation Reference:
- l: Lower bound
- u: Upper bound
- m: Middle point (for splitting sums)
For simple loops that run $n$ times:
\sum_{l \le i \le u} 1 = 1 + 1 + ... + 1 = u - l + 1
for i \leftarrow 1 to n
do something constant
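The pseudocode above can be instrumented to count its iterations; the count matches $u - l + 1$ for any bounds (the helper name is a made-up example):

```python
# Count iterations of a loop running from l to u inclusive.
def count_iterations(l, u):
    ops = 0
    for i in range(l, u + 1):
        ops += 1                 # one constant-time body execution
    return ops

print(count_iterations(1, 10))   # 10, which is u - l + 1
```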
For nested loops where inner loop depends on outer loop.
\sum_{1 \le i \le n} i = 1 + 2 + ... + n = n(n+1)/2 \approx n^2 / 2 \in \Theta(n^2)

Example: Nested loop with dependency
for i \leftarrow 1 to n
for j \leftarrow 1 to i
do something constant
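Counting the constant-time body executions of this nested loop reproduces the closed form $n(n+1)/2$ exactly (the function name is illustrative):

```python
# Count operations of the dependent nested loop from the pseudocode.
def nested_ops(n):
    ops = 0
    for i in range(1, n + 1):
        for j in range(1, i + 1):   # inner bound depends on i
            ops += 1
    return ops

n = 100
print(nested_ops(n), n * (n + 1) // 2)  # both 5050
```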
For loops where the inner work grows quadratically with the outer variable.
\sum_{1 \le i \le n} i^2 = 1^2 + 2^2 + ... + n^2 = n(n+1)(2n+1)/6 \approx n^3 / 3 \in \Theta(n^3)

Example: Quadratic dependency
for i \leftarrow 1 to n
for j \leftarrow 1 to i^2
do something constant
For algorithms that double or halve their work at each step:
\sum_{0 \le i \le n} a^i = 1 + a + ... + a^n = (a^{n+1} - 1) / (a - 1) \text{ for any } a \ne 1
operations \leftarrow 1
for i \leftarrow 1 to n
do 'operations' amount of work
operations \leftarrow operations * 2
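In this pseudocode the work done across all iterations is $1 + 2 + ... + 2^{n-1} = 2^n - 1$, an instance of the geometric series with $a = 2$. A direct translation confirms it:

```python
# Total work of the doubling loop from the pseudocode.
def doubling_ops(n):
    total = 0
    operations = 1
    for i in range(1, n + 1):
        total += operations   # do 'operations' amount of work
        operations *= 2       # double for the next iteration
    return total

n = 10
print(doubling_ops(n), 2**n - 1)  # both 1023
```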
Assorted properties for simplifying complex expressions:
\sum ( a_i \pm b_i ) = \sum a_i \pm \sum b_i
\sum c a_i = c \sum a_i
\sum_{l \le i \le u} a_i = \sum_{l \le i \le m} a_i + \sum_{m + 1 \le i \le u} a_i
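These manipulation rules can be spot-checked on concrete sequences. A sketch — the choices $a_i = i$, $b_i = i^2$, $c = 3$, and split point $m = 50$ are arbitrary illustrations:

```python
# Spot-check the three summation rules on concrete sequences.
l, u, m, c = 1, 100, 50, 3
a = {i: i for i in range(l, u + 1)}        # a_i = i
b = {i: i * i for i in range(l, u + 1)}    # b_i = i^2

# Rule 1: sum of (a_i + b_i) equals sum a_i + sum b_i.
assert sum(a[i] + b[i] for i in range(l, u + 1)) == sum(a.values()) + sum(b.values())

# Rule 2: a constant factor pulls out of the sum.
assert sum(c * a[i] for i in range(l, u + 1)) == c * sum(a.values())

# Rule 3: a sum splits at any middle point m.
split = sum(a[i] for i in range(l, m + 1)) + sum(a[i] for i in range(m + 1, u + 1))
assert split == sum(a.values())

print("all three rules check out")
```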