big-o

Intuition:

  • \(O\): \(\leq\) (function in the symbol is an upper bound)
  • \(\Theta\): \(=\) (function in the symbol is a tight bound)
  • \(\Omega\): \(\geq\) (function in the symbol is a lower bound)

Definition Intuition:

We say \(f\qty(n) = O\qty(g\qty(n))\) to mean “when \(n\) gets big enough, \(f\qty(n)\) is bounded above by at most a constant multiple of \(g\qty(n)\).”

Definitions:

  • \(f(n) = O(g(n)) \Leftrightarrow \exists c, n_{0} > 0: \forall n > n_0, f(n) \leq c \cdot g(n)\)
  • \(f(n) = \Omega(g(n)) \Leftrightarrow \exists c, n_{0} > 0: \forall n > n_0, f(n) \geq c \cdot g(n)\)
  • \(f(n) = \Theta(g(n)) \Leftrightarrow \exists c_{1}, c_{2}, n_{0} > 0: \forall n > n_0, c_{1} \cdot g(n) \leq f(n) \leq c_{2} \cdot g(n)\)
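The definitions above can be spot-checked numerically. The sketch below (an added illustration, not from the notes; the functions and witness constants \(c = 4\), \(n_0 = 10\) are my own choices) checks \(f(n) \leq c \cdot g(n)\) over a finite range of \(n\):

```python
# Numeric sanity check of the big-O definition: f(n) = O(g(n)) holds if we
# can exhibit witnesses c and n0 with f(n) <= c * g(n) for all n > n0.

def holds_big_o(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c * g(n) for all n0 < n <= n_max.
    A finite spot-check only -- the real definition quantifies over ALL n > n0."""
    return all(f(n) <= c * g(n) for n in range(n0 + 1, n_max + 1))

f = lambda n: 3 * n**2 + 10 * n          # f(n) = 3n^2 + 10n
g = lambda n: n**2                       # g(n) = n^2

# With witnesses c = 4 and n0 = 10: 3n^2 + 10n <= 4n^2 whenever n >= 10,
# since that inequality rearranges to 10n <= n^2, i.e. n >= 10.
print(holds_big_o(f, g, c=4, n0=10))     # True
```

Note this can confirm a candidate pair \((c, n_0)\) fails, but passing the check is evidence, not a proof.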

Little ones:

  • \(o\): \(<\)
  • \(\omega\): \(>\)
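One way to see the strict inequality of the little ones: \(f = o(g)\) exactly when the ratio \(f(n)/g(n)\) shrinks to \(0\) as \(n\) grows. A quick numeric illustration (an added sketch, with \(f(n) = n\) and \(g(n) = n^2\) as my example choice):

```python
# little-o intuition: f = o(g) means f(n)/g(n) -> 0 as n -> infinity,
# i.e. g strictly dominates f. Here f(n) = n and g(n) = n^2.
ratios = [n / n**2 for n in (10, 100, 1000, 10_000)]
print(ratios)  # [0.1, 0.01, 0.001, 0.0001] -- shrinking toward 0, so n = o(n^2)
```

By the same ratio test run the other way, \(n^2 = \omega(n)\): the ratio \(n^2/n\) grows without bound.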

pros and cons

pros

  • abstracts away implementation-specific issues
  • makes analysis tractable
  • allows us to meaningfully compare how algorithms will perform on large inputs

cons

  • only makes sense when \(n\) is large
  • hides constant factors, which can matter in practice

exercise

\(n^{2}\) is not \(O\qty(n)\)

Suppose for contradiction that \(n^{2} = O\qty(n)\). Then there are some positive \(c, n_0\) such that \(\forall n \geq n_0\), \(n^{2} \leq cn\).

Let’s divide both sides by \(n\), so we get \(\forall n \geq n_0, n \leq c\). But that can’t be true: \(n \leq c\) fails for \(n = n_0 + c + 1\), which is both \(\geq n_0\) and \(> c\).

This is a contradiction, so \(n^{2} \neq O\qty(n)\).
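The contradiction can also be seen numerically: pick any fixed \(c\) you like, and the inequality \(n^{2} \leq cn\) breaks as soon as \(n > c\). A minimal check (an added sketch; \(c = 1000\) is an arbitrary example value):

```python
# Companion to the proof: n^2 <= c*n rearranges to n <= c, so for any
# fixed witness c the inequality must fail once n exceeds c.
c = 1000
n = c + 1
print(n**2 <= c * n)  # False: 1001^2 = 1002001 > 1001000 = 1000 * 1001
```

No choice of \(c\) escapes this, which is exactly what the proof shows.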