Time Complexity

  • Also called: algorithm complexity analysis, asymptotic behavior, worst-case analysis
  • Profilers can tell you how long an algorithm takes on a particular input; complexity analysis describes how that time grows as n grows
  • 2n ∈ Θ(n), which is pronounced as “two n is theta of n”
  • Θ(1) is constant time
    • Any fixed sequence of operations with no loop over the input
    • It will take the same amount of time no matter how big n is (sketch after this list)
  • Θ(n) is linear
    • Pronounced “theta of n”
    • One loop that touches each of the n elements once (sketch after this list)
  • Θ(n²) is quadratic
    • Two nested loops over the input (sketch after this list)
    • O(n²) is pronounced “big oh of n squared”
  • Θ(log(n)) is logarithmic
    • Binary search is the canonical example (sketch after this list)
    • Notice that in the i-th iteration, our array has n / 2ⁱ elements. This is because in every iteration we cut the array in half, meaning we divide its number of elements by two, which multiplies the denominator by 2. If we do that i times, we get n / 2ⁱ.
  • Θ(n!) is factorial
    • Traveling salesman, solved by brute force over every ordering of the cities (sketch after this list)
    • 5! = 120
  • When looking at an Ω (omega), the bar is at the bottom, so it is an (asymptotic) lower bound.
    • Big omega means your algorithm will execute in no fewer steps than the given expression, e.g. n²
  • When looking at a Θ (theta), the bar is in the middle, so it is an (asymptotic) tight bound.
    • When both bounds hold for the same expression (the O and Ω bounds coincide), you can use big theta notation. This is the most precise of the three (formal definitions are sketched after this list)
  • When handwriting an O, you usually finish at the top and draw a squiggle, so O(n) is the upper bound of the function.
    • Big O means your algorithm will execute in no more steps than the given expression, e.g. n²
  • Little o and little ω (omega) are used when the bounds are not tight
  • Rough rule of thumb: a computer can touch everything in its memory in about a second
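
A minimal constant-time sketch (the function name and setup are illustrative, not from the notes):

```python
# Θ(1): constant time -- the amount of work does not depend on input size.
def get_first(items):
    # A single index operation, no matter how large items is.
    return items[0]
```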
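
A minimal linear-time sketch, one pass over the input:

```python
# Θ(n): linear time -- the loop body runs once per element.
def total(items):
    s = 0
    for x in items:  # n iterations for n elements
        s += x
    return s
```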
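
A quadratic sketch with two nested loops; checking all pairs for a duplicate is one common shape. In the worst case (no duplicate present) it does all of the roughly n²/2 comparisons:

```python
# Θ(n²) worst case: the nested loops compare every pair of elements.
def has_duplicate(items):
    n = len(items)
    for i in range(n):
        for j in range(i + 1, n):  # roughly n²/2 pairs in total
            if items[i] == items[j]:
                return True
    return False
```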
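
A binary search sketch showing the halving argument from the list above: after i iterations only n / 2ⁱ candidates remain, so the loop runs about log₂(n) times:

```python
# Θ(log n): each iteration discards half of the remaining range.
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1  # not present
```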
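
A brute-force traveling salesman sketch (the distance-matrix representation is an assumption for illustration): fixing the start city and trying every ordering of the rest checks (n − 1)! tours, which grows factorially:

```python
from itertools import permutations

# Θ(n!)-style growth: one candidate tour per permutation of the cities.
def shortest_tour(dist):
    # dist[i][j] is the distance from city i to city j (assumed square matrix).
    cities = range(1, len(dist))        # fix city 0 as the start
    best = None
    for order in permutations(cities):  # (n - 1)! orderings
        tour = (0, *order, 0)
        length = sum(dist[a][b] for a, b in zip(tour, tour[1:]))
        if best is None or length < best:
            best = length
    return best
```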
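
The handwriting mnemonics above match the standard textbook definitions of the three bounds; stated compactly (standard form, not taken from these notes):

```latex
f(n) \in O(g(n))      \iff \exists\, c > 0,\ n_0 : f(n) \le c \cdot g(n) \ \text{for all}\ n \ge n_0 \\
f(n) \in \Omega(g(n)) \iff \exists\, c > 0,\ n_0 : f(n) \ge c \cdot g(n) \ \text{for all}\ n \ge n_0 \\
f(n) \in \Theta(g(n)) \iff f(n) \in O(g(n)) \ \text{and}\ f(n) \in \Omega(g(n))
```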

Logarithm

Written without a base, log conventionally means log₁₀, but here we will default to base 2

10² = 100 ⟺ log₁₀ 100 = 2

lg is shorthand for log₂
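
A quick check of these conventions with Python's standard math module:

```python
import math

print(math.log10(100))  # 2.0, since 10² = 100
print(math.log2(8))     # 3.0, i.e. lg 8 = 3, since 2³ = 8
# lg(n) also counts how many times n can be halved before reaching 1,
# which is why it shows up in binary search:
print(math.log2(1024))  # 10.0
```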