Definition: For sufficiently large n, Big-O notation gives an upper bound on an algorithm's order of growth.

  • Describes the order of growth of running time as the input size increases.
  • We only care about the most significant (dominant) term of the complexity; see the sketch after this list.
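
A minimal sketch of the dominant-term point (the polynomial 3n^2 + 5n + 7 is just an illustrative cost function, not taken from these notes): for large n the 3n^2 term overwhelms the rest, which is why the whole expression is O(n^2).

    def f(n):
        # Hypothetical cost function: 3n^2 + 5n + 7
        return 3 * n**2 + 5 * n + 7

    for n in (10, 1_000, 1_000_000):
        dominant = 3 * n**2
        # The ratio approaches 1 as n grows: the n^2 term dominates.
        print(n, dominant / f(n))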

Benchmarks vs Big-O notation: why Big-O isn't always useful

Complexity analysis can be very useful, but it has problems too. Benchmarks, on the other hand, take practical underlying factors into account, e.g. the effect of paging as virtual memory usage grows, or the actual data structure implementation.
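
A minimal benchmark sketch in Python (the two list-building functions are hypothetical examples): both are O(n), yet timing them exposes the constant-factor and implementation differences that Big-O analysis ignores.

    import timeit

    def squares_loop():
        # Build a list of squares with an explicit loop.
        result = []
        for i in range(1000):
            result.append(i * i)
        return result

    def squares_comprehension():
        # Same result via a list comprehension; same O(n), different constants.
        return [i * i for i in range(1000)]

    print("loop:         ", timeit.timeit(squares_loop, number=2000))
    print("comprehension:", timeit.timeit(squares_comprehension, number=2000))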

  • Hard to analyze: some algorithms are difficult to analyze precisely.
  • Average case unknown: the average-case behaviour may not be known.
  • Unknown constants: Big-O analysis only tells you how running time grows with the size of the problem, not the actual time.
  • Small data sets: for small n, asymptotic efficiency may not matter. e.g. when sorting on Flash ROM it is better to use selection sort, since it minimizes swaps (see the sketch after this list).
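
A minimal sketch of the selection-sort point above (the example data is made up): selection sort makes O(n^2) comparisons but at most n - 1 swaps, which is what matters when each write is expensive, as on Flash ROM.

    def selection_sort(items):
        # Sorts in place and returns the number of swaps performed.
        swaps = 0
        n = len(items)
        for i in range(n):
            # Find the index of the smallest remaining element.
            smallest = i
            for j in range(i + 1, n):
                if items[j] < items[smallest]:
                    smallest = j
            if smallest != i:
                items[i], items[smallest] = items[smallest], items[i]
                swaps += 1
        return swaps

    data = [5, 2, 4, 1, 3]
    print(selection_sort(data), data)  # at most len(data) - 1 swaps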

Big-O Complexity Chart
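
The chart itself is at the bigocheatsheet.com link under Resources; as a rough stand-in, a small Python sketch of how the common classes grow (operation counts only, not wall-clock time):

    import math

    # Compare log n, n log n, and n^2 for a few input sizes.
    for n in (10, 100, 1_000, 10_000):
        print(f"n={n:>6}  log n={math.log2(n):6.1f}  "
              f"n log n={n * math.log2(n):10.0f}  n^2={n**2:>12}")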


Resources -

http://stackoverflow.com/questions/487258/plain-english-explanation-of-big-o
http://bigocheatsheet.com/