Algorithm Complexity Cheat Sheet



Sep 12, 2020

When measuring the efficiency of an algorithm, we usually consider time and space complexity. In this article, we look at these measures for some common sorting algorithms and data structures, and at the growth rates of their operations.

Big-O Complexity Chart

First, we consider the growth rates of some familiar operations. Plotting them makes it easy to visualize the difference between an O(1) algorithm and an O(n²) one: as the input grows larger and larger, the cost of a constant-time operation stays flat, a linear operation grows as a straight line, and the rest grow quadratically, exponentially, or factorially.
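To make the contrast concrete, here is a small Java sketch of our own (the class and method names are illustrative): one O(1) operation next to an O(n²) one. Doubling the input size leaves the first untouched and roughly quadruples the work of the second.

```java
public class GrowthDemo {
    // O(1): a single array access, independent of the array's length.
    static int first(int[] a) {
        return a[0];
    }

    // O(n^2): visit every pair of indices, so doubling n quadruples the work.
    static long countPairs(int[] a) {
        long pairs = 0;
        for (int i = 0; i < a.length; i++)
            for (int j = i + 1; j < a.length; j++)
                pairs++;                        // one unit of work per pair
        return pairs;
    }

    public static void main(String[] args) {
        int[] a = new int[1000];
        System.out.println(first(a));           // constant work
        System.out.println(countPairs(a));      // n(n-1)/2 = 499500 pair visits
    }
}
```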

Sorting Algorithms

To compare algorithms fairly, we can measure the resources each one uses: how much time it needs to complete, how much memory it requires, and how many operations it must perform to solve the problem:
  • Time efficiency: a measure of the amount of time an algorithm takes to solve a problem.
  • Space efficiency: a measure of the amount of memory an algorithm needs to solve a problem.
  • Complexity theory: a study of algorithm performance based on cost functions of statement counts.
| Sorting Algorithm | Space Complexity (worst) | Time (best) | Time (average) | Time (worst) |
|---|---|---|---|---|
| Bubble Sort | O(1) | O(n) | O(n²) | O(n²) |
| Heapsort | O(1) | O(n log n) | O(n log n) | O(n log n) |
| Insertion Sort | O(1) | O(n) | O(n²) | O(n²) |
| Mergesort | O(n) | O(n log n) | O(n log n) | O(n log n) |
| Quicksort | O(log n) | O(n log n) | O(n log n) | O(n²) |
| Selection Sort | O(1) | O(n²) | O(n²) | O(n²) |
| Shellsort | O(1) | O(n) | O(n (log n)²) | O(n (log n)²) |
| Smoothsort | O(1) | O(n) | O(n log n) | O(n log n) |
| Tree Sort | O(n) | O(n log n) | O(n log n) | O(n²) |
| Counting Sort | O(k) | O(n + k) | O(n + k) | O(n + k) |
| Cubesort | O(n) | O(n) | O(n log n) | O(n log n) |
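As a concrete instance of one of the quadratic sorts above, here is a minimal insertion sort sketch of our own; note the O(1) extra space and the O(n) best case on already-sorted input.

```java
import java.util.Arrays;

public class InsertionSort {
    // Insertion sort: O(1) extra space; O(n) best case (sorted input),
    // O(n^2) average and worst case, matching the table above.
    static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            int j = i - 1;
            // Shift larger elements one slot right until key's position is found.
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 4, 6, 1, 3};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 5, 6]
    }
}
```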

Data Structure Operations

Next, we compare some popular data structures, such as the array, binary tree, and linked list, on three operations: search, insert, and delete.

| Data Structure | Search (avg) | Insert (avg) | Delete (avg) | Search (worst) | Insert (worst) | Delete (worst) |
|---|---|---|---|---|---|---|
| Array | O(n) | N/A | N/A | O(n) | N/A | N/A |
| AVL Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) |
| B-Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) |
| Binary Search Tree | O(log n) | O(log n) | O(log n) | O(n) | O(n) | O(n) |
| Doubly Linked List | O(n) | O(1) | O(1) | O(n) | O(1) | O(1) |
| Hash table | O(1) | O(1) | O(1) | O(n) | O(n) | O(n) |
| Linked List | O(n) | O(1) | O(1) | O(n) | O(1) | O(1) |
| Red-Black Tree | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) | O(log n) |
| Sorted Array | O(log n) | O(n) | O(n) | O(log n) | O(n) | O(n) |
| Stack | O(n) | O(1) | O(1) | O(n) | O(1) | O(1) |
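The O(log n) search entries for sorted arrays come from binary search, which halves the interval under consideration on every probe. A minimal sketch of our own:

```java
public class BinarySearch {
    // Binary search in a sorted array: O(log n), matching the
    // Sorted Array row above. Returns the index of key, or -1.
    static int indexOf(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // avoids overflow of (lo + hi)
            if      (key < a[mid]) hi = mid - 1;
            else if (key > a[mid]) lo = mid + 1;
            else return mid;
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {10, 20, 30, 40, 50};
        System.out.println(indexOf(a, 40)); // 3
        System.out.println(indexOf(a, 25)); // -1
    }
}
```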

Growth of Functions

The order of growth of the running time of an algorithm gives a simple characterization of the algorithm’s efficiency and also allows us to compare the relative performance of alternative algorithms.

The table below lists several common cost functions f(n) and, for each input size n, the approximate time needed to perform f(n) operations, assuming one operation per nanosecond (about 10⁹ simple operations per second). Blank entries are too large to be meaningful.

| n | log n | n | n log n | n² | 2ⁿ | n! |
|---|---|---|---|---|---|---|
| 10 | 0.003 µs | 0.01 µs | 0.033 µs | 0.1 µs | 1 µs | 3.63 ms |
| 20 | 0.004 µs | 0.02 µs | 0.086 µs | 0.4 µs | 1 ms | 77 years |
| 30 | 0.005 µs | 0.03 µs | 0.147 µs | 0.9 µs | 1 sec | 8.4×10¹⁵ years |
| 40 | 0.005 µs | 0.04 µs | 0.213 µs | 1.6 µs | 18.3 min | |
| 50 | 0.006 µs | 0.05 µs | 0.282 µs | 2.5 µs | 13 days | |
| 100 | 0.007 µs | 0.10 µs | 0.644 µs | 10 µs | 4×10¹³ years | |
| 1,000 | 0.010 µs | 1.00 µs | 9.966 µs | 1 ms | | |
| 10,000 | 0.013 µs | 10 µs | 130 µs | 100 ms | | |
| 100,000 | 0.017 µs | 0.10 ms | 1.67 ms | 10 sec | | |
| 1,000,000 | 0.020 µs | 1 ms | 19.93 ms | 16.7 min | | |
| 10,000,000 | 0.023 µs | 0.01 sec | 0.23 sec | 1.16 days | | |
| 100,000,000 | 0.027 µs | 0.10 sec | 2.66 sec | 115.7 days | | |
| 1,000,000,000 | 0.030 µs | 1 sec | 29.90 sec | 31.7 years | | |
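To see where entries such as 16.7 minutes come from, here is the arithmetic for n = 1,000,000 at one operation per nanosecond:

$$n^2 = (10^6)^2 = 10^{12} \text{ operations} \;\Rightarrow\; 10^{12}\,\text{ns} = 1000\,\text{s} \approx 16.7\,\text{min}, \qquad n \lg n \approx 10^6 \times 19.93 \;\Rightarrow\; \approx 19.93\,\text{ms}.$$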


We summarize the performance characteristics of classic algorithms and data structures for sorting, priority queues, symbol tables, and graph processing.

We also summarize some of the mathematics useful in the analysis of algorithms, including commonly encountered functions; useful formulas and approximations; properties of logarithms; asymptotic notations; and solutions to divide-and-conquer recurrences.



Sorting.

The table below summarizes the number of compares for a variety of sorting algorithms, as implemented in this textbook. It includes leading constants but ignores lower-order terms.
| ALGORITHM | CODE | IN PLACE | STABLE | BEST | AVERAGE | WORST | REMARKS |
|---|---|---|---|---|---|---|---|
| selection sort | Selection.java | ✓ | | ½ n² | ½ n² | ½ n² | n exchanges; quadratic in best case |
| insertion sort | Insertion.java | ✓ | ✓ | n | ¼ n² | ½ n² | use for small or partially-sorted arrays |
| bubble sort | Bubble.java | ✓ | ✓ | n | ½ n² | ½ n² | rarely useful; use insertion sort instead |
| shellsort | Shell.java | ✓ | | n log₃ n | unknown | c n^(3/2) | tight code; subquadratic |
| mergesort | Merge.java | | ✓ | ½ n lg n | n lg n | n lg n | n log n guarantee; stable |
| quicksort | Quick.java | ✓ | | n lg n | 2 n ln n | ½ n² | n log n probabilistic guarantee; fastest in practice |
| heapsort | Heap.java | ✓ | | n † | 2 n lg n | 2 n lg n | n log n guarantee; in place |

† n lg n if all keys are distinct.
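For day-to-day sorting in Java, the library sort is usually the right call. As of recent JDKs, java.util.Arrays.sort uses a dual-pivot quicksort for primitive arrays (fast, in place, not stable) and TimSort, a stable mergesort variant, for object arrays, mirroring the quicksort/mergesort trade-off in the table. A small usage sketch:

```java
import java.util.Arrays;

public class LibrarySort {
    public static void main(String[] args) {
        int[] primitives = {3, 1, 2};
        Arrays.sort(primitives);    // dual-pivot quicksort (not stable)

        Integer[] objects = {3, 1, 2};
        Arrays.sort(objects);       // TimSort, a stable mergesort variant
        System.out.println(Arrays.toString(primitives) + " "
                           + Arrays.toString(objects));
    }
}
```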



Priority queues.

The table below summarizes the order of growth of the running time of operations for a variety of priority queues, as implemented in this textbook. It ignores leading constants and lower-order terms. Except as noted, all running times are worst-case running times.
| DATA STRUCTURE | CODE | INSERT | DEL-MIN | MIN | DEC-KEY | DELETE | MERGE |
|---|---|---|---|---|---|---|---|
| array | BruteIndexMinPQ.java | 1 | n | n | 1 | 1 | n |
| binary heap | IndexMinPQ.java | log n | log n | 1 | log n | log n | n |
| d-way heap | IndexMultiwayMinPQ.java | log_d n | d log_d n | 1 | log_d n | d log_d n | n |
| binomial heap | IndexBinomialMinPQ.java | 1 | log n | 1 | log n | log n | log n |
| Fibonacci heap | IndexFibonacciMinPQ.java | 1 | log n † | 1 | 1 † | log n † | 1 |

† amortized guarantee.
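The binary heap bounds come from the swim (bottom-up) and sink (top-down) repairs on a complete tree of height about lg n. Below is a minimal min-heap sketch of our own for int keys; the textbook's IndexMinPQ adds index tracking and decrease-key on top of the same two moves.

```java
public class MinHeap {
    private final int[] pq;   // heap-ordered: pq[k/2] <= pq[k]
    private int n = 0;        // number of items

    MinHeap(int capacity) { pq = new int[capacity + 1]; }  // pq[0] unused

    // Insert: add at the end, then swim up. O(log n).
    void insert(int x) {
        pq[++n] = x;
        for (int k = n; k > 1 && pq[k / 2] > pq[k]; k /= 2) swap(k, k / 2);
    }

    // Delete-min: remove the root, move the last item up, sink it down. O(log n).
    int delMin() {
        int min = pq[1];
        pq[1] = pq[n--];
        int k = 1;
        while (2 * k <= n) {
            int j = 2 * k;                        // left child
            if (j < n && pq[j + 1] < pq[j]) j++;  // pick the smaller child
            if (pq[k] <= pq[j]) break;
            swap(k, j);
            k = j;
        }
        return min;
    }

    private void swap(int i, int j) { int t = pq[i]; pq[i] = pq[j]; pq[j] = t; }

    public static void main(String[] args) {
        MinHeap h = new MinHeap(10);
        h.insert(5); h.insert(1); h.insert(3);
        System.out.println(h.delMin()); // 1
        System.out.println(h.delMin()); // 3
    }
}
```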


Symbol tables.

The table below summarizes the order of growth of the running time of operations for a variety of symbol tables, as implemented in this textbook. It ignores leading constants and lower-order terms.
| DATA STRUCTURE | CODE | SEARCH (worst) | INSERT (worst) | DELETE (worst) | SEARCH (avg) | INSERT (avg) | DELETE (avg) |
|---|---|---|---|---|---|---|---|
| sequential search (in an unordered list) | SequentialSearchST.java | n | n | n | n | n | n |
| binary search (in a sorted array) | BinarySearchST.java | log n | n | n | log n | n | n |
| binary search tree (unbalanced) | BST.java | n | n | n | log n | log n | sqrt(n) |
| red-black BST (left-leaning) | RedBlackBST.java | log n | log n | log n | log n | log n | log n |
| AVL tree | AVLTreeST.java | log n | log n | log n | log n | log n | log n |
| hash table (separate-chaining) | SeparateChainingHashST.java | n | n | n | 1 † | 1 † | 1 † |
| hash table (linear-probing) | LinearProbingHashST.java | n | n | n | 1 † | 1 † | 1 † |

† uniform hashing assumption.
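The same trade-offs are available in the JDK: java.util.TreeMap is a red-black BST with guaranteed O(log n) operations and ordered iteration, while java.util.HashMap offers O(1) average-case operations under good hashing but no ordering. A small usage sketch:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

public class SymbolTableDemo {
    public static void main(String[] args) {
        // Red-black BST: search/insert/delete in O(log n); keys stay sorted.
        Map<String, Integer> ordered = new TreeMap<>();
        // Hash table: O(1) on average (uniform hashing assumption); unordered.
        Map<String, Integer> hashed = new HashMap<>();

        for (String k : new String[] {"b", "a", "c"}) {
            ordered.put(k, k.length());
            hashed.put(k, k.length());
        }
        System.out.println(ordered.keySet()); // [a, b, c] -- sorted order
        System.out.println(hashed.get("a"));  // constant-time lookup on average
    }
}
```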


Graph processing.

The table below summarizes the order of growth of the worst-case running time and memory usage (beyond the memory for the graph itself) for a variety of graph-processing problems, as implemented in this textbook. It ignores leading constants and lower-order terms. All running times are worst-case running times.



| PROBLEM | ALGORITHM | CODE | TIME | SPACE |
|---|---|---|---|---|
| path | DFS | DepthFirstPaths.java | E + V | V |
| shortest path (fewest edges) | BFS | BreadthFirstPaths.java | E + V | V |
| cycle | DFS | Cycle.java | E + V | V |
| directed path | DFS | DepthFirstDirectedPaths.java | E + V | V |
| shortest directed path (fewest edges) | BFS | BreadthFirstDirectedPaths.java | E + V | V |
| directed cycle | DFS | DirectedCycle.java | E + V | V |
| topological sort | DFS | Topological.java | E + V | V |
| bipartiteness / odd cycle | DFS | Bipartite.java | E + V | V |
| connected components | DFS | CC.java | E + V | V |
| strong components | Kosaraju–Sharir | KosarajuSharirSCC.java | E + V | V |
| strong components | Tarjan | TarjanSCC.java | E + V | V |
| strong components | Gabow | GabowSCC.java | E + V | V |
| Eulerian cycle | DFS | EulerianCycle.java | E + V | E + V |
| directed Eulerian cycle | DFS | DirectedEulerianCycle.java | E + V | V |
| transitive closure | DFS | TransitiveClosure.java | V (E + V) | V² |
| minimum spanning tree | Kruskal | KruskalMST.java | E log E | E + V |
| minimum spanning tree | Prim | PrimMST.java | E log V | V |
| minimum spanning tree | Boruvka | BoruvkaMST.java | E log V | V |
| shortest paths (nonnegative weights) | Dijkstra | DijkstraSP.java | E log V | V |
| shortest paths (no negative cycles) | Bellman–Ford | BellmanFordSP.java | V (V + E) | V |
| shortest paths (no cycles) | topological sort | AcyclicSP.java | V + E | V |
| all-pairs shortest paths | Floyd–Warshall | FloydWarshall.java | V³ | V² |
| maxflow–mincut | Ford–Fulkerson | FordFulkerson.java | E V (E + V) | V |
| bipartite matching | Hopcroft–Karp | HopcroftKarp.java | V^(1/2) (E + V) | V |
| assignment problem | successive shortest paths | AssignmentProblem.java | n³ log n | n² |
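Most of the E + V entries rest on one fact: depth-first search visits each vertex once and examines each edge a constant number of times. A minimal sketch of our own, for an undirected graph in adjacency-list form:

```java
import java.util.ArrayList;
import java.util.List;

public class DepthFirstSearch {
    private final List<List<Integer>> adj;  // adjacency lists
    private final boolean[] marked;         // marked[v] = reachable from source?

    DepthFirstSearch(int vertices) {
        adj = new ArrayList<>();
        for (int v = 0; v < vertices; v++) adj.add(new ArrayList<>());
        marked = new boolean[vertices];
    }

    void addEdge(int v, int w) { adj.get(v).add(w); adj.get(w).add(v); }

    // Each vertex is marked once and each edge is scanned a constant
    // number of times, giving the E + V bound in the table above.
    void dfs(int v) {
        marked[v] = true;
        for (int w : adj.get(v))
            if (!marked[w]) dfs(w);
    }

    public static void main(String[] args) {
        DepthFirstSearch g = new DepthFirstSearch(4);
        g.addEdge(0, 1); g.addEdge(1, 2);   // vertex 3 stays unreachable
        g.dfs(0);
        System.out.println(g.marked[2] + " " + g.marked[3]); // true false
    }
}
```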


Commonly encountered functions.

Here are some functions that are commonly encountered when analyzing algorithms.

| FUNCTION | NOTATION | DEFINITION |
|---|---|---|
| floor | \(\lfloor x \rfloor\) | greatest integer \(\le x\) |
| ceiling | \(\lceil x \rceil\) | smallest integer \(\ge x\) |
| binary logarithm | \(\lg x\) or \(\log_2 x\) | \(y\) such that \(2^{\,y} = x\) |
| natural logarithm | \(\ln x\) or \(\log_e x\) | \(y\) such that \(e^{\,y} = x\) |
| common logarithm | \(\log_{10} x\) | \(y\) such that \(10^{\,y} = x\) |
| iterated binary logarithm | \(\lg^* x\) | \(0\) if \(x \le 1\); \(1 + \lg^*(\lg x)\) otherwise |
| harmonic number | \(H_n\) | \(1 + 1/2 + 1/3 + \ldots + 1/n\) |
| factorial | \(n!\) | \(1 \times 2 \times 3 \times \ldots \times n\) |
| binomial coefficient | \(\binom{n}{k}\) | \(\frac{n!}{k!\,(n-k)!}\) |
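A few of these functions in executable form, as a small helper class of our own (lg is computed with the change-of-base rule listed under properties of logarithms below):

```java
public class Functions {
    // Binary logarithm via change of base: lg x = ln x / ln 2.
    static double lg(double x) { return Math.log(x) / Math.log(2); }

    // Harmonic number H_n = 1 + 1/2 + ... + 1/n.
    static double harmonic(int n) {
        double h = 0;
        for (int i = 1; i <= n; i++) h += 1.0 / i;
        return h;
    }

    // Iterated binary logarithm: 0 if x <= 1, else 1 + lg*(lg x).
    static int lgStar(double x) { return x <= 1 ? 0 : 1 + lgStar(lg(x)); }

    public static void main(String[] args) {
        System.out.println(lg(1024));      // 10.0
        System.out.println(harmonic(10));  // 2.9289682539682538
        System.out.println(lgStar(65536)); // 4, since lg lg lg lg 65536 = 1
    }
}
```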


Useful formulas and approximations.

Here are some useful formulas for approximations that are widely used in the analysis of algorithms.
  • Harmonic sum: \(1 + 1/2 + 1/3 + \ldots + 1/n \sim \ln n\)
  • Triangular sum: \(1 + 2 + 3 + \ldots + n = n(n+1)/2 \sim n^2/2\)
  • Sum of squares: \(1^2 + 2^2 + 3^2 + \ldots + n^2 \sim n^3/3\)
  • Geometric sum: If \(r \neq 1\), then \(1 + r + r^2 + r^3 + \ldots + r^n = (r^{n+1} - 1)/(r - 1)\)
    • \(r = 1/2\): \(1 + 1/2 + 1/4 + 1/8 + \ldots + 1/2^n \sim 2\)
    • \(r = 2\): \(1 + 2 + 4 + 8 + \ldots + n/2 + n = 2n - 1 \sim 2n\), when \(n\) is a power of 2
  • Stirling's approximation: \(\lg(n!) = \lg 1 + \lg 2 + \lg 3 + \ldots + \lg n \sim n \lg n\)
  • Exponential: \((1 + 1/n)^n \sim e\); \((1 - 1/n)^n \sim 1/e\)
  • Binomial coefficients: \(\binom{n}{k} \sim n^k / k!\) when \(k\) is a small constant
  • Approximate sum by integral: If \(f(x)\) is a monotonically increasing function, then \(\displaystyle \int_0^n f(x)\,dx \;\le\; \sum_{i=1}^n f(i) \;\le\; \int_1^{n+1} f(x)\,dx\) (see the worked instance after this list)
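As a worked instance of the sum-by-integral bound, take \(f(x) = x\); the triangular sum is squeezed between the two integrals:

$$\int_0^n x\,dx = \frac{n^2}{2} \;\le\; \sum_{i=1}^n i = \frac{n(n+1)}{2} \;\le\; \int_1^{n+1} x\,dx = \frac{n(n+2)}{2}.$$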


Properties of logarithms.

  • Definition: \(\log_b a = c\) means \(b^c = a\). We refer to \(b\) as the base of the logarithm.
  • Special cases: \(\log_b b = 1\), \(\log_b 1 = 0\)
  • Inverse of exponential: \(b^{\log_b x} = x\)
  • Product: \(\log_b (x \times y) = \log_b x + \log_b y\)
  • Division: \(\log_b (x \div y) = \log_b x - \log_b y\)
  • Finite product: \(\log_b (x_1 \times x_2 \times \ldots \times x_n) = \log_b x_1 + \log_b x_2 + \ldots + \log_b x_n\)
  • Changing bases: \(\log_b x = \log_c x \,/\, \log_c b\)
  • Rearranging exponents: \(x^{\log_b y} = y^{\log_b x}\)
  • Exponentiation: \(\log_b (x^y) = y \log_b x\)
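For example, combining the product rule and the changing-bases rule:

$$\log_2(8 \times 16) = \log_2 8 + \log_2 16 = 3 + 4 = 7, \qquad \log_2 1000 = \frac{\log_{10} 1000}{\log_{10} 2} = \frac{3}{0.301\ldots} \approx 9.97.$$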



Asymptotic notations: definitions.

| NAME | NOTATION | DESCRIPTION | DEFINITION |
|---|---|---|---|
| Tilde | \(f(n) \sim g(n)\) | \(f(n)\) is equal to \(g(n)\) asymptotically (including constant factors) | \(\displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = 1\) |
| Big Oh | \(f(n)\) is \(O(g(n))\) | \(f(n)\) is bounded above by \(g(n)\) asymptotically (ignoring constant factors) | there exist constants \(c > 0\) and \(n_0 \ge 0\) such that \(0 \le f(n) \le c \cdot g(n)\) for all \(n \ge n_0\) |
| Big Omega | \(f(n)\) is \(\Omega(g(n))\) | \(f(n)\) is bounded below by \(g(n)\) asymptotically (ignoring constant factors) | \(g(n)\) is \(O(f(n))\) |
| Big Theta | \(f(n)\) is \(\Theta(g(n))\) | \(f(n)\) is bounded above and below by \(g(n)\) asymptotically (ignoring constant factors) | \(f(n)\) is both \(O(g(n))\) and \(\Omega(g(n))\) |
| Little oh | \(f(n)\) is \(o(g(n))\) | \(f(n)\) is dominated by \(g(n)\) asymptotically (ignoring constant factors) | \(\displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0\) |
| Little omega | \(f(n)\) is \(\omega(g(n))\) | \(f(n)\) dominates \(g(n)\) asymptotically (ignoring constant factors) | \(g(n)\) is \(o(f(n))\) |


Common orders of growth.

| NAME | NOTATION | EXAMPLES |
|---|---|---|
| Constant | \(O(1)\) | array access; arithmetic operation; function call |
| Logarithmic | \(O(\log n)\) | binary search in a sorted array; insert in a binary heap; search in a red–black tree |
| Linear | \(O(n)\) | sequential search; grade-school addition; BFPRT median finding |
| Linearithmic | \(O(n \log n)\) | mergesort; heapsort; fast Fourier transform |
| Quadratic | \(O(n^2)\) | enumerate all pairs; insertion sort; grade-school multiplication |
| Cubic | \(O(n^3)\) | enumerate all triples; Floyd–Warshall; grade-school matrix multiplication |
| Polynomial | \(O(n^c)\) | ellipsoid algorithm for LP; AKS primality algorithm; Edmonds' matching algorithm |
| Exponential | \(2^{O(n^c)}\) | enumerating all subsets; enumerating all permutations; backtracking search |
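As one illustrative instance of the exponential row, here is a sketch of our own that enumerates all \(2^n\) subsets of \(\{0, \ldots, n-1\}\) with bitmasks:

```java
public class Subsets {
    // Exponential 2^n: one iteration per subset, so n = 3 yields 8 subsets.
    public static void main(String[] args) {
        int n = 3;
        for (int mask = 0; mask < (1 << n); mask++) {
            StringBuilder subset = new StringBuilder("{");
            for (int i = 0; i < n; i++)
                if ((mask & (1 << i)) != 0)     // bit i set: element i is in
                    subset.append(i).append(' ');
            System.out.println(subset.append('}'));
        }
    }
}
```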


Asymptotic notations: properties.

  • Reflexivity: \(f(n)\) is \(O(f(n))\).
  • Constants: If \(f(n)\) is \(O(g(n))\) and \(c > 0\), then \(c \cdot f(n)\) is \(O(g(n))\).
  • Products: If \(f_1(n)\) is \(O(g_1(n))\) and \(f_2(n)\) is \(O(g_2(n))\), then \(f_1(n) \cdot f_2(n)\) is \(O(g_1(n) \cdot g_2(n))\).
  • Sums: If \(f_1(n)\) is \(O(g_1(n))\) and \(f_2(n)\) is \(O(g_2(n))\), then \(f_1(n) + f_2(n)\) is \(O(\max\{g_1(n), g_2(n)\})\).
  • Transitivity: If \(f(n)\) is \(O(g(n))\) and \(g(n)\) is \(O(h(n))\), then \(f(n)\) is \(O(h(n))\).
  • Polynomials: Let \(f(n) = a_0 + a_1 n + \ldots + a_d n^d\) with \(a_d > 0\). Then, \(f(n)\) is \(\Theta(n^d)\).
  • Logarithms and polynomials: \(\log_b n\) is \(O(n^d)\) for every \(b > 1\) and every \(d > 0\).
  • Exponentials and polynomials: \(n^d\) is \(O(r^n)\) for every \(r > 1\) and every \(d > 0\).
  • Factorials: \(n!\) is \(2^{\Theta(n \log n)}\).
  • Limits: If \(\displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = c\) for some constant \(0 < c < \infty\), then \(f(n)\) is \(\Theta(g(n))\).
  • Limits: If \(\displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0\), then \(f(n)\) is \(O(g(n))\) but not \(\Theta(g(n))\).
  • Limits: If \(\displaystyle \lim_{n \to \infty} \frac{f(n)}{g(n)} = \infty\), then \(f(n)\) is \(\Omega(g(n))\) but not \(O(g(n))\).


Here are some examples.

| FUNCTION | \(o(n^2)\) | \(O(n^2)\) | \(\Theta(n^2)\) | \(\Omega(n^2)\) | \(\omega(n^2)\) | \(\sim 2n^2\) | \(\sim 4n^2\) |
|---|---|---|---|---|---|---|---|
| \(\log_2 n\) | ✓ | ✓ | | | | | |
| \(10n + 45\) | ✓ | ✓ | | | | | |
| \(2n^2 + 45n + 12\) | | ✓ | ✓ | ✓ | | ✓ | |
| \(4n^2 - 2\sqrt{n}\) | | ✓ | ✓ | ✓ | | | ✓ |
| \(3n^3\) | | | | ✓ | ✓ | | |
| \(2^n\) | | | | ✓ | ✓ | | |


Divide-and-conquer recurrences.

For each of the following recurrences we assume \(T(1) = 0\) and that \(n/2\) means either \(\lfloor n/2 \rfloor\) or \(\lceil n/2 \rceil\).

| RECURRENCE | \(T(n)\) | EXAMPLE |
|---|---|---|
| \(T(n) = T(n/2) + 1\) | \(\sim \lg n\) | binary search |
| \(T(n) = 2\,T(n/2) + n\) | \(\sim n \lg n\) | mergesort |
| \(T(n) = T(n-1) + n\) | \(\sim \frac{1}{2} n^2\) | insertion sort |
| \(T(n) = 2\,T(n/2) + 1\) | \(\sim n\) | tree traversal |
| \(T(n) = 2\,T(n-1) + 1\) | \(\sim 2^n\) | towers of Hanoi |
| \(T(n) = 3\,T(n/2) + \Theta(n)\) | \(\Theta(n^{\log_2 3}) = \Theta(n^{1.58...})\) | Karatsuba multiplication |
| \(T(n) = 7\,T(n/2) + \Theta(n^2)\) | \(\Theta(n^{\log_2 7}) = \Theta(n^{2.81...})\) | Strassen multiplication |
| \(T(n) = 2\,T(n/2) + \Theta(n \log n)\) | \(\Theta(n \log^2 n)\) | closest pair |
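For instance, the mergesort recurrence can be unrolled directly (take \(n\) a power of 2 and recall \(T(1) = 0\)):

$$T(n) = 2\,T(n/2) + n = 4\,T(n/4) + 2n = \cdots = 2^k\,T(n/2^k) + k\,n = n\,T(1) + n \lg n = n \lg n.$$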


Master theorem.

Let \(a \ge 1\), \(b \ge 2\), and \(c > 0\) and suppose that \(T(n)\) is a function on the non-negative integers that satisfies the divide-and-conquer recurrence
$$T(n) = a \, T(n/b) + \Theta(n^c)$$
with \(T(0) = 0\) and \(T(1) = \Theta(1)\), where \(n/b\) means either \(\lfloor n/b \rfloor\) or \(\lceil n/b \rceil\).
  • If \(c < \log_b a\), then \(T(n) = \Theta(n^{\log_b a})\)
  • If \(c = \log_b a\), then \(T(n) = \Theta(n^c \log n)\)
  • If \(c > \log_b a\), then \(T(n) = \Theta(n^c)\)
Remark: there are many different versions of the master theorem. The Akra–Bazzi theorem is among the most powerful.
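Worked instances of the three cases:
  • Mergesort, \(T(n) = 2\,T(n/2) + \Theta(n)\): here \(a = 2\), \(b = 2\), \(c = 1\), so \(c = \log_2 2\) and \(T(n) = \Theta(n \log n)\).
  • Strassen, \(T(n) = 7\,T(n/2) + \Theta(n^2)\): here \(a = 7\), \(b = 2\), \(c = 2\), so \(c < \log_2 7 \approx 2.81\) and \(T(n) = \Theta(n^{\log_2 7})\).
  • \(T(n) = 2\,T(n/2) + \Theta(n^2)\): here \(a = 2\), \(b = 2\), \(c = 2\), so \(c > \log_2 2 = 1\) and \(T(n) = \Theta(n^2)\).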




Last modified on September 12, 2020.
Copyright © 2000–2019 Robert Sedgewick and Kevin Wayne. All rights reserved.