Complexity

Selection sort
- All cases: Θ(n²)

Bubble sort
- All cases: Θ(n²)

Shell sort
- Worst: Θ(n²) (with Shell's original gap sequence)

Early-termination bubble sort
- Best: Θ(n)
- Worst: Θ(n²)
- Avg: Θ(n²)

Insertion sort
- Best: Θ(n)
- Worst: Θ(n²)
- Avg: Θ(n²)

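A minimal Python sketch of insertion sort, just to make the cases above concrete: on already-sorted input the inner loop never runs (the Θ(n) best case), while reverse-sorted input shifts every prefix (the Θ(n²) worst case).

```python
def insertion_sort(a):
    """Sort the list a in place and return it."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift elements larger than key one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key   # on sorted input this is all that runs per element
    return a
```
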
Merge sort
- All cases: Θ(n log n)

Quick sort
- Best: Θ(n log n)
- Worst: Θ(n²)
- Avg: Θ(n log n)
- On average about 39% more comparisons than in the best case (≈ 1.39 n log₂ n), yet usually faster than merge sort in practice

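A sketch of quicksort using a Lomuto partition and a random pivot (the pivot rule is an illustrative assumption, not part of the table); random pivots make the Θ(n²) worst case unlikely in practice.

```python
import random

def quick_sort(a, lo=0, hi=None):
    """In-place quicksort: average Θ(n log n), worst case Θ(n²)."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    # Lomuto partition around a randomly chosen pivot.
    p = random.randint(lo, hi)
    a[p], a[hi] = a[hi], a[p]
    pivot, i = a[hi], lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    quick_sort(a, lo, i - 1)      # sort both partitions recursively
    quick_sort(a, i + 1, hi)
    return a
```
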
Heap sort
- Build + dequeue (n delete-max operations): Θ(n) + Θ(n log n) = Θ(n log n)

DFS, BFS
- Adjacency matrix: Θ(|V|²)
- Adjacency list: Θ(|V| + |E|)

Binary search
- Best: Θ(1)
- Worst: Θ(log n)
- Requires the array to be sorted and to support random access to any position

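A small sketch showing why sortedness and random access matter: each probe of the middle element halves the remaining range, giving the Θ(log n) worst case.

```python
def binary_search(a, target):
    """Return an index of target in the sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid            # best case: the first probe hits, Θ(1)
        if a[mid] < target:
            lo = mid + 1          # target can only be in the right half
        else:
            hi = mid - 1          # target can only be in the left half
    return -1
```
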
BST search
- Best: Θ(log n) (balanced tree)
- Average: Θ(log n)
- Worst: Θ(n)
- If the tree degenerates into a "stick" (a chain of nodes), that's the Θ(n) worst case

Search in sorted list
- Merge sort + binary search: Θ(n log n) + Θ(log n) = Θ(n log n)
- Presorting is worth it when the same list is searched many times: one Θ(n log n) sort plus k searches at Θ(log n) each beats k sequential searches at Θ(n) each once k is large enough

AVL trees
- Height: Θ(log n)
- Search and insert: Θ(log n)
- Delete: Θ(log n)
- Rebalance (rotations): a single or double rotation is Θ(1); an insertion needs at most one
- Disadvantages: rotations are frequent and the implementation is complex

2-3 trees
- Height: Θ(log n), between log₃(n + 1) − 1 and log₂(n + 1) − 1
- Search, insert, delete: Θ(log n)
- Rebalancing: splits propagate at most up to the root, so < height of the tree, i.e. O(log n)

Heaps
- Height: ⌊log₂ n⌋
- Repair (sift down): O(log n)
- C(n) = 2(n − log₂(n + 1)) key comparisons for bottom-up construction in the worst case
- Building: Θ(n) (bottom-up)

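A sketch of a max-heap with bottom-up construction, illustrating the figures above: sift-down repair is O(log n), building by sifting down every internal node is Θ(n), and heap sort is that build plus n delete-max steps.

```python
def sift_down(heap, i, n):
    """Repair the max-heap property at index i; O(log n) in the worst case."""
    while 2 * i + 1 < n:
        child = 2 * i + 1
        if child + 1 < n and heap[child + 1] > heap[child]:
            child += 1
        if heap[i] >= heap[child]:
            break
        heap[i], heap[child] = heap[child], heap[i]
        i = child

def build_heap(a):
    """Bottom-up construction: sift down every internal node, Θ(n) overall."""
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(a, i, n)
    return a

def heap_sort(a):
    """Build (Θ(n)) + n delete-max operations (Θ(n log n)) = Θ(n log n)."""
    build_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]   # move the current maximum to its final slot
        sift_down(a, 0, end)
    return a
```
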
Distribution sorting
- Worst: Θ(n + k) if the keys come from a fixed range of size k (linear when k ∈ O(n))
- Space: Θ(n + k) extra for the count and output arrays

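A sketch of distribution (counting) sort, assuming integer keys known to lie in a range [lo, hi] of size k; both time and extra space are Θ(n + k).

```python
def distribution_counting_sort(a, lo, hi):
    """Sort integers from [lo, hi] stably; k = hi - lo + 1 is the key range."""
    k = hi - lo + 1
    count = [0] * k
    for v in a:
        count[v - lo] += 1
    # Turn the counts into final positions (prefix sums).
    for i in range(1, k):
        count[i] += count[i - 1]
    out = [0] * len(a)
    for v in reversed(a):          # reversed scan keeps the sort stable
        count[v - lo] -= 1
        out[count[v - lo]] = v
    return out
```
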
Hash tables (dictionary operations compared with other structures)

List
- Insert: O(1)
- Delete: O(n)
- Search: O(n)

Balanced tree
- Insert: O(log n)
- Delete: O(log n)
- Search: O(log n)

Array (sorted)
- Insert: O(n)
- Delete: O(n)
- Search: O(log n)

Open hashing
- Insert: O(1)
- Delete: proportional to the list's length
- Search: proportional to the list's length
- Average: Θ(1 + α), where α = n/m is the load factor

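A toy separate-chaining table to illustrate the open-hashing rows, assuming distinct keys so insert is a plain append; search and delete walk one chain, which on average has length α = n/m.

```python
class OpenHashTable:
    """Separate chaining: each bucket is a Python list of (key, value) pairs."""

    def __init__(self, m=16):
        self.buckets = [[] for _ in range(m)]   # m chains, load factor α = n/m

    def _bucket(self, key):
        return self.buckets[hash(key) % len(self.buckets)]

    def insert(self, key, value):
        # O(1): just append (assumes keys are distinct, as in the table above).
        self._bucket(key).append((key, value))

    def search(self, key):
        # Proportional to the chain's length, Θ(1 + α) on average.
        for k, v in self._bucket(key):
            if k == key:
                return v
        return None

    def delete(self, key):
        # Also proportional to the chain's length.
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:
                del bucket[i]
                return True
        return False
```
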
Prim's algorithm
- Min-heap + adjacency list: O(|E| log |V|)
- Fibonacci heap + adjacency list: O(|E| + |V| log |V|)

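A sketch of the binary-heap + adjacency-list version of Prim's algorithm, assuming adj[u] is a list of (weight, v) pairs for a connected undirected graph; stale heap entries are skipped rather than decreased, which stays within O(|E| log |V|).

```python
import heapq

def prim_mst_weight(adj, start=0):
    """Total MST weight; adj[u] is a list of (weight, v) pairs."""
    in_tree = set()
    heap = [(0, start)]           # (cheapest known edge into the tree, vertex)
    total = 0
    while heap and len(in_tree) < len(adj):
        w, u = heapq.heappop(heap)
        if u in in_tree:
            continue              # stale entry: u was already attached more cheaply
        in_tree.add(u)
        total += w
        for weight, v in adj[u]:
            if v not in in_tree:
                heapq.heappush(heap, (weight, v))
    return total
```
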
Kruskal's algorithm
- If there are edges: O(|E| log |E|) (sorting the edge list dominates)

- If there are no edges:

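A sketch of Kruskal's algorithm with a simple union-find; sorting the edge list is the O(|E| log |E|) term, and the union-find work is nearly linear on top of that.

```python
def kruskal_mst(n, edges):
    """MST of an undirected graph on vertices 0..n-1; edges are (weight, u, v)."""
    parent = list(range(n))

    def find(x):
        # Union-find with path halving; nearly constant amortized time.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):          # O(|E| log |E|): the dominant cost
        ru, rv = find(u), find(v)
        if ru != rv:                       # the edge joins two different components
            parent[ru] = rv
            mst.append((u, v, w))
            total += w
    return total, mst
```
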
Dijkstra's algorithm
- Min-heap: O((|V| + |E|) log |V|)

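A sketch of Dijkstra's algorithm with a binary min-heap, assuming adj[u] holds (weight, v) pairs with non-negative weights; each edge can push one heap entry, hence O((|V| + |E|) log |V|).

```python
import heapq

def dijkstra(adj, source):
    """Shortest-path distances from source; adj[u] is a list of (weight, v) pairs."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                       # stale heap entry, skip it
        for w, v in adj[u]:
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd               # found a shorter path to v
                heapq.heappush(heap, (nd, v))
    return dist
```
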
Huffman's code
- Min-heap: O(n log n)
- Symbols already sorted by probability: O(n)

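A sketch of Huffman's algorithm driven by a min-heap: the n − 1 merges each cost O(log n), giving the O(n log n) bound. The depth-walk helper and the symbol representation are illustrative choices, not part of the table.

```python
import heapq

def huffman_code_lengths(freqs):
    """Return symbol -> code length; freqs maps symbol -> frequency (>= 2 symbols)."""
    # Heap entries: (frequency, tie-breaker, tree), tree = symbol or (left, right).
    heap = [(f, i, s) for i, (s, f) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)     # two least frequent subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tie, (t1, t2)))
        tie += 1
    lengths = {}
    def walk(tree, depth):
        # Walk the final tree once to read off code lengths.
        if isinstance(tree, tuple):
            walk(tree[0], depth + 1)
            walk(tree[1], depth + 1)
        else:
            lengths[tree] = depth
    walk(heap[0][2], 0)
    return lengths
```
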
Edit distance
- Worst and average: Θ(nm)
- Construction of the table: Θ(nm), where n and m are the lengths of the two strings
- Backtrace: O(n + m)

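A sketch of the dynamic-programming edit distance (Levenshtein) table: filling the (n + 1) × (m + 1) table is the Θ(nm) construction cost, and a backtrace over it moves at most n + m steps.

```python
def edit_distance(s, t):
    """Minimum number of insertions, deletions, and substitutions turning s into t."""
    n, m = len(s), len(t)
    D = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        D[i][0] = i                   # delete all of s[:i]
    for j in range(m + 1):
        D[0][j] = j                   # insert all of t[:j]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0 if s[i - 1] == t[j - 1] else 1
            D[i][j] = min(D[i - 1][j] + 1,         # deletion
                          D[i][j - 1] + 1,         # insertion
                          D[i - 1][j - 1] + cost)  # match / substitution
    return D[n][m]
```
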
Knapsack problem
- Build table: Θ(nW), where n is the number of items and W is the capacity
- Backtrace: O(n)

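A sketch of the 0/1 knapsack table plus its backtrace, assuming integer weights and capacity W; filling the table costs Θ(nW) and the backtrace walks the n rows once.

```python
def knapsack(values, weights, capacity):
    """Return (best total value, list of chosen item indices)."""
    n = len(values)
    V = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for w in range(capacity + 1):
            V[i][w] = V[i - 1][w]                      # skip item i-1
            if weights[i - 1] <= w:                    # or take it, if it fits
                V[i][w] = max(V[i][w],
                              V[i - 1][w - weights[i - 1]] + values[i - 1])
    # Backtrace: walk up the rows once, so O(n).
    chosen, w = [], capacity
    for i in range(n, 0, -1):
        if V[i][w] != V[i - 1][w]:    # item i-1 must have been taken
            chosen.append(i - 1)
            w -= weights[i - 1]
    return V[n][capacity], chosen[::-1]
```
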
Warshall's algorithm
- Complexity: Θ(n³)
- Space: Θ(n²)

- DFS & BFS alternative (transitive closure by traversal): Θ(n(n + m)) for all n starting vertices, Θ(n + m) for one
- If there are not many edges (sparse graph), the traversal-based approach beats Θ(n³)

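A sketch of Warshall's transitive closure on a boolean adjacency matrix: three nested loops over n vertices give the Θ(n³) time, and the reachability matrix itself is the Θ(n²) space.

```python
def warshall(adj_matrix):
    """Transitive closure of a directed graph given as a 0/1 adjacency matrix."""
    n = len(adj_matrix)
    R = [row[:] for row in adj_matrix]      # copy so the input stays untouched
    for k in range(n):                      # allow paths through vertex k
        for i in range(n):
            if R[i][k]:                     # skip rows that cannot reach k
                for j in range(n):
                    if R[k][j]:
                        R[i][j] = 1
    return R
```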