From "The Pragmatic Programmer"
Algorithm Speed
Key Insight
Estimating the resources an algorithm uses, such as time and memory, is an everyday task for programmers: it helps you choose between candidate implementations and predict how code will scale as input sizes vary. Most nontrivial algorithms do not scale linearly with input size. Some are sublinear (binary search, for example), while others degrade far faster than linearly as input grows, so quick back-of-the-envelope estimates are vital for spotting potential bottlenecks before they become critical.
Big-O notation, written as O(n), is a mathematical way to approximate an algorithm's worst-case time or memory usage as the input size n increases. It keeps only the highest-order term, discarding lower-order terms and constant factors, so it describes a relative growth rate, not actual execution times. Common Big-O categories include O(1), constant time (array access); O(log n), logarithmic (binary search); O(n), linear (sequential search); O(n log n) (quicksort, on average); O(n^2), quadratic (selection sort); O(n^3), cubic (multiplying two n x n matrices); and O(2^n), exponential (the traveling salesman problem). Beyond O(n^2), growth rates quickly render algorithms impractical for all but small inputs.
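To make the O(n) versus O(log n) contrast concrete, here is a minimal sketch (the function names and counting approach are illustrative, not from the book) that counts the probes made by a sequential search and a binary search over the same sorted data:

```python
def sequential_search(items, target):
    """Return (index, comparisons): O(n) in the worst case."""
    comparisons = 0
    for i, item in enumerate(items):
        comparisons += 1
        if item == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Return (index, comparisons): O(log n); items must be sorted."""
    comparisons = 0
    lo, hi = 0, len(items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1  # count one probe per halving step
        if items[mid] == target:
            return mid, comparisons
        elif items[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return -1, comparisons

data = list(range(1_000_000))
_, seq = sequential_search(data, 999_999)  # roughly n probes
_, bin_ = binary_search(data, 999_999)     # roughly log2(n) probes
print(seq, bin_)
```

For a million elements, the sequential search probes every element while the binary search needs only about twenty, which is exactly the gap the O(n) and O(log n) labels predict.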
Common sense often reveals an algorithm's order: a simple loop is O(n), nested loops are O(n^2), binary chops are O(log n), and divide-and-conquer algorithms (such as quicksort's average case) are O(n log n). Combinatoric algorithms, which enumerate permutations, quickly reach factorial runtimes, O(n!), and usually demand heuristics instead of exact solutions. In practice, estimate the order of your loops and nested loops, and consider how large the inputs might realistically get. When an algorithm is a problem, try to reduce O(n^2) work to O(n log n), and test your estimates by running the code with inputs of varying sizes and plotting the results to see the curve's shape. Finally, remember that the asymptotically fastest algorithm isn't always the best choice: small input sets or high setup costs can make a simpler algorithm more appropriate, and optimization is premature until a bottleneck is confirmed.
Summarized from "The Pragmatic Programmer" by Andrew Hunt and David Thomas.