The term best-case performance is used in computer science to describe an algorithm's behavior under optimal conditions. For example, the best case for a simple linear search on a list occurs when the desired element is the first element of the list.
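For illustration, here is a minimal linear-search sketch in Python (the function name and sample list are ours, chosen only for illustration); the best case is simply the branch where the very first comparison succeeds:

```python
def linear_search(items, target):
    """Return the index of target in items, or -1 if it is absent."""
    for i, value in enumerate(items):
        if value == target:
            return i   # best case: target is items[0], one comparison
    return -1          # worst case: target absent, len(items) comparisons

# Best case: the desired element is first, so the loop exits after one step.
print(linear_search([7, 3, 5, 9], 7))   # -> 0
```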
Development and choice of algorithms are rarely based on best-case performance: most academic and commercial enterprises are more interested in improving average-case complexity and worst-case performance. Algorithms may also be trivially modified to have good best-case running time by hard-coding solutions to a finite set of inputs, making the measure almost meaningless.[1]
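As a purely hypothetical sketch of that trick (the names and lookup table below are invented for illustration), one can bolt an O(1) best case onto an otherwise slow routine by answering a handful of hard-coded inputs directly, without improving the algorithm at all:

```python
# A few hard-coded answers; hitting this table is the "best case".
PRECOMPUTED = {0: 0, 1: 1, 2: 1, 3: 2}

def slow_fib(n):
    """Deliberately exponential-time Fibonacci, for contrast."""
    return n if n < 2 else slow_fib(n - 1) + slow_fib(n - 2)

def fib(n):
    if n in PRECOMPUTED:        # best case: answer is hard-coded, O(1)
        return PRECOMPUTED[n]
    return slow_fib(n)          # typical case: unchanged exponential cost
```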
Further information: average-case complexity, amortized analysis, and worst-case complexity
Worst-case performance analysis and average-case performance analysis have some similarities, but in practice usually require different tools and approaches.
Determining what "typical input" means is difficult, and the average input often has properties that make it hard to characterise mathematically (consider, for instance, algorithms that are designed to operate on strings of text). Similarly, even when a sensible description of a particular "average case" (which will probably only be applicable for some uses of the algorithm) is possible, it tends to result in equations that are more difficult to analyse.[2]
Worst-case analysis gives a safe analysis (the worst case is never underestimated), but one which can be overly pessimistic, since there may be no (realistic) input that would take this many steps.
In some situations it may be necessary to use a pessimistic analysis in order to guarantee safety. Often, however, such an analysis is overly pessimistic, so an analysis that gets closer to the real value but may be optimistic (perhaps with some known low probability of failure) can be a much more practical approach. One modern approach in academic theory to bridge the gap between worst-case and average-case analysis is called smoothed analysis.
When analyzing algorithms which often take a small time to complete but periodically require a much larger time, amortized analysis can be used to determine the worst-case running time over a (possibly infinite) series of operations. This amortized cost can be much closer to the average cost, while still providing a guaranteed upper limit on the running time. Online algorithms, for example, are frequently based on amortized analysis.
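A standard example, sketched here in Python (the toy class is illustrative, not a library type), is a dynamic array: most appends are cheap, the occasional resize is expensive, and the amortized cost per append is constant:

```python
class DynamicArray:
    """Append-only array that doubles its capacity when full.

    A single append is usually O(1), but occasionally O(n) when the backing
    store is copied; amortized over n appends, the cost per append is O(1).
    """
    def __init__(self):
        self._capacity = 1
        self._size = 0
        self._data = [None] * self._capacity

    def append(self, value):
        if self._size == self._capacity:       # rare expensive step: resize
            self._capacity *= 2
            new_data = [None] * self._capacity
            new_data[:self._size] = self._data
            self._data = new_data
        self._data[self._size] = value         # common cheap step
        self._size += 1
```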
Worst-case analysis is related to the worst-case complexity.[3]
Many algorithms with bad worst-case performance have good average-case performance. For problems we want to solve, this is a good thing: we can hope that the particular instances we care about are average. For cryptography, this is very bad: we want typical instances of a cryptographic problem to be hard. Here methods like random self-reducibility can be used for some specific problems to show that the worst case is no harder than the average case, or, equivalently, that the average case is no easier than the worst case.
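Quicksort is a familiar instance of the first point; the minimal sketch below (with the first element deliberately used as the pivot, so that the quadratic worst case is easy to trigger on already-sorted input) runs in O(n log n) on average but O(n²) in the worst case:

```python
def quicksort(items):
    """Quicksort with the first element as pivot.

    Average case: O(n log n). Worst case: O(n^2), e.g. on already-sorted
    input, where every partition is maximally unbalanced.
    """
    if len(items) <= 1:
        return items
    pivot, rest = items[0], items[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return quicksort(left) + [pivot] + quicksort(right)
```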
On the other hand, some data structures like hash tables have very poor worst-case behaviors, but a well-written hash table of sufficient size will statistically never give the worst case; the average number of operations performed follows an exponential decay curve, and so the run time of an operation is statistically bounded.
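A minimal separate-chaining sketch (the class name and default bucket count are illustrative assumptions) makes the contrast concrete: lookups are expected O(1) when chains stay short, but degrade to O(n) if every key collides:

```python
class ChainedHashTable:
    """Hash table with separate chaining.

    With a good hash function and enough buckets, chains stay short and a
    lookup is O(1) on average; in the worst case every key lands in one
    bucket and a lookup degrades to O(n).
    """
    def __init__(self, buckets=64):
        self._buckets = [[] for _ in range(buckets)]

    def put(self, key, value):
        chain = self._buckets[hash(key) % len(self._buckets)]
        for i, (k, _) in enumerate(chain):
            if k == key:
                chain[i] = (key, value)   # overwrite existing key
                return
        chain.append((key, value))

    def get(self, key):
        chain = self._buckets[hash(key) % len(self._buckets)]
        for k, v in chain:                # scan one (usually short) chain
            if k == key:
                return v
        raise KeyError(key)
```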
See also: Sorting algorithm § Comparison of algorithms
See also: Search data structure § Asymptotic worst-case analysis
1. Introduction to Algorithms (Cormen, Leiserson, Rivest, and Stein), 2001, Chapter 2 "Getting Started". The best-case complexity gives a lower bound on the running time of the algorithm over all instances of input.
2. Spielman, Daniel; Teng, Shang-Hua (2009), "Smoothed analysis: an attempt to explain the behavior of algorithms in practice" (PDF), Communications of the ACM, 52 (10), ACM: 76–84, doi:10.1145/1562764.1562785, S2CID 7904807.
3. "Worst-case complexity" (PDF). Archived (PDF) from the original on 2011-07-21. Retrieved 2008-11-30. http://www.fsz.bme.hu/~szirmay/ray6.pdf