  • As an example of a similar data structure with worse worst-case complexity, consider hash tables: O(1) typical lookup (most of the time, the correct element is found immediately), but O(n) worst case (it might have to search all elements); see the hash-table sketch after these comments. Commented Oct 5, 2021 at 17:35
  • @amon Yes, exactly. If your hashing algorithm is bad, it devolves into a linear search of an unsorted list. Commented Oct 5, 2021 at 17:37
  • Adding to the fun, it's often also helpful to look at the prerequisites of an algorithm. Binary search is O(log n), but it requires that the input be sorted first. A linear search of an unsorted list is O(n), but it (obviously) doesn't require sorted input; see the search sketch below. And that's all before real-world complications kick in. Commented Oct 6, 2021 at 4:37
  • I think you've made an error in the example: with 16 elements, the worst case examines 4 of them, and the average case almost 3 (both scale proportionally to log n, but with different constants of proportionality). Commented Oct 6, 2021 at 15:04
  • I guess the key point is that the average and worst cases scale the same way, but the average is usually some fraction of the worst case. Commented Oct 6, 2021 at 15:36
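A minimal sketch of the hash-table comment above (Python assumed; the `BadKey` class is a hypothetical key type, not from the thread): when every key hashes to the same value, all of them land in one bucket, and dict lookups degrade from the typical O(1) toward the O(n) worst case, exactly the "linear search of an unsorted list" described above.

    import timeit

    class BadKey:
        """Hypothetical key type: every instance hashes to the same bucket."""
        def __init__(self, value):
            self.value = value
        def __hash__(self):
            return 42  # deliberately terrible hash: all keys collide
        def __eq__(self, other):
            return self.value == other.value

    good = {i: None for i in range(1000)}          # normal ints: ~O(1) lookup
    bad = {BadKey(i): None for i in range(1000)}   # all colliding: ~O(n) lookup

    print(timeit.timeit(lambda: 999 in good, number=1000))
    print(timeit.timeit(lambda: BadKey(999) in bad, number=1000))

On a typical machine the second lookup is orders of magnitude slower, even though both containers report the same O(1) "typical" complexity.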
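Since the later comments compare binary and linear search, here is a minimal sketch (Python assumed; `binary_search` and `linear_search` are hypothetical helpers, not from the thread) that counts elements examined over a 16-element input. The exact counts depend on how the midpoint rounds and whether the final probe is counted, so they can differ by one from the figures quoted above, but the shape holds: the worst case grows like log n, and the average case is a constant fraction below it.

    def binary_search(sorted_items, target):
        """Requires sorted input; returns (index, elements examined)."""
        lo, hi, examined = 0, len(sorted_items) - 1, 0
        while lo <= hi:
            mid = (lo + hi) // 2
            examined += 1
            if sorted_items[mid] == target:
                return mid, examined
            elif sorted_items[mid] < target:
                lo = mid + 1
            else:
                hi = mid - 1
        return -1, examined

    def linear_search(items, target):
        """Works on unsorted input; returns (index, elements examined)."""
        for i, item in enumerate(items):
            if item == target:
                return i, i + 1
        return -1, len(items)

    items = list(range(16))  # binary search needs this sorted
    counts = [binary_search(items, t)[1] for t in items]
    print("worst case:", max(counts))                  # grows like log n
    print("average case:", sum(counts) / len(counts))  # a fraction of the worst case

    unsorted = [3, 1, 4, 1, 5, 9, 2, 6]
    print(linear_search(unsorted, 9))  # fine on unsorted input; binary search is not

Running the counting loop over all 16 targets illustrates the prerequisite discussion as well: binary search only wins if the input is already sorted, while linear search pays its O(n) cost but imposes no precondition.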