  • Ooh, new info. I like this. Can you point me at something about the HashSet not being O(1) for fetching? Thanks :D Commented Feb 23, 2015 at 13:16
  • Never mind, found this: stackoverflow.com/questions/8113246/… which confirms what you said (ish). The absolute worst case is O(n log(n)) for n fetches, which makes it no better than my sort and search. Commented Feb 23, 2015 at 13:20
  • @Fogmeister it's the basics of hash tables. You use the hash of each element as a key that points to a bin of elements sharing that hash. Since nothing guarantees that the hashes of different elements are unique, in the worst case you can end up with all your elements in the same bin, and then every fetch has to scan the whole bin - O(n). (A runnable sketch of this worst case follows the thread.) Commented Feb 23, 2015 at 13:20
  • 2
    @Fogmeister You can read about it on wikipedia - en.wikipedia.org/wiki/Hash_table Look at the upper right side of the page, there's a table with average and worst times. Commented Feb 23, 2015 at 13:21
  • 1
    Why would I care about worst case complexity if the worst case is sufficiently unlikely? Commented Feb 23, 2015 at 14:20
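
To make the collision worst case described above concrete, here is a minimal Java sketch. It is an illustration under stated assumptions, not code from the original thread: the BadKey class and all names are hypothetical. A key type whose hashCode() returns the same value for every instance forces a HashSet to put all elements into a single bucket, so contains() degrades from its O(1) average (on Java 8+, oversized buckets of Comparable keys are treeified, capping each lookup at O(log n); otherwise the scan is O(n)).

    import java.util.HashSet;
    import java.util.Set;

    public class CollidingKeyDemo {

        // Hypothetical key class for illustration: every instance collides.
        static final class BadKey implements Comparable<BadKey> {
            final int value;

            BadKey(int value) { this.value = value; }

            // Constant hash: all keys land in the same HashSet bucket.
            @Override public int hashCode() { return 42; }

            @Override public boolean equals(Object o) {
                return o instanceof BadKey && ((BadKey) o).value == value;
            }

            // Implementing Comparable lets Java 8+ treeify the oversized
            // bucket, so a lookup is O(log n) instead of a full O(n) scan.
            @Override public int compareTo(BadKey other) {
                return Integer.compare(value, other.value);
            }
        }

        public static void main(String[] args) {
            Set<BadKey> set = new HashSet<>();
            int n = 100_000;
            for (int i = 0; i < n; i++) {
                set.add(new BadKey(i)); // all n keys share one bucket
            }

            long start = System.nanoTime();
            for (int i = 0; i < n; i++) {
                set.contains(new BadKey(i)); // no longer an O(1) average fetch
            }
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            System.out.println(n + " lookups took " + elapsedMs + " ms");
        }
    }

Deleting the hashCode() override (restoring a well-distributed hash) brings back the expected O(1) average lookup, which is the distinction the Wikipedia table mentioned above draws between average and worst-case times.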