Immutable collection algorithmic complexity

Andrew Arnott

I received feedback on my recent BCL blog post announcing the prerelease of the immutable collections that my algorithmic complexity table left out a few important entries. Here is the table again, with more data filled in (particularly around list indexer lookup and enumeration):

 

| Operation | Mutable (amortized) | Mutable (worst case) | Immutable |
|---|---|---|---|
| Stack.Push | O(1) | O(n) | O(1) |
| Queue.Enqueue | O(1) | O(n) | O(1) |
| List.Add | O(1) | O(n) | O(log n) |
| List lookup by index | O(1) | O(1) | O(log n) |
| List enumeration | O(n) | O(n) | O(n) |
| HashSet.Add, lookup | O(1) | O(n) | O(log n) |
| SortedSet.Add | O(log n) | O(n) | O(log n) |
| Dictionary.Add | O(1) | O(n) | O(log n) |
| Dictionary lookup | O(1) | O(1) – or strictly O(n) | O(log n) |
| SortedDictionary.Add | O(log n) | O(n log n) | O(log n) |
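To make the table concrete, here is a minimal sketch (with arbitrary sample values) of the difference the List.Add row is measuring: List<T>.Add mutates the list in place, while ImmutableList<T>.Add leaves the original instance untouched and returns a new list, which is where the O(log n) cost comes from.

```csharp
using System.Collections.Generic;
using System.Collections.Immutable;

class AddComparison
{
    static void Main()
    {
        // Mutable List<T>: Add appends in place; amortized O(1),
        // occasionally O(n) when the backing array has to grow.
        var mutable = new List<int> { 1, 2, 3 };
        mutable.Add(4);

        // ImmutableList<T>: Add never changes the original instance.
        // It returns a new list, copying only the O(log n) tree nodes
        // on the path that changed.
        ImmutableList<int> original = ImmutableList.Create(1, 2, 3);
        ImmutableList<int> updated = original.Add(4);

        System.Console.WriteLine(original.Count); // 3 (unchanged)
        System.Console.WriteLine(updated.Count);  // 4
    }
}
```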

A noteworthy trait to call out here is that whereas a List<T> can be efficiently enumerated using either a for loop or a foreach loop, ImmutableList<T> does a poor job inside a for loop due to the O(log n) time of its indexer. It does fine with foreach, however. This is because ImmutableList<T> stores its data in a binary tree instead of a simple array like List<T> uses. An array can be indexed into very quickly, whereas a binary tree must be walked down until the node with the desired index is found.
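To see the difference in code, here is a rough sketch (the data set is arbitrary) of the two enumeration styles: the indexer-based for loop ends up O(n log n) overall because each list[i] call walks the tree, while foreach walks the tree once for O(n) total.

```csharp
using System.Collections.Immutable;
using System.Linq;

class EnumerationComparison
{
    static void Main()
    {
        // Arbitrary sample data, just for illustration.
        ImmutableList<int> list = Enumerable.Range(0, 100_000).ToImmutableList();

        // Indexer-based loop: every list[i] walks the binary tree,
        // so the whole loop is O(n log n).
        long slowSum = 0;
        for (int i = 0; i < list.Count; i++)
        {
            slowSum += list[i];
        }

        // foreach uses the list's enumerator, which traverses the tree
        // once in order, so the whole loop is O(n).
        long fastSum = 0;
        foreach (int item in list)
        {
            fastSum += item;
        }
    }
}
```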

Another point worth calling out from the table is that SortedSet<T> has the same complexity as ImmutableSortedSet<T>, and in perf tests the two come out pretty close. That's because they both use binary trees; the significant difference, of course, is that ImmutableSortedSet<T> uses an immutable one. Since ImmutableSortedSet<T> also offers a Builder class that allows mutation, you can have your immutability and performance too. Woot.
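Here is a minimal sketch of that pattern: create a builder, apply a batch of mutations, and then freeze the result back into an immutable set with ToImmutable.

```csharp
using System.Collections.Immutable;

class BuilderExample
{
    static void Main()
    {
        // Create a builder, which supports mutation with the same
        // O(log n) per-operation cost as the immutable set itself.
        ImmutableSortedSet<int>.Builder builder = ImmutableSortedSet.CreateBuilder<int>();

        for (int i = 0; i < 1_000; i++)
        {
            builder.Add(i);
        }

        // Freeze the accumulated contents into an immutable set.
        ImmutableSortedSet<int> set = builder.ToImmutable();
    }
}
```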
