Usually O(log N) or even O(N) is great, but those bounds describe a single call to each method.
An application that prints the frequency of occurrence of each word in a document with N total words and M distinct words must do N get operations and N put operations to insert the words into a symbol table.
Then it must iterate over the M distinct keys and do M more get operations.
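To make that counting pattern concrete, here is a minimal frequency-counter sketch. It uses `java.util.TreeMap` as a stand-in symbol table; any class with `get`, `put`, and key iteration (such as `SequentialSearchST` or `BinarySearchST`) would be used the same way, which is where the N gets, N puts, and M final gets come from.

```java
import java.util.Scanner;
import java.util.TreeMap;

// Frequency counter sketch: one get and one put per input word (N of each),
// then one more get per distinct key (M of them) when printing.
// TreeMap is only a stand-in for the symbol-table classes discussed here.
public class FrequencyCounter {
    public static void main(String[] args) {
        TreeMap<String, Integer> st = new TreeMap<>();
        Scanner in = new Scanner(System.in);
        while (in.hasNext()) {                            // N words in total
            String word = in.next();
            Integer count = st.get(word);                 // get: already seen?
            st.put(word, count == null ? 1 : count + 1);  // put: insert or update
        }
        for (String key : st.keySet()) {                  // M distinct keys
            System.out.println(key + " " + st.get(key));  // M more gets
        }
    }
}
```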
Comparing the two classes just for building the symbol table:
Class | Cost of building the table | Total |
---|---|---|
SequentialSearchST | N gets: N · O(N) = O(N²); N puts: N · O(N) = O(N²); sum: O(N²) + O(N²) | O(N²) |
BinarySearchST | N gets: N · O(log N) = O(N log N); N puts: N · O(N) = O(N²); sum: O(N log N) + O(N²) | O(N²) |
See code examples
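As a rough illustration of where the O(N) put cost in the table comes from, here is a minimal sketch of how `BinarySearchST` could be implemented, assuming a fixed-capacity backing array (resizing omitted). Binary search finds the key's rank in O(log N), but inserting a new key means shifting every larger entry one slot to the right, which is linear in the worst case.

```java
// Sketch of an ordered-array symbol table: get is O(log N) via binary search,
// put is O(N) because inserting a new key shifts the larger entries right.
public class BinarySearchST<Key extends Comparable<Key>, Value> {
    private Key[] keys;
    private Value[] vals;
    private int n;

    @SuppressWarnings("unchecked")
    public BinarySearchST(int capacity) {   // fixed capacity; resizing omitted
        keys = (Key[]) new Comparable[capacity];
        vals = (Value[]) new Object[capacity];
    }

    // O(log N): number of keys strictly smaller than key
    private int rank(Key key) {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;
            int cmp = key.compareTo(keys[mid]);
            if (cmp < 0) hi = mid - 1;
            else if (cmp > 0) lo = mid + 1;
            else return mid;
        }
        return lo;
    }

    public Value get(Key key) {
        int i = rank(key);
        if (i < n && keys[i].compareTo(key) == 0) return vals[i];
        return null;
    }

    // O(N) worst case: the shifting loop below dominates the O(log N) search
    public void put(Key key, Value val) {
        int i = rank(key);
        if (i < n && keys[i].compareTo(key) == 0) { vals[i] = val; return; }
        for (int j = n; j > i; j--) {       // shift up to N entries right
            keys[j] = keys[j - 1];
            vals[j] = vals[j - 1];
        }
        keys[i] = key;
        vals[i] = val;
        n++;
    }
}
```

This is why building the table stays O(N²) for both classes: faster search alone does not help when every new key still pays a linear insertion cost.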