
Problem

A machine executes 1 instruction per microsecond (that is, 1,000,000 instructions per second).

Algorithm k (for k = 1, 2, 3, 4) takes Tk(N) microseconds on an input of size N.

What is the largest input size N for which each algorithm finishes within 1 second?

      T1(N) = log2(N)
      T2(N) = N
      T3(N) = N^2
      T4(N) = N^3
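Each answer is the largest N satisfying Tk(N) <= 1,000,000 microseconds. The following Python sketch (illustrative, not part of the original exercise; the name BUDGET is mine) solves each inequality directly:

    import math

    BUDGET = 1_000_000  # microseconds in 1 second, at 1 instruction per microsecond

    # T1(N) = log2(N): N can be as large as 2**BUDGET (far too big to print in full)
    print(f"T1: N = 2**{BUDGET}")

    # T2(N) = N: the budget itself is the answer
    print(f"T2: N = {BUDGET}")

    # T3(N) = N^2: largest N is the integer square root of the budget
    print(f"T3: N = {math.isqrt(BUDGET)}")

    # T4(N) = N^3: integer cube root, with float rounding corrected
    n4 = round(BUDGET ** (1 / 3))
    while n4 ** 3 > BUDGET:
        n4 -= 1
    while (n4 + 1) ** 3 <= BUDGET:
        n4 += 1
    print(f"T4: N = {n4}")

Running it prints N = 1,000,000 for T2, N = 1,000 for T3, and N = 100 for T4; for T1 the bound is 2^1,000,000, an astronomically large number, which is exactly the appeal of logarithmic running times.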

