Complexity
Let g(n) be a function on the natural numbers. We say g(n) is eventually positive if g(n) > 0 for all n ≥ n0, for some fixed integer n0. We will only consider eventually positive functions in this course.
Define O(g(n)) to be the set of all functions f(n) on the natural numbers n such that there exist a constant c > 0 and an integer n0 ≥ 0 such that

    n ≥ n0 ⇒ f(n) ≤ c g(n).

In other words, f(n) is in O(g(n)) if and only if f(n) is bounded above by a constant multiple of g(n) for sufficiently large n.
If f(n) is in O(g(n)), then we write either

    f(n) ∈ O(g(n)), or
    f(n) = O(g(n)).

Note that the latter is an abuse of the equality notation.
We write g(n) = Ω(f(n)) iff f(n) = O(g(n)). In this case, we say "f(n) is of order at most g(n)" and "g(n) is of order at least f(n)". One can also say "g(n) is an asymptotic upper bound of f(n)" and "f(n) is an asymptotic lower bound of g(n)."
Examples:
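As a concrete illustration (my own sketch, not from the notes), the definition can be checked numerically: for example, f(n) = 3n² + 10n is in O(n²), with witnesses c = 4 and n0 = 10, since 3n² + 10n ≤ 4n² whenever n ≥ 10.

```python
# Numerically spot-check f(n) <= c*g(n) for n >= n0, witnessing f(n) = O(g(n)).
# The functions and witnesses below are illustrative choices.

def is_bounded_above(f, g, c, n0, n_max=10_000):
    """Check f(n) <= c*g(n) for all n0 <= n <= n_max (a finite spot check)."""
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 3 * n**2 + 10 * n   # f(n) = 3n^2 + 10n
g = lambda n: n**2                # g(n) = n^2

# With c = 4 and n0 = 10: 3n^2 + 10n <= 4n^2 iff 10n <= n^2 iff n >= 10.
print(is_bounded_above(f, g, c=4, n0=10))   # True
```

Of course a finite check is only a sanity test; the inequality must be proved for all n ≥ n0.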
Define Θ(g(n)) to be the set of all functions f(n) on the natural numbers such that

    f(n) = O(g(n)) and g(n) = O(f(n)).

These conditions can be restated as: there exist constants c1, c2 > 0 and an integer n0 ≥ 0 such that

    n ≥ n0 ⇒ c1 g(n) ≤ f(n) ≤ c2 g(n).

In other words, f(n) is in Θ(g(n)) if and only if f(n) is bounded between two constant multiples of g(n) for sufficiently large n.
If f(n) is in Θ(g(n)), then we can write either

    f(n) ∈ Θ(g(n)), or f(n) = Θ(g(n)).

We say that "f(n) and g(n) have the same order," or "g(n) is an asymptotically tight bound of f(n)."
It is easily seen that f(n) = Θ(g(n)) ⇒ f(n) = O(g(n)), but not conversely.
Examples:
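For instance (an illustrative check of my own, not from the notes), f(n) = 2n² + n satisfies f(n) = Θ(n²): taking c1 = 2, c2 = 3, and n0 = 1 gives 2n² ≤ 2n² + n ≤ 3n² for all n ≥ 1.

```python
# Numerically spot-check c1*g(n) <= f(n) <= c2*g(n) for n >= n0,
# witnessing f(n) = Theta(g(n)). Functions and constants are illustrative.

def is_theta_witness(f, g, c1, c2, n0, n_max=10_000):
    """Finite spot check of c1*g(n) <= f(n) <= c2*g(n) for n0 <= n <= n_max."""
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, n_max + 1))

f = lambda n: 2 * n**2 + n
g = lambda n: n**2

# 2n^2 <= 2n^2 + n <= 3n^2 holds for every n >= 1 (since n <= n^2).
print(is_theta_witness(f, g, c1=2, c2=3, n0=1))   # True
```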
Basic properties
- Linearity: For any constants c1, c2 > 0,
    f1(n) = O(g(n)) and f2(n) = O(g(n)) ⇒ c1 f1(n) + c2 f2(n) = O(g(n)),
    f1(n) = Θ(g(n)) and f2(n) = Θ(g(n)) ⇒ c1 f1(n) + c2 f2(n) = Θ(g(n)).
  Note that the second is true only for eventually positive functions.
- Reflexivity:
    f(n) = O(f(n)),
    f(n) = Ω(f(n)),
    f(n) = Θ(f(n)).
- Symmetry:
    f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n)).
- Transitivity:
    f(n) = O(g(n)) and g(n) = O(h(n)) ⇒ f(n) = O(h(n)),
    f(n) = Ω(g(n)) and g(n) = Ω(h(n)) ⇒ f(n) = Ω(h(n)),
    f(n) = Θ(g(n)) and g(n) = Θ(h(n)) ⇒ f(n) = Θ(h(n)).
Hence Θ defines an equivalence relation on the set of functions, while O and Ω each define a partial ordering on the equivalence classes of Θ.
More properties:
- f(n) + O(f(n)) = Θ(f(n)), i.e. g(n) = O(f(n)) ⇒ f(n) + g(n) = Θ(f(n)).
  f(n) + Θ(f(n)) = Θ(f(n)), i.e. g(n) = Θ(f(n)) ⇒ f(n) + g(n) = Θ(f(n)).
  f(n) + Ω(f(n)) = Ω(f(n)), i.e. g(n) = Ω(f(n)) ⇒ f(n) + g(n) = Ω(f(n)).
- f(n) = O(g(n)) (i.e. g(n) = Ω(f(n))) ⇒ O(f(n)) + O(g(n)) = O(g(n)), and Θ(f(n)) + Θ(g(n)) = Θ(g(n)).
- f1(n) = O(g1(n)) and f2(n) = O(g2(n)) ⇒
    f1(n) + f2(n) = O(max{g1(n), g2(n)}),
    f1(n) + f2(n) = O(g1(n) + g2(n)), and
    f1(n) f2(n) = O(g1(n) g2(n)).
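These closure rules can be sanity-checked numerically. Below is an illustrative check of my own (functions and constants are arbitrary choices) of the sum rule with g1(n) + g2(n) and the product rule; if f1 ≤ c g1 and f2 ≤ c' g2, then f1 + f2 ≤ max(c, c')(g1 + g2) and f1 f2 ≤ c c' g1 g2.

```python
# Spot-check the sum and product rules for O-notation on sample functions.
# f1 = O(g1) with witness c = 3; f2 = O(g2) with witness c = 2.
import math

f1 = lambda n: 3 * n                  # f1(n) = 3n      = O(n)
g1 = lambda n: n
f2 = lambda n: 2 * n * math.log2(n)   # f2(n) = 2n lg n = O(n lg n)
g2 = lambda n: n * math.log2(n)

ns = range(2, 5000)
# Sum rule: f1 + f2 = O(g1 + g2), with constant max(3, 2) = 3.
print(all(f1(n) + f2(n) <= 3 * (g1(n) + g2(n)) for n in ns))  # True
# Product rule: f1*f2 = O(g1*g2), with constant 3*2 = 6.
print(all(f1(n) * f2(n) <= 6 * g1(n) * g2(n) for n in ns))    # True
```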
There are two less popular notations:

    o(g(n)) := {f(n) : f(n) = O(g(n)) but f(n) ≠ Θ(g(n))},
    ω(g(n)) := {f(n) : f(n) = Ω(g(n)) but f(n) ≠ Θ(g(n))}.
We have the following analogy between the comparison of two numbers and the asymptotic comparison of two functions:

    f(n) = O(g(n))  ~  a ≤ b
    f(n) = Ω(g(n))  ~  a ≥ b
    f(n) = Θ(g(n))  ~  a = b
    f(n) = o(g(n))  ~  a < b
    f(n) = ω(g(n))  ~  a > b
Stirling's approximation:

    n! = √(2πn) (n/e)^n (1 + Θ(1/n)),

which implies that

    lim_{n→∞} log(n!) / (n log n) = 1.
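The limit can be observed numerically; the sketch below (my own, not from the notes) uses Python's `math.lgamma`, where lgamma(n+1) = ln(n!):

```python
# Observe log(n!) / (n log n) -> 1, using lgamma(n+1) = ln(n!).
import math

for n in (10, 1000, 100_000, 10**6):
    ratio = math.lgamma(n + 1) / (n * math.log(n))
    print(n, round(ratio, 4))   # the ratio slowly approaches 1 from below
```

The convergence is slow because the correction terms in Stirling's formula shrink only like 1/log n relative to n log n.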
L'Hôpital's rule
If lim_{x→a} f(x) = 0 = lim_{x→a} g(x), or lim_{x→a} f(x) = ∞ = lim_{x→a} g(x), then

    lim_{x→a} f(x)/g(x) = lim_{x→a} f′(x)/g′(x),

provided the latter limit exists.
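For example (an illustrative check of my own), lim_{x→0} sin(x)/x has the 0/0 form; differentiating numerator and denominator gives lim_{x→0} cos(x)/1 = 1, which matches direct numerical evaluation:

```python
# Compare sin(x)/x against the L'Hopital prediction cos(x)/1 as x -> 0.
import math

for x in (0.1, 0.01, 0.001):
    print(x, math.sin(x) / x, math.cos(x))   # both columns approach 1
```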
Comparison by limits
If c = lim_{n→∞} [f(n)/g(n)], then
- c = 0 ⇒ f(n) = O(g(n)) but f(n) ≠ Θ(g(n)) (i.e. f(n) = o(g(n))),
- c = ∞ ⇒ f(n) = Ω(g(n)) but f(n) ≠ Θ(g(n)) (i.e. f(n) = ω(g(n))),
- 0 < c < ∞ ⇒ f(n) = Θ(g(n)).
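A rough numerical version of this test can be sketched as follows (my own illustration; evaluating the ratio at one large n can suggest, but not prove, the limit):

```python
# Estimate lim f(n)/g(n) at a large n to guess the asymptotic relationship.
import math

def compare(f, g, n=10**6):
    """Return 'o', 'omega', or 'Theta' from the ratio f(n)/g(n) at one large n."""
    r = f(n) / g(n)
    if r < 1e-3:
        return "o"        # ratio near 0:                  f = o(g)
    if r > 1e3:
        return "omega"    # ratio blowing up:              f = omega(g)
    return "Theta"        # ratio bounded away from 0, oo: f = Theta(g)

print(compare(lambda n: n * math.log(n), lambda n: n**2))   # o
print(compare(lambda n: n**2, lambda n: n * math.log(n)))   # omega
print(compare(lambda n: 3 * n**2 + n, lambda n: n**2))      # Theta
```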
The following are some useful facts about elementary functions:
- a_k n^k + … + a_1 n + a_0 = O(n^k), and
  a_k n^k + … + a_1 n + a_0 = Θ(n^k) if a_k ≠ 0.
- a > 0, a ≠ 1, and b > 0 ⇒ lim_{n→∞} log_a n / n^b = 0.
- a > 1 ⇒ lim_{n→∞} n^b / a^n = 0.
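These dominance relations (polynomials beat logarithms, exponentials beat polynomials) are easy to observe numerically; the exponents below are illustrative choices of my own:

```python
# Polynomials dominate logarithms; exponentials dominate polynomials.
import math

for n in (10, 10**3, 10**6):
    print(n, math.log2(n) / n**0.5)   # log_2 n / n^(1/2) -> 0
for n in (10, 50, 200):
    print(n, n**3 / 2**n)             # n^3 / 2^n -> 0
```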
Recurrence relation
A recurrence relation for a function is an equation or inequality that describes the function in terms of its values on smaller inputs. Examples are:
- Fibonacci sequence
    f(0) = 1, f(1) = 1,
    f(n) = f(n−1) + f(n−2), for n > 1.
- Towers of Hanoi
    C(1) = 1,
    C(n) = 2C(n−1) + 1, for n > 1.
- Binary Search
    T(1) = Θ(1),
    T(n) = T(⌊n/2⌋) + Θ(1), for n > 1.
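As an illustration, the Towers of Hanoi recurrence can be evaluated directly and compared against its well-known closed form C(n) = 2^n − 1:

```python
# Evaluate C(n) = 2*C(n-1) + 1 with C(1) = 1, and compare with 2^n - 1.

def hanoi_moves(n):
    """Number of moves given by the Towers of Hanoi recurrence."""
    c = 1                      # C(1) = 1
    for _ in range(2, n + 1):
        c = 2 * c + 1          # C(n) = 2*C(n-1) + 1
    return c

for n in (1, 5, 10):
    print(n, hanoi_moves(n), 2**n - 1)   # the two columns agree
```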
Methods for solving recurrence relations exactly:
- Induction. Guess the solution from some terms and prove the guess by mathematical induction.
- Iteration. Repeatedly expand the recurrence and express the result (usually) as a summation of terms. Finally, find a closed-form expression for the summation.
- Generating Function. Determine mathematically the generating function F(x) = Σ_{n≥0} f(n) x^n. Then

    f(n) = (1/n!) F^(n)(0).

- Formula. Only special recurrences have formulas for their solutions, e.g. linear homogeneous recurrences with constant coefficients.
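For instance, the Fibonacci recurrence above is linear homogeneous with constant coefficients, so the formula method applies: with f(0) = f(1) = 1 the solution is f(n) = (φ^(n+1) − ψ^(n+1))/√5, where φ = (1+√5)/2 and ψ = (1−√5)/2 are the roots of the characteristic equation x² = x + 1. A quick check (my own sketch, not from the notes):

```python
# Compare the Fibonacci recurrence f(0)=f(1)=1, f(n)=f(n-1)+f(n-2)
# with the closed form obtained from the characteristic equation x^2 = x + 1.
import math

def fib_recurrence(n):
    a, b = 1, 1                # f(0), f(1)
    for _ in range(n):
        a, b = b, a + b
    return a                   # f(n)

def fib_formula(n):
    s = math.sqrt(5)
    phi, psi = (1 + s) / 2, (1 - s) / 2
    return round((phi**(n + 1) - psi**(n + 1)) / s)

for n in range(10):
    assert fib_recurrence(n) == fib_formula(n)
print([fib_recurrence(n) for n in range(8)])   # [1, 1, 2, 3, 5, 8, 13, 21]
```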
In this course, recurrences are mostly used to express the running time of algorithms. Then we only need to solve a recurrence asymptotically, i.e. to express the solution as O(g(n)), Θ(g(n)), or Ω(g(n)), depending on the application, for some well-known function g(n). That is, we want to bound the function from above and/or from below by some simple functions. When the O- and Ω-notations are used, we always require that the bounds be as tight as possible. The induction and iteration methods can also be adapted to solve recurrences asymptotically.
[Master theorem]
Let a ≥ 1 and b > 1 be constants, let f(n) be a function, and let T(n) be defined by

    T(n) = aT(n/b) + f(n), for n > 1,

where we interpret n/b to mean either ⌊n/b⌋ or ⌈n/b⌉. Then T(n) can be solved asymptotically as follows:
- If f(n) = O(n^(log_b a − ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b a)).
- If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) lg n).
- If f(n) = Ω(n^(log_b a + ε)) for some constant ε > 0, and if a f(n/b) ≤ c f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
Examples:
- T(n) = 4T(n/2) + n ⇒ T(n) = Θ(n²).
- T(n) = 9T(n/3) + O(n) ⇒ T(n) = Θ(n²).
- T(n) = 2T(n/2) + Θ(n) ⇒ T(n) = Θ(n lg n).
- T(n) = 4T(n/2) + n² ⇒ T(n) = Θ(n² lg n).
- T(n) = 4T(n/2) + n³ ⇒ T(n) = Θ(n³).
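For a polynomial driving term f(n) = n^k, the case analysis reduces to comparing k with log_b a; the regularity condition of the third case holds automatically, since a f(n/b) = (a/b^k) n^k and a/b^k < 1 exactly when k > log_b a. The helper below is a small sketch of my own implementing this special case:

```python
# Classify T(n) = a*T(n/b) + n^k by the master theorem (polynomial f only).
import math

def master(a, b, k):
    """Return a Theta(...) string for T(n) = a*T(n/b) + n^k."""
    e = math.log(a, b)              # the critical exponent log_b a
    if k < e - 1e-9:                # case 1: f grows polynomially slower
        return f"Theta(n^{e:g})"
    if k > e + 1e-9:                # case 3: f dominates (regularity holds)
        return f"Theta(n^{k})"
    return f"Theta(n^{k} lg n)"     # case 2: f matches n^(log_b a)

print(master(4, 2, 1))   # Theta(n^2)
print(master(9, 3, 1))   # Theta(n^2)
print(master(4, 2, 3))   # Theta(n^3)
```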