Questions are in black, answers in blue.
TRUE. For sufficiently large n, we know that f(n) ≥ ch(n) and g(n) ≥ dh(n) for two positive constants c and d. For such n, it follows that (f+g)(n) = f(n) + g(n) ≥ (c+d)h(n). Since c+d is a positive constant and (f+g)(n) ≥ (c+d)h(n) for sufficiently large n, the function f+g is in the class Ω(h).
FALSE. Many of the conditions of the Smoothness Theorem hold, but not all -- the function f(n) = 2^n does not satisfy the condition that f(cn) = Θ(f(n)). Suppose T(n) were the largest possible nondecreasing function meeting the conditions, which is 2^(2^k) where k is the ceiling of log n. Then if n is 2^(k-1) + 1, 2^k is equal to 2(n-1) and T(n) is 2^(2(n-1)). For these n, T(n)/2^n is equal to 2^(n-2), and this ratio is not bounded by a constant as n increases. Thus in this example the statement "T(n) = O(2^n)" is false.
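To see the blowup concretely, the following sketch (illustrative only; the class and method names are our own, not part of the answer) computes T(n) = 2^(2^k), where 2^k is the smallest power of two that is at least n, and prints the ratio T(n)/2^n at the worst-case points n = 2^(k-1) + 1:

```java
import java.math.BigInteger;

public class SmoothnessCheck {
    // T(n) = 2^p where p is the smallest power of two with p >= n
    static BigInteger T(int n) {
        int p = 1;
        while (p < n) p *= 2;
        return BigInteger.ONE.shiftLeft(p); // 2^p
    }

    public static void main(String[] args) {
        for (int k = 2; k <= 6; k++) {
            int n = (1 << (k - 1)) + 1;     // n just past a power of two
            BigInteger ratio = T(n).divide(BigInteger.ONE.shiftLeft(n));
            // ratio equals 2^(n-2), which is unbounded as n grows
            System.out.println("n = " + n + ", T(n)/2^n = " + ratio);
        }
    }
}
```

For n = 5, for example, the smallest power of two at least 5 is 8 = 2(n-1), so T(5) = 2^8 = 256 and T(5)/2^5 = 8 = 2^(n-2).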
T(n) = 2T(n-1) - T(n-2) for n ≥ 2; T(0) = 3; T(1) = 3
Checking small cases by forward substitution, we find T(2) = 2T(1) - T(0) = 6 - 3 = 3, T(3) = 6 - 3 = 3, and T(4) = 6 - 3 = 3. The apparent pattern is that T(n) = 3 for all non-negative n. Certainly this is true for n = 0 and n = 1 by the definition. We take as inductive hypothesis that T(n-1) = 3 and T(n-2) = 3, and calculate that T(n) = 2(3) - 3 = 6 - 3 = 3. This proves the inductive goal, and we may conclude that T(n) = 3 for all n.
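The forward substitution above can be mechanized; this short sketch (the class name is ours, purely illustrative) iterates the recurrence and confirms that every value comes out to 3:

```java
public class RecurrenceCheck {
    // Compute T(n) = 2T(n-1) - T(n-2) by forward substitution,
    // with base cases T(0) = 3 and T(1) = 3.
    static long T(int n) {
        long a = 3, b = 3;            // a = T(i-2), b = T(i-1)
        for (int i = 2; i <= n; i++) {
            long c = 2 * b - a;       // T(i) = 2T(i-1) - T(i-2)
            a = b;
            b = c;
        }
        return b;                     // loop never runs for n <= 1, so T(0) = T(1) = 3
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 20; n++)
            System.out.println("T(" + n + ") = " + T(n));
    }
}
```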
The idea is simply to check, for each element in the array, how many elements are less than it. In the worst case, we will carry out this count for all n elements (if the last one is the median), and the counting takes Θ(n) time per element, for a total of Θ(n^2).
Comparable median(Comparable[] A) {
    int n = A.length;
    for (int i = 0; i < n; i++) {
        int count = 0;
        for (int j = 0; j < n; j++)
            if (A[j].compareTo(A[i]) < 0) count++;
        // the median is the element with exactly n/2 elements below it
        if (count == n/2) return A[i];
    }
    // unchecked exception, so the method need not declare "throws"
    throw new RuntimeException("there must be duplicate elements");
}
We take Θ(n) time for each possible median we consider. The number of possible medians we consider is equally likely to be any number from 1 to n, so the average number we consider is (1/n)(1 + 2 + 3 + ... + n) = (n+1)/2. Thus the average-case time is Θ(n^2).
Sort the elements of A in Θ(n log n) time using mergesort. Then return the element A[n/2], which is greater than the elements A[0] through A[n/2 - 1] and no others, hence greater than exactly n/2 elements. The time in all is the time for mergesort plus O(1) more, thus Θ(n log n).
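The sort-based approach can be sketched as follows (the class and method names are ours; we use the library sort in place of a hand-written mergesort, and we assume n is odd and the elements are distinct so that A[n/2] has exactly n/2 smaller elements):

```java
import java.util.Arrays;

public class SortMedian {
    // Median by sorting: Θ(n log n) for the sort plus O(1) to index.
    static int median(int[] A) {
        int[] B = A.clone();     // avoid mutating the caller's array
        Arrays.sort(B);          // library sort, Θ(n log n)
        return B[B.length / 2];  // greater than exactly n/2 elements
    }

    public static void main(String[] args) {
        int[] A = {7, 1, 5, 3, 9};
        System.out.println(median(A)); // prints 5
    }
}
```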
Let T(n) be the time to sort n items with this version of quicksort. Because we use the exact median, the two recursive calls take T(n/2) each. (Actually this is only exactly true if n is odd; otherwise we get T(n/2) and T(n/2 - 1). We can let n be one less than a power of two so the subranges sorted are always of odd size.) The overhead is dominated by the Θ(n^2) to find the median, so the recurrence is T(n) = 2T(n/2) + Θ(n^2), and the solution by the Master Theorem is T(n) = Θ(n^2), since a = 2, b = 2, d = 2, and a < b^d.
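As a numerical sanity check (not part of the original answer; the class name is ours), we can iterate the recurrence with the Θ(n^2) term taken as exactly n^2 and watch T(n)/n^2 settle toward a constant, consistent with T(n) = Θ(n^2):

```java
public class MasterCheck {
    // T(n) = 2T(n/2) + n^2 for n a power of two, with T(1) = 1.
    static double T(long n) {
        if (n == 1) return 1;
        return 2 * T(n / 2) + (double) n * n;
    }

    public static void main(String[] args) {
        // The ratio T(n)/n^2 approaches 2, i.e. T(n) = Θ(n^2).
        for (long n = 2; n <= (1 << 20); n *= 2)
            System.out.println("n = " + n + ", T(n)/n^2 = " + T(n) / ((double) n * n));
    }
}
```

Unwinding the recurrence gives T(n) = 2n^2 - n for n a power of two, so the printed ratio is 2 - 1/n, bounded by the constant 2.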
Last modified 1 October 2003