Questions are in black, answers in blue.

**Question 1 (10):** (true/false with justification) If f(n) and g(n) are both increasing functions, each of which is in Ω(h(n)), then the function f(n) + g(n) is also in Ω(h(n)).

TRUE. For sufficiently large n, we know that f(n) ≥ ch(n) and g(n) ≥ dh(n) for two positive constants c and d. For such n, it follows that (f+g)(n) = f(n) + g(n) ≥ (c+d)h(n). Since c+d is a positive constant and (f+g)(n) ≥ (c+d)h(n) for sufficiently large n, the function f+g is in the class Ω(h(n)).
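The argument above can be sanity-checked numerically. This sketch uses hypothetical example functions f(n) = 2n, g(n) = n + 3, h(n) = n with witness constants c = 2 and d = 1 (all choices are illustrative, not from the problem):

```java
// Numeric sanity check of the Question 1 argument (not a proof):
// with f(n) = 2n, g(n) = n + 3, h(n) = n and witnesses c = 2, d = 1,
// we have f(n) >= 2*h(n) and g(n) >= 1*h(n), so (f+g)(n) >= 3*h(n).
public class OmegaSumCheck {
    static long f(long n) { return 2 * n; }
    static long g(long n) { return n + 3; }
    static long h(long n) { return n; }

    // true iff (f+g)(n) >= (c+d) * h(n) holds for all n in [1, limit]
    public static boolean holdsUpTo(long limit) {
        for (long n = 1; n <= limit; n++) {
            if (f(n) + g(n) < 3 * h(n)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        System.out.println(holdsUpTo(1_000_000)); // prints true
    }
}
```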

**Question 2 (10):** (true/false with justification) Let T(n) be a nondecreasing function such that whenever n = 2^{k}, T(n) = 2^{n}. (Thus for these n's, T(n) = 2^{2^k}.) Then T(n) is in Θ(2^{n}).

FALSE. Many of the conditions of the Smoothness Theorem hold, but not all -- the function f(n) = 2^{n} does not satisfy the condition that f(cn) = Θ(f(n)). Suppose T(n) were the largest possible nondecreasing function meeting the conditions, which is 2^{2^k} where k is the ceiling of log n. Then if n is 2^{k-1}+1, 2^{k} is equal to 2(n-1) and T(n) is 2^{2(n-1)}. For these n, T(n)/2^{n} is equal to 2^{n-2}, and this ratio is *not* bounded by a constant as n increases. Thus in this example the statement "T(n) = O(2^{n})" is false.

**Question 3 (20):** Solve the following recurrence exactly and prove by induction that your answer is correct:

T(n) = 2T(n-1) - T(n-2) for n ≥ 2; T(0) = 3; T(1) = 3

Checking small cases by forward substitution, we find T(2) = 6 - 3 = 3, T(3) = 6 - 3 = 3, and T(4) = 6 - 3 = 3. The apparent pattern is that T(n) = 3 for all non-negative n. Certainly this is true for n=0 and n=1 by the definition. We take as inductive hypothesis that T(n-1)=3 and T(n-2)=3 and calculate that T(n) = 2(3) - 3 = 6 - 3 = 3. This proves the inductive goal and we may conclude that T(n) = 3 for all n.
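The forward substitution above can be automated; a minimal sketch that computes T(n) directly from the recurrence and base cases:

```java
// Quick check of the Question 3 pattern: compute T(n) from the
// recurrence T(n) = 2T(n-1) - T(n-2), T(0) = T(1) = 3, by forward
// substitution, and observe that every value is 3.
public class RecurrenceCheck {
    public static long T(int n) {
        long prev2 = 3, prev1 = 3;          // T(0) and T(1)
        for (int i = 2; i <= n; i++) {
            long cur = 2 * prev1 - prev2;   // the recurrence
            prev2 = prev1;
            prev1 = cur;
        }
        return prev1;
    }

    public static void main(String[] args) {
        for (int n = 0; n <= 20; n++)
            System.out.println("T(" + n + ") = " + T(n)); // always 3
    }
}
```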

**Question 4 (40):** Let A be an array of n Comparables, no two of them equal. The *median* of A is defined to be the element that is greater than exactly n/2 other elements. (Here "n/2" means the floor of n/2 as in Java integer division.)

- (a,20) Describe an algorithm to determine the median by brute force, according to the definition above. This algorithm should have worst-case running time of Θ(n^{2}) -- if it is faster than that, then it is not "brute-force" but you may use it for part (c). Pseudocode is fine if it is perfectly clear what your algorithm does.

The idea is simply to check, for each element in the array, how many elements are less than it. In the worst case we will carry out this count for all n elements (if the last one is the median), and the counting takes Θ(n) time per element, for a total of Θ(n^{2}).

```java
Comparable median(Comparable[] A) {
    int n = A.length;
    for (int i = 0; i < n; i++) {
        int count = 0;
        for (int j = 0; j < n; j++)
            if (A[j].compareTo(A[i]) < 0)
                count++;
        if (count == n/2)
            return A[i];
    }
    // reachable only if the distinctness assumption is violated;
    // an unchecked exception avoids the need for a throws clause
    throw new IllegalArgumentException("must be duplicate elements");
}
```
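For reference, here is a self-contained, runnable version of the same brute-force algorithm, wrapped in a hypothetical `BruteMedian` class (the class name, generic signature, and driver are additions for illustration, not part of the exam answer):

```java
// Standalone version of the brute-force median above: for each element,
// count how many elements are smaller; the median is the one with
// exactly floor(n/2) smaller elements.
public class BruteMedian {
    public static <T extends Comparable<T>> T median(T[] A) {
        int n = A.length;
        for (int i = 0; i < n; i++) {
            int count = 0;
            for (int j = 0; j < n; j++)
                if (A[j].compareTo(A[i]) < 0) count++;
            if (count == n / 2) return A[i];
        }
        throw new IllegalArgumentException("must be duplicate elements");
    }

    public static void main(String[] args) {
        Integer[] data = {5, 2, 8, 1, 9, 4, 7};  // sorted: 1 2 4 5 7 8 9
        System.out.println(median(data));        // prints 5
    }
}
```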

- (b,10) What is the average-case running time of your algorithm,
assuming that the median is equally likely to be in any position of A?
Justify your answer.
We take Θ(n) time for each candidate median we examine. Because the true median is equally likely to be in any of the n positions, the number of candidates we examine is equally likely to be any number from 1 to n, so the average number examined is (1/n)(1+2+3+...+n) = (n+1)/2. Thus the average-case time is Θ(n^{2}).

- (c,10) Briefly indicate a method to find the median that is asymptotically faster than Θ(n^{2}) (that is, its running time is not Ω(n^{2})). Justify your answer.

Sort the elements of A in Θ(n log n) time using mergesort. Then return the element A[n/2], which is greater than the elements A[0] through A[n/2 - 1] and no others, hence greater than exactly n/2 elements. The time in all is the time for mergesort plus O(1) more, thus Θ(n log n).
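The sort-based method sketches easily in Java; note that `Arrays.sort` on an object array uses a mergesort-style algorithm, so the Θ(n log n) bound applies (the class name and the defensive copy are conveniences added here, not part of the exam answer):

```java
import java.util.Arrays;

// Sketch of the part (c) approach: sort a copy, then take the element
// at index n/2, which is greater than exactly floor(n/2) elements.
public class SortMedian {
    public static <T extends Comparable<T>> T median(T[] a) {
        T[] copy = a.clone();            // keep the caller's array intact
        Arrays.sort(copy);               // Θ(n log n) mergesort-style sort
        return copy[copy.length / 2];    // greater than exactly n/2 elements
    }

    public static void main(String[] args) {
        Integer[] data = {7, 1, 9, 4, 3};  // sorted: 1 3 4 7 9
        System.out.println(median(data));  // prints 4
    }
}
```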

**Question 5 (20):** Consider a version of Quicksort that uses the brute-force median algorithm of Question 4 to find its pivot. Write a recurrence for the running time of this algorithm and solve it (in Θ terms) using the Master Theorem.

Let T(n) be the time to sort n items with this version of quicksort. Because we use the exact median, the two recursive calls take T(n/2) each. (Actually this is only *exactly* true if n is odd; otherwise we get T(n/2) and T(n/2 - 1). We can let n be one less than a power of two so that the subranges sorted are always of odd size.) The overhead is dominated by the Θ(n^{2}) to find the median, so the recurrence is:

T(n) = 2T(n/2) + Θ(n^{2}),

and the solution by the Master Theorem is T(n) = Θ(n^{2}), since a = 2, b = 2, d = 2, and a < b^{d}.
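The Θ(n^{2}) solution can be illustrated concretely. Assuming (for illustration only) that the overhead is exactly n^{2} and the base case is T(1) = 1, unwinding the recurrence for n a power of two gives T(n) = 2n^{2} - n, which is indeed Θ(n^{2}):

```java
// Illustration of the Question 5 solution under the simplifying
// assumptions T(1) = 1 and overhead exactly n^2 (the Master Theorem
// needs only Θ(n^2)). For n a power of two, T(n) = 2n^2 - n.
public class QuicksortRecurrence {
    public static long T(long n) {
        if (n == 1) return 1;
        return 2 * T(n / 2) + n * n;   // two half-size calls plus pivot cost
    }

    public static void main(String[] args) {
        for (long n = 1; n <= 1024; n *= 2)
            System.out.println("n=" + n + "  T(n)=" + T(n)
                               + "  2n^2-n=" + (2 * n * n - n));
    }
}
```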

Last modified 1 October 2003