Here we will analyze some recursive algorithms and derive recurrences that we can solve using the Master Theorem. In each case, explain and solve your recurrence to get a Θ solution, that is, a function f(n) that is Θ(T(n)), where T(n) is the actual worst-case running time on inputs of length n.

Remember that because of the Smoothness Theorem we can ignore issues caused by divide-and-conquer procedures not dividing the input into equal pieces.

Questions are in bold, with the solution following each one.

**Question 1:** The Stoogesort algorithm sorts an array of Comparables. Assume for the discussion that it sorts correctly and analyze its running time.

```java
void stoogesort (Comparable [] a, int i, int j) {
    // sorts items i through j of array a
    int size = j - i + 1;        // number of items to sort
    int twothirds = 2*size/3;
    if (size <= 3) {
        trivialsort(a, i, j);    // assume this is Theta(1)
        return;
    }
    stoogesort(a, i, i + twothirds - 1);
    stoogesort(a, j - twothirds + 1, j);
    stoogesort(a, i, i + twothirds - 1);
}
```

Let T(n) be the time needed to stoogesort a range of n elements. The algorithm does Θ(1) instructions excluding the recursive calls, since the number of non-recursive instructions it does is bounded above and below by constants independent of the input size. It makes three recursive calls to stoogesort ranges of size 2n/3, each taking time T(2n/3), so our total time is:

T(n) = 3T(2n/3) + Θ(1), with T(3) = Θ(1)
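To see the recurrence concretely, here is a runnable int[] sketch of stoogesort (my own adaptation, not the handout's exact code): trivialsort is replaced by Arrays.sort on at most 3 elements, the 2/3 split is rounded *up* so the two subranges always overlap enough for the sort to actually be correct, and a static counter records the total number of calls.

```java
import java.util.Arrays;

public class Stoogesort {
    static long calls = 0;  // counts invocations, to compare against the recurrence

    static void stoogesort(int[] a, int i, int j) {
        calls++;
        int size = j - i + 1;
        if (size <= 3) {                       // trivialsort: Theta(1) on <= 3 items
            Arrays.sort(a, i, j + 1);          // toIndex is exclusive
            return;
        }
        int twothirds = (2 * size + 2) / 3;    // ceil(2*size/3), so the ranges overlap
        stoogesort(a, i, i + twothirds - 1);   // sort first two-thirds
        stoogesort(a, j - twothirds + 1, j);   // sort last two-thirds
        stoogesort(a, i, i + twothirds - 1);   // sort first two-thirds again
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1, 7, 3, 8, 6, 4, 0};
        stoogesort(a, 0, a.length - 1);
        System.out.println(Arrays.toString(a));  // the array is now fully sorted
        System.out.println(calls);
    }
}
```

On inputs of growing size, the calls counter grows roughly like n^{log_{3/2} 3} ≈ n^{2.71}, the solution derived below.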

To use the Master Theorem here we have to extend it beyond the conditions for which Levitin proves it, since to get it in the form "T(n) = aT(n/b) + Θ(n^{d})" we need b = 3/2 along with a = 3 and d = 0. (Recall that n^{0} = 1.) The exact solution will now be valid for *real* numbers n of the form (3/2)^{i}, and the extension of the Smoothness Theorem to get the Θ solution for all n is still valid. We can do this whenever b is strictly greater than 1.

Since a is 3 and b^{d} is 1, we have a > b^{d}, so we are in the third case of the (extended) Master Theorem and the solution is T(n) = Θ(n^{log_b a}) = Θ(n^{log_{3/2} 3}). The exponent log_{3/2} 3 is a real number between 2 and 3, since (3/2)^{2} < 3 < (3/2)^{3}. So stoogesort is asymptotically slower than all the sorting methods we have considered, a fact that inspired Cormen, Leiserson, Rivest and Stein to give it that name (see p. 161).

**Question 2:** The Stoogesearch algorithm finds the target element in a sorted array or returns -1 if it is not there. I've probably messed up the code, but the idea is that it searches either the first 2/3 or the last 2/3 of the given range.

```java
int stoogesearch (Comparable target, Comparable [] a, int low, int high) {
    // returns index of target if it is in a[low..high], otherwise -1
    int size = high - low + 1;
    if (size == 1) {
        if (a[low].equals(target)) return low;
        else return -1;
    } else {
        int twothirds = 2*size/3;
        if (a[high - twothirds].compareTo(target) < 0)
            return stoogesearch(target, a, high - twothirds + 1, high);
        else
            return stoogesearch(target, a, low, low + twothirds - 1);
    }
}
```

This time we let T(n) be the time to find an element in a range of n elements using stoogesearch. There are Θ(1) non-recursive instructions plus *one* recursive call. (There are two recursive calls in the code, but only one of them is made on any particular run of stoogesearch, because they are in different branches of an if-else block.) Each recursive call is to a range 2/3 as large as the original, so we have:

T(n) = T(2n/3) + Θ(1), with T(1) = Θ(1)
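The recurrence can be checked empirically with an int[] sketch of stoogesearch (my own adaptation: compareTo replaced by `<`, and the recursive results returned), where a static counter records the number of calls and should grow like log_{3/2} n.

```java
public class Stoogesearch {
    static int calls = 0;  // counts recursive calls: should grow like log n

    // returns the index of target in sorted a[low..high], or -1 if absent
    static int stoogesearch(int target, int[] a, int low, int high) {
        calls++;
        int size = high - low + 1;
        if (size == 1)
            return a[low] == target ? low : -1;
        int twothirds = 2 * size / 3;
        if (a[high - twothirds] < target)       // target must be in the last 2/3
            return stoogesearch(target, a, high - twothirds + 1, high);
        else                                    // target, if present, is in the first 2/3
            return stoogesearch(target, a, low, low + twothirds - 1);
    }

    public static void main(String[] args) {
        int n = 1 << 16;
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = 2 * i;   // sorted, even numbers only
        System.out.println(stoogesearch(24690, a, 0, n - 1));  // present at index 12345
        System.out.println(stoogesearch(24691, a, 0, n - 1));  // odd, so absent: -1
        System.out.println(calls);   // a few dozen calls, not thousands
    }
}
```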

Extending the Master Theorem as above to cover non-integer b, we find that a = 1, b = 3/2, and d = 0. In this case b^{d} = 1 = a, so we are in the second case of the Master Theorem and the answer is:

T(n) = Θ(n^{d} log n) = Θ(log n)

So despite the unnecessary work done by stoogesearch in its else block (where it could have searched a range of size only n/3), stoogesearch has the same asymptotic performance as ordinary binary search.

**Question 3:** Here is a recursive algorithm for the closest pair problem, given as pseudocode. Analyze its running time.

```
Given a set S of n points
If n <= 3, check all pairs and return the closest, else:
    Divide S into two sets A and B of n/2 points each
    Check each pair (a,b) with a in A and b in B and find the closest
    Find the closest pair within A recursively
    Find the closest pair within B recursively
    Return the closest of the three pairs you've found
```

We let T(n) be the time to find the closest pair among n points using this algorithm. Here there are two recursive calls, each on a set of n/2 points and thus taking time T(n/2). But the non-recursive part of the algorithm compares each of the n/2 points in A with each of the n/2 points in B, for n^{2}/4 comparisons and Θ(n^{2}) total time. Our recurrence this time is:

T(n) = 2T(n/2) + Θ(n^{2}), with T(3) = Θ(1)

This is an instance of the ordinary Master Theorem with a = 2, b = 2, and d = 2. Since b^{d} = 4 > a, we are in the first case and the answer is T(n) = Θ(f(n)) = Θ(n^{2}).
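The pseudocode can be turned into a runnable sketch (the names and sample points are mine, and it returns just the closest distance rather than the pair, for brevity). A counter records the cross A-B distance checks, which are where the Θ(n^{2}) term comes from.

```java
public class ClosestPair {
    static long comparisons = 0;  // cross A-B distance checks: the Theta(n^2) term

    // returns the smallest pairwise distance among pts[lo..hi]
    static double closest(double[][] pts, int lo, int hi) {
        int n = hi - lo + 1;
        double best = Double.POSITIVE_INFINITY;
        if (n <= 3) {                          // base case: check all pairs directly
            for (int i = lo; i <= hi; i++)
                for (int j = i + 1; j <= hi; j++)
                    best = Math.min(best, dist(pts[i], pts[j]));
            return best;
        }
        int mid = lo + n / 2 - 1;              // A = pts[lo..mid], B = pts[mid+1..hi]
        for (int i = lo; i <= mid; i++)        // the cross check costs about n^2/4
            for (int j = mid + 1; j <= hi; j++) {
                comparisons++;
                best = Math.min(best, dist(pts[i], pts[j]));
            }
        best = Math.min(best, closest(pts, lo, mid));       // closest within A
        best = Math.min(best, closest(pts, mid + 1, hi));   // closest within B
        return best;
    }

    static double dist(double[] p, double[] q) {
        return Math.hypot(p[0] - q[0], p[1] - q[1]);
    }

    public static void main(String[] args) {
        double[][] pts = {{0,0},{5,5},{1,1},{9,2},{3,7},{1,2}};
        System.out.println(closest(pts, 0, pts.length - 1));  // (1,1) and (1,2) are distance 1 apart
    }
}
```

Every pair of points is checked at some level of the recursion (either as a cross pair or inside a base case), which is why the algorithm is correct but no faster asymptotically than brute force.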

Last modified 25 September 2003