# Practice Exam for Final Exam

### Directions:

• Answer the problems on the exam pages.
• There are nine problems on pages 2-?, for 150 total points. Probable scale is A=140, C=80.
• If you need extra space use the back of a page.
• No books, notes, calculators, or collaboration.
• The first six questions are true/false, with five points for the correct boolean answer and up to five for a correct justification.
• Questions 8 and 9 have numerical answers -- remember that logarithms are base 2.

```
Q1: 10 points
Q2: 10 points
Q3: 10 points
Q4: 10 points
Q5: 10 points
Q6: 10 points
Q7: 40 points
Q8: 20 points
Q9: 30 points
Total: 150 points
```

• Question 1 (10): True or false with justification: Let X be a random source that produces a digit from the set {0,1,2,3,4,5,6,7,8,9}, where each digit has probability 1/10. Let p be the probability that four digits, taken independently from X, are all different. Then p is greater than 1/2.
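For self-study after attempting Question 1, here is a one-line Python sanity check of the arithmetic (the snippet is not part of the exam): the probability that four independent uniform digits are all distinct is 10·9·8·7 / 10^4.

```python
# Probability that 4 independent uniform digits from {0,...,9} are all distinct:
# count ordered 4-tuples of distinct digits, divide by all 10^4 tuples.
p = (10 * 9 * 8 * 7) / 10**4
print(p)        # 0.504
print(p > 0.5)  # True
```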

• Question 2 (10): True or false with justification: It is possible to design a Turing machine that inputs a string w over the alphabet {a,b,...,z} and finds a variable-length binary code that minimizes the length of the encoding of w.

• Question 3 (10): True or false with justification: Let X be a discrete random source and Y the output of a memoryless channel when X is the input to it, where the values of X and Y are both always integers. Suppose that X and Y always satisfy the rule X + Y = 6. Then the equivocation of Y with respect to X is 0.

• Question 4 (10): True or false with justification: Let Q be the language over the alphabet {a,b,c} consisting of all strings where the number of a's equals the number of c's. Then Q is a regular language.

• Question 5 (10): True or false with justification: The language Q of Question 4 is Turing decidable.

• Question 6 (10): True or false with justification: Let R be the set {M: M is the description of a Turing machine and L(M) is a Turing recognizable language}. Then R itself is Turing recognizable but is not Turing decidable.

• Question 7 (40): Let N be a λ-NFA with state set {1,2,3}, start state 1, only final state 3, and four transitions: (1,λ,2), (2,a,2), (2,b,2), and (2,b,3).
• (a,10) Using our given construction, create an ordinary NFA N' with the same state set and the same language as N.
• (b,10) Using the subset construction, find a DFA D with the same language as N'.
• (c,10) Using the state minimization construction, find the minimal DFA D' for D.
• (d,10) Using the construction from lecture, find a regular expression for the language of D'.
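For checking your Question 7 answers after you have worked them by hand, the following Python sketch (not part of the exam; all names are mine) simulates the λ-NFA N directly by subset simulation, so you can test any candidate DFA or regular expression against it on short strings.

```python
from itertools import product

# The λ-NFA N of Question 7: states 1,2,3; start 1; final 3;
# transitions (1,λ,2), (2,a,2), (2,b,2), (2,b,3).
TRANS = {('2', 'a'): {'2'}, ('2', 'b'): {'2', '3'}}
LAMBDA = {'1': {'2'}}  # λ-moves

def closure(states):
    """λ-closure of a set of states."""
    stack, seen = list(states), set(states)
    while stack:
        q = stack.pop()
        for r in LAMBDA.get(q, ()):
            if r not in seen:
                seen.add(r)
                stack.append(r)
    return seen

def accepts(w):
    """Does N accept the string w over {a,b}?"""
    current = closure({'1'})
    for ch in w:
        nxt = set()
        for q in current:
            nxt |= TRANS.get((q, ch), set())
        current = closure(nxt)
    return '3' in current

# Cross-check on all strings of length < 6: N accepts exactly the
# nonempty strings ending in b (a useful fact to compare against,
# though the exam asks you to derive the answers by the lecture constructions).
for n in range(6):
    for w in map(''.join, product('ab', repeat=n)):
        assert accepts(w) == (len(w) > 0 and w[-1] == 'b')
```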

• Question 8 (20): Let f(n) be the probability that a uniformly-chosen string of length n from the alphabet {a,b,c} is in the language Q from Question 4. Compute f(0), f(1), f(2), f(3), and f(4). What is the limit, as n goes to infinity, of f(n)?
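After attempting Question 8 by hand, you can cross-check the small cases with a brute-force count (not part of the exam; the function name `f` just mirrors the question's notation). It enumerates all 3^n strings and counts those with equally many a's and c's.

```python
from itertools import product
from fractions import Fraction

def f(n):
    """Fraction of length-n strings over {a,b,c} with #a's == #c's."""
    hits = sum(1 for w in product('abc', repeat=n)
               if w.count('a') == w.count('c'))
    return Fraction(hits, 3**n)

# f(0..4) = 1, 1/3, 1/3, 7/27, 19/81
print([f(n) for n in range(5)])
```

The brute force only settles the small cases; the limit as n goes to infinity still needs an argument.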

• Question 9 (30): Suppose we take n successive bits from a source Z that is not memoryless. The first bit b_1 is equally likely to be 0 or 1, but each succeeding bit b_(i+1) is equal to b_i with probability 3/4 and different from it with probability 1/4, with the events "b_(i+1) is different from b_i" independent across the different i.
• (a,10) What is the entropy of each bit b_i if all previous bits are known? (The answer may not be the same for each i.) What is the joint entropy of the first n bits from this source, for arbitrary n?
• (b,10) Suppose we view 2n bits from Z as n letters from the alphabet {00,01,10,11}. What is the probability of the i'th of these letters taking on each of these four values, without any assumption about the previous letters? (In this case the answer does not depend on i.) What is the entropy of this distribution, assuming that log 3 = 1.6?
• (c,10) If we send the two-bit letters using a variable-length binary code optimized for the distribution computed in (b), what is the expected number of bits we will need to send n letters?
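A numeric cross-check for Question 9(b), for use after you have derived the distribution yourself (not part of the exam; the dictionary below encodes my reading of the source Z): the first bit of a letter is uniform, and the second repeats the first with probability 3/4.

```python
from math import log2

# Distribution of one two-bit letter from Z:
# P(first bit) = 1/2 each; second bit equals first with prob. 3/4.
dist = {'00': (1/2) * (3/4), '11': (1/2) * (3/4),
        '01': (1/2) * (1/4), '10': (1/2) * (1/4)}
assert abs(sum(dist.values()) - 1.0) < 1e-12

# Shannon entropy in bits; exactly 3 - (3/4)·log2(3) ≈ 1.811,
# which is 3 - (3/4)(1.6) = 1.8 under the exam's convention log 3 = 1.6.
H = -sum(p * log2(p) for p in dist.values())
print(H)
```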