CMPSCI 240: Reasoning About Uncertainty
David Mix Barrington
Fall, 2009
Homework Assignment #4
Posted Wednesday 7 October 2009
Due on paper in class, Monday 19 October 2009
There are seven questions for 50 total points plus 10 extra credit. Most are from the textbook, Mathematical Foundation for Computer Science. Note that the book has both Exercises and Problems -- make sure you are doing a Problem and not the Exercise with the same number. The number in parentheses following each problem is its individual point value.
Students are responsible for understanding and following the academic honesty policies indicated on this page.
- Problem 10.5.3 (10) This involves some manipulation of the sequence to write it as the sum of two sequences, each of which you can handle given only the fact that the sum for i from 0 to infinity of 1/i! is equal to e. (There is a sketch of this kind of manipulation after the problem list.)
- Problem 10.6.1 (10) You can use the Bernoulli Trial rule to determine the overall probability, but the subcases based on how many games the series lasts are a little more complicated, because the series ends as soon as one team wins four games. (One way to organize the subcases is sketched after the problem list.)
- Problem 10.6.3 (10) Once again your code need not compile, but trying to compile it might help you get it right, and you might find the results interesting. You are not expected to use a random number generator to simulate the seasons, though that might be a more efficient way to get a good approximation to the answer. What I want here is for you to cycle over all i from 0 to 162, find the probability that A wins exactly i games, find the probability that B wins fewer than i games, multiply those two probabilities, and add the products over all i. This will involve adding together a lot of Bernoulli Trial probabilities. (There is a sketch of this computation after the problem list.)
- Excursion 10.7, Writing Exercise 1 (5)
- Excursion 10.7, Writing Exercise 3 (5)
- Problem 10.8.5 (10) "Designing a random variable" means giving a finite event set, a probability for each event in the set, and a value for each event in the set. The values and probabilities will be functions of the given variables μ, σ, and ε.
- Problem X-2 (10 extra credit): Suppose you have a single, possibly unfair coin for which the probability p of throwing heads is unknown to you. Every throw of this coin is independent and comes up heads with probability p and tails with probability 1-p. How can you use this coin to simulate a fair coin? Describe a procedure using the unfair coin that gives you two results (which you can report as "heads" and "tails") with equal probability.
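For Problem 10.5.3, here is the kind of manipulation the hint has in mind, worked on a made-up series rather than the one in the problem: the sum for i from 0 to infinity of (i+1)/i! splits into the sum of i/i! plus the sum of 1/i!. The i = 0 term of the first sum is zero, and for i at least 1 we have i/i! = 1/(i-1)!, so shifting the index turns the first sum into the sum for j from 0 to infinity of 1/j!, which is e. The second sum is also e, so the total is 2e.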
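For Problem 10.6.1, one way to organize the subcases (a sketch only, assuming each game is an independent Bernoulli trial in which team A wins with probability p): team A wins the series in exactly k games, for k = 4, 5, 6, or 7, exactly when A wins game k and wins exactly three of the first k-1 games. So the probability that A wins the series is the sum, for k from 4 to 7, of C(k-1, 3) * p^4 * (1-p)^(k-4).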
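For Problem 10.6.3, here is a rough sketch in Java of the computation described in the hint. The per-game win probabilities pA and pB below are placeholders of my own, not values taken from the problem, and the class and method names are made up; substitute whatever the problem actually specifies.

    public class SeasonSketch {
        static final int GAMES = 162;

        // Probability of exactly k successes in n independent Bernoulli trials,
        // each with success probability p: C(n,k) * p^k * (1-p)^(n-k).
        // Computed in log space so that C(162, 81) does not overflow.
        static double binomial(int n, int k, double p) {
            double logProb = 0.0;
            for (int j = 1; j <= k; j++) {
                logProb += Math.log(n - k + j) - Math.log(j); // builds up log C(n,k)
            }
            logProb += k * Math.log(p) + (n - k) * Math.log(1 - p);
            return Math.exp(logProb);
        }

        public static void main(String[] args) {
            double pA = 0.6;  // placeholder per-game win probability for team A
            double pB = 0.5;  // placeholder per-game win probability for team B

            double answer = 0.0;
            for (int i = 0; i <= GAMES; i++) {
                double probAExactlyI = binomial(GAMES, i, pA);

                // Probability that B wins fewer than i games: the sum of the
                // Bernoulli Trial probabilities for 0, 1, ..., i-1 wins.
                double probBFewerThanI = 0.0;
                for (int j = 0; j < i; j++) {
                    probBFewerThanI += binomial(GAMES, j, pB);
                }

                answer += probAExactlyI * probBFewerThanI;
            }
            System.out.println("P(A wins more games than B) = " + answer);
        }
    }

Building up log C(n,k) one factor at a time and exponentiating only at the end avoids the overflow you would hit computing C(162, 81) directly.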
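To illustrate the format asked for in Problem 10.8.5 (this is not the answer, which must also involve ε): the event set {a, b} with Pr(a) = Pr(b) = 1/2, X(a) = μ + σ, and X(b) = μ - σ designs a random variable X with expected value μ and standard deviation σ.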
Last modified 7 October 2009