# Questions and Answers on Homework Assignment #6

#### HW#6 is due on paper in class, Monday 12 April 2010.

Question text is in black, my answers in blue.

First, make sure you note the revisions made to HW#6 on 4 April.

• Question 6.1, posted 8 April 2010:

I have a solution to 7.10 but it doesn't look right. Let G have three vertices s, t, and u, with edges (s,t), (s,u), and (u,u). Then a random walk reaches t in one step with probability 1/2 and never reaches t at all with probability 1/2, and the expected time to reach t is infinite, hence at least 2^n for any n. Is this legitimate?

No, I think the clear intent of saying that the expected path length is Ω(2^n) is that the expected value should exist, that is, that a random walk will reach t eventually with probability 1. It's easy to design a graph with this property -- the main difficulty of the problem is to analyze the expected time to reach t. Remember that this is defined to be the sum over all i of (i * Pr(length = i)). If your paths that don't reach t always return to s, you can set up an equation in which the expected time from s to t occurs on both sides...
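That equation trick can be checked numerically on a toy graph (a sketch of mine, not part of the problem): take the undirected graph with edges (s,t) and (s,u) only, so from s the walk moves to t or u with probability 1/2 each, and from u it must step back to s. Then E = (1/2)(1) + (1/2)(2 + E), giving E = 3, which a Monte Carlo estimate confirms.

```python
import random

def hitting_time(trials=200_000, seed=1):
    """Estimate the expected time for a random walk to reach t in the
    toy graph with edges s--t and s--u: from s, step to t or u with
    probability 1/2 each; from u, the only move is back to s.
    The equation E = (1/2)*1 + (1/2)*(2 + E) gives E = 3."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        steps, at = 0, 's'
        while at != 't':
            if at == 's':
                at = 't' if rng.random() < 0.5 else 'u'
            else:            # at == 'u': forced back to s
                at = 's'
            steps += 1
        total += steps
    return total / trials

print(hitting_time())   # ≈ 3.0
```

The same "E appears on both sides" setup is what you want in 7.10, just with a longer excursion away from s.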

• Question 6.2, posted 8 April:

If I have a log-space reduction from A to B, can I assume that the length of f(x) depends only on the length of x? This would make my proof of 6.15 much simpler.

You can't assume this for a general f -- a log-space function might give you output of any length bounded by a polynomial in |x|. Thus if you have circuits for B and you want to use them together with f to build circuits for A, your size-n circuit for A might have to use B circuits of many different sizes. If you're dealing with NC, this isn't much of a problem: a polynomial number of circuits of poly size are still poly size taken together, and if they all have poly-log depth, placing them in parallel keeps the resulting circuit at poly-log depth.

In class yesterday I mentioned a trick that might help you. For any language B over an alphabet not containing "#", let B' be {w#^i : w ∈ B and i ≥ 0} (that is, w followed by i copies of #). If f log-space reduces A to B, there is a function f' that log-space reduces A to B' and has the length consistency property you mention. All you have to show to use this is that if B is in NC, so is B'.
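Here is a sketch of the padding construction (my illustrative code, with a made-up toy reduction; the real argument also needs f' to be log-space computable, which holds because the padding length depends only on |x|):

```python
def pad_reduce(f, p):
    """Given a reduction f from A to B with |f(x)| <= p(|x|) for all x,
    build f' reducing A to B' = {w#^i : w in B, i >= 0}: pad f(x) with
    '#'s so that |f'(x)| = p(|x|) depends only on |x|.  Membership is
    preserved: x in A  iff  f(x) in B  iff  f'(x) in B'."""
    def f_prime(x):
        w = f(x)
        return w + '#' * (p(len(x)) - len(w))
    return f_prime

# Toy (hypothetical) reduction: double every 1; its output length varies.
f = lambda x: x.replace('1', '11')
p = lambda n: 2 * n                  # |f(x)| <= 2|x|
f2 = pad_reduce(f, p)
print(f2('101'))                     # '11011#'  (length 6 = p(3))
print(f2('000'))                     # '000###'  (also length 6)
```

Now every input of length n maps to an output of exactly length p(n), so a single size of B' circuit suffices for each input length of A.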

• Question 6.3, posted 8 April:

What's the point of these one-look ATM's again?

They came up in the proof (sadly omitted in [AB]) that ATISP(log^i n, log n) = NC^i and A-ALT-SP(log^i n, log n) = AC^i. These proofs relate circuits to the configuration graphs of ATMs. The former have the property that the connections among nodes depend only on the size of the input, while the latter's connections depend on the input content itself. I find it easiest to alter the ATM so that its connections depend only on the size of the input.

Can you give an example of a one-look ATM?

Sure. Let A be the language of binary strings with at least one 1, or Σ*1Σ*. Here is a one-look ATM M, with L(M) = A, running in O(log n) time and O(log n) space -- this proves that A is in one-look ATISP(log n, log n).

White, the existential player, writes a number i of log n bits on the tape. The ATM then makes its one look, at the input bit x_i. White wins (and hence x ∈ A) if this bit is 1; otherwise Black wins.
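Since White is the only player here, the game value is just an existential quantifier over her move. A one-line simulation (mine, purely illustrative -- it computes the game's value, it is not the machine itself):

```python
def white_wins(x):
    """Value of the one-look game for A = strings containing a 1:
    White existentially picks an index i, the machine makes its single
    look at x[i], and White wins iff that bit is '1'.  So White has a
    winning strategy exactly when x contains a 1."""
    return any(x[i] == '1' for i in range(len(x)))

print(white_wins('0010'), white_wins('0000'))   # True False
```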

What does the configuration graph of that ATM look like?

I'm glad you asked. It has an output node corresponding to the start configuration, a binary tree of OR nodes corresponding to the White-control configurations where i is partially written on the tape, and a leaf node for each i querying x_i. These leaves correspond to configurations where i is fully written on the tape and we check its value to decide the winner. This tree has depth O(log n), proving that A is also in the circuit class NC^1.

How about an example with more alternations?

Ok, let P be the language of binary strings with an odd number of 1's. Here is a one-look ATM M' with L(M') = P, such that M' runs in O(log n) time and makes O(log n) alternations.

At the start of the game, White claims that the input string has an odd number of 1's. White then says whether the first half has an odd or even number, and thus whether the second half has an odd or even number. (If White's claims are inconsistent with prior claims, she loses.) Black then writes a bit indicating whether he challenges the claim about the first or second half. White then makes claims about the first and second half of the challenged half, Black picks one of these two claims to challenge, and so on.

Once the disputed section of the string has been reduced to a single bit, the machine looks at this bit in the input. White wins if this bit matches her claim about whether it has an odd or even number of 1's, and Black wins if it does not. Note that if we pad the input length to a power of two with 0's, the log n moves of Black, read as a string, are exactly the index of the bit under dispute. Each round of this game has O(1) moves and O(1) alternations, so the whole run of the ATM has O(log n) of each.
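The parity game's value can be computed by a short recursion (again my own sketch of the game tree, not the machine): White chooses how to split her current claim over the two halves, Black challenges either half, and at a single bit the machine makes its one look. With best play, White wins exactly when her claim is the true parity.

```python
def game_value(x, claim):
    """Value of the parity game on segment x, where White currently
    claims x has parity `claim` (1 = odd number of 1's).  White splits
    the claim as c1 ^ c2 == claim over the two halves (anything else is
    inconsistent and loses); Black then challenges either half.
    Returns True iff White can win with best play."""
    if len(x) == 1:
        # The one look: White wins iff her claim matches the bit.
        return int(x) == claim
    mid = len(x) // 2
    # White needs some split that survives BOTH of Black's challenges.
    return any(game_value(x[:mid], c1) and game_value(x[mid:], claim ^ c1)
               for c1 in (0, 1))

def in_P(x):
    return game_value(x, 1)      # White opens by claiming odd parity

assert in_P('0100') and not in_P('0110')
```

By induction, game_value(x, c) is True exactly when c is the parity of x: honest White just announces the true parities of the halves, and a lying White is caught at the leaf.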

I'm still confused about the line of the problem statement that says "White makes a prediction for the input bit (which may be constrained by the previous play) and wins the game iff this is correct." Don't the two players know the whole input? Why can't White just "guess" the correct value of the input bit?

Yeah, it would be better if I had said "is constrained by the previous play". In the parity game above, White's earlier moves determine whether she is claiming that the final bit is 1 or 0. In the "at least one 1" game, White always claims that the final bit is 1.

Note that the "circuit game", in the proof that P is contained in AL, is a one-look game. The players begin with a dispute about the value of the output node (White says 1, Black says 0) and as the game progresses the dispute moves to lower and lower nodes in the circuit, until it reaches a leaf. At that point White is left making a claim about that particular input bit, and wins the game iff this claim is correct.
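The circuit game can also be written as a short recursion (a sketch of mine, using a made-up dict encoding of circuits, not the construction in [AB]): at an AND gate claimed 1 (or an OR gate claimed 0) both children carry the same claim and Black picks which to challenge; at an AND claimed 0 (or an OR claimed 1) White picks a child witnessing her claim; at an input leaf the single look decides.

```python
def circuit_game(gates, g, claim, x):
    """Value of the one-look circuit game.  `gates` maps names to
    ('IN', i), ('AND', l, r), or ('OR', l, r) (an illustrative encoding).
    The players dispute the value of gate g, White claiming `claim`;
    the dispute moves to a child each step until it reaches an input,
    where White's final claim is checked against x[i].  With best play
    White wins exactly when the claim is correct."""
    kind = gates[g][0]
    if kind == 'IN':
        return int(x[gates[g][1]]) == claim      # the single look
    l, r = gates[g][1], gates[g][2]
    if (kind, claim) in (('AND', 1), ('OR', 0)):
        # Black chooses the child to challenge; White must win both.
        return (circuit_game(gates, l, claim, x) and
                circuit_game(gates, r, claim, x))
    # ('AND', 0) or ('OR', 1): White picks a child witnessing her claim.
    return (circuit_game(gates, l, claim, x) or
            circuit_game(gates, r, claim, x))

# Tiny example circuit computing (x0 OR x1) AND x2:
gates = {'out': ('AND', 'g1', 'in2'),
         'g1':  ('OR', 'in0', 'in1'),
         'in0': ('IN', 0), 'in1': ('IN', 1), 'in2': ('IN', 2)}
assert circuit_game(gates, 'out', 1, '011')       # circuit outputs 1
assert not circuit_game(gates, 'out', 1, '010')   # circuit outputs 0
```

Note that the claim carried down to a child is always a single bit, so by induction the game value at any gate equals "claim == true value of the gate" -- the same invariant as in the parity game above.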

• Question 6.4, posted 11 April:

May I solve F-2 by using the Alternation-Circuit Theorem to turn my original ATM into a circuit, and then building a one-look ATM to evaluate the circuit?

NO! NO! NO! Sorry for the shouting, but the entire point of this homework problem is that I used one-look machines in lecture to prove the Alternation-Circuit Theorem. My proof there was not complete without the simulation that you are providing with Problem F-2, so it would be circular to use that theorem to prove F-2.