

Here is a summary of how M behaves on an input string w of 0s and 1s. In the start state, q0, M looks at the first random bit and makes one of two tests regarding w, depending on whether that random bit is 0 or 1.

If the random bit is 0, then M tests whether or not w consists of only one symbol, 0 or 1. In this case, M looks at no more random bits, but keeps its second tape head stationary. If the first bit of w is 0, then M goes to state q1. In that state, M moves right over 0s, but dies if it sees a 1. If M reaches the first blank on the input tape while in state q1, it goes to state q4, the accepting state. Similarly, if the first bit of w is 1, and the first random bit is 0, then M goes to state q2; in that state it checks that all the other bits of w are 1, and accepts if so.

Now, let us consider what M does if the first random bit is 1. It compares w with the second and subsequent random bits, accepting only if they are the same. Thus, in state q0, scanning 1 on the second tape, M goes to state q3. Notice that when doing so, M moves the random-tape head right, so it gets to see a new random bit, while keeping the input-tape head stationary, so all of w will be compared with random bits. In state q3, M matches the two tapes, moving both tape heads right. If it finds a mismatch at some point, it dies and fails to accept, while if it reaches the blank on the input tape, it accepts.
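Once the random tape is fixed, the behavior just described is deterministic, so it can be mirrored by a short Python function. This is a sketch under my own naming (`M_accepts` and the string representation of the two tapes are assumptions, not the book's notation):

```python
def M_accepts(w, r):
    """Simulate M on input w, given random tape r (both strings of 0s
    and 1s). M reads r[0]; on 0 it tests homogeneity of w without
    consuming more random bits, and on 1 it matches w against r[1:]."""
    if r[0] == '0':
        # Homogeneity test: every symbol of w equals its first symbol.
        return all(c == w[0] for c in w)
    # First random bit is 1: compare w, symbol by symbol, with the
    # second and subsequent random bits.
    return w == r[1:1 + len(w)]
```

For example, `M_accepts('0000', '0')` accepts via the homogeneity test, while `M_accepts('001', '1001')` accepts only because the random bits after the first happen to spell out the input.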

Now, let us compute the probability of acceptance of certain inputs. First, consider a homogeneous input, one that consists of only one symbol, such as 0^i for some i ≥ 1. With probability 1/2, the first random bit will be 0, and if so, then the test for homogeneity will succeed, and 0^i is surely accepted. However, also with probability 1/2 the first random bit is 1. In that case, 0^i will be accepted if and only if random bits 2 through i + 1 are all 0. That occurs with probability 2^-i. Thus, the total probability of acceptance of 0^i is

    1/2 + (1/2)2^-i = 1/2 + 2^-(i+1)

Now, consider the case of a heterogeneous input w, i.e., an input that consists of both 0s and 1s, such as 00101. This input is never accepted if the first random bit is 0. If the first random bit is 1, then its probability of acceptance is 2^-i, where i is the length of the input. Thus, the total probability of acceptance of a heterogeneous input of length i is (1/2)2^-i = 2^-(i+1). For instance, the probability of acceptance of 00101 is 1/64. □
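These figures can be confirmed by brute force: M reads at most len(w) + 1 random bits, so enumerating every setting of those bits gives the exact acceptance probability. A sketch (the function name and representation are mine, not the book's):

```python
from fractions import Fraction
from itertools import product

def accept_prob(w):
    """Exact acceptance probability of M on input w, by enumerating
    every setting of the len(w) + 1 random bits M might read."""
    n = len(w)
    accepted = 0
    for r in product('01', repeat=n + 1):
        if r[0] == '0':
            ok = all(c == w[0] for c in w)   # homogeneity test
        else:
            ok = (w == ''.join(r[1:]))       # match the random bits
        accepted += ok
    return Fraction(accepted, 2 ** (n + 1))
```

For the homogeneous input 000 this returns 9/16 = 1/2 + 2^-4, and for the heterogeneous input 00101 it returns 1/64, matching the computations above.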

Our conclusion is that we can compute a probability of acceptance of any given string by any given randomized TM. Whether or not the string is in the language depends on how membership in the language of a randomized TM is defined. We shall give two different definitions of acceptance in the next sections; each leads to a different class of languages.

11.4.4 The Class RP

The essence of our first class of languages, called RP, for "random polynomial," is that to be in RP, a language L must be accepted by a randomized TM M



Nondeterminism and Randomness

There are some superficial similarities between a randomized TM and a nondeterministic TM. We could imagine that the nondeterministic choices of an NTM are governed by a tape with random bits, and every time the NTM has a choice of moves it consults the random tape and picks from among the choices with equal probability. However, if we interpret an NTM that way, then the acceptance rule is rather different from the rule for RP. Instead, an input is rejected if its probability of acceptance is 0, and the input is accepted if its probability of acceptance is any value greater than 0, no matter how small.

in the following sense:

1. If w is not in L, then the probability that M accepts w is 0.

2. If w is in L, then the probability that M accepts w is at least 1/2.

3. There is a polynomial T(n) such that if input w is of length n, then all runs of M, regardless of the contents of the random tape, halt after at most T(n) steps.

Notice that there are two independent issues addressed by the definition of RP. Points (1) and (2) define a randomized Turing machine of a special type, which is sometimes called a Monte-Carlo algorithm. That is, regardless of running time, we may say that a randomized TM is Monte-Carlo if it either accepts with probability 0 or accepts with probability at least 1/2, with nothing in between. Point (3) simply addresses the running time, which is independent of whether or not the TM is Monte-Carlo.

Example 11.14: Consider the randomized TM of Example 11.13. It surely satisfies condition (3), since its running time is O(n) regardless of the contents of the random tape. However, it does not accept any language at all, in the sense required by the definition of RP. The reason is that, while the homogeneous inputs like 000 are accepted with probability at least 1/2, and thus satisfy point (2), there are other inputs, like 001, that are accepted with a probability that is neither 0 nor at least 1/2; e.g., 001 is accepted with probability 1/16. □
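This kind of failure can be spot-checked mechanically: any string whose exact acceptance probability lies strictly between 0 and 1/2 witnesses that a machine is not Monte-Carlo. A tiny sketch (the function name is mine; the probability table is the one computed for Examples 11.13 and 11.14):

```python
def monte_carlo_witnesses(probs):
    """Given a dict of exact acceptance probabilities per string,
    return the strings that violate the Monte-Carlo dichotomy:
    probability neither 0 nor at least 1/2."""
    return [w for w, p in probs.items() if 0 < p < 0.5]
```

Here `monte_carlo_witnesses({'000': 9/16, '001': 1/16})` yields `['001']`, so the TM of Example 11.13 is not Monte-Carlo.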

Example 11.15: Let us describe, informally, a randomized TM that is both polynomial-time and Monte-Carlo, and therefore accepts a language in RP. The input will be interpreted as a graph, and the question is whether the graph has a triangle, that is, three nodes all pairs of which are connected by edges. Inputs with a triangle are in the language; others are not.
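The repeated experiment this example relies on, picking a random edge and then a random third node, can be sketched in Python. The graph representation (an edge list over nodes 0..n-1), the function name, and rounding k = e(n-2)/3 up to an integer are assumptions of the sketch, not the book's:

```python
import random

def has_triangle_mc(edges, n, seed=None):
    """Monte-Carlo triangle test on a graph with n >= 3 nodes given as
    a list of edges (x, y). Repeats k = ceil(e(n-2)/3) experiments:
    never accepts a triangle-free graph, and accepts a graph that
    contains a triangle with probability at least 1/2."""
    rng = random.Random(seed)
    e = len(edges)
    edge_set = {frozenset(p) for p in edges}
    k = max(1, -(-e * (n - 2) // 3))     # ceil(e(n-2)/3)
    for _ in range(k):
        x, y = edges[rng.randrange(e)]   # pick a random edge (x, y)
        z = rng.choice([v for v in range(n) if v not in (x, y)])
        # Check the two remaining edges of a potential triangle.
        if frozenset((x, z)) in edge_set and frozenset((y, z)) in edge_set:
            return True                  # accept: triangle found
    return False                         # give up: do not accept
```

The one-sided error is visible in the code: a triangle-free graph can never make the final test succeed, so only "accept" answers carry certainty.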



The Monte-Carlo algorithm will repeatedly pick an edge (x, y) at random, and also pick a node z, other than x and y, at random. Each choice is made by looking at some new random bits from the random tape. For each x, y, and z selected, the TM tests whether the input holds edges (x, z) and (y, z), and if so it declares that the input graph has a triangle.

A total of k choices of an edge and a node are made; the TM accepts if any one of them proves to be a triangle, and if not, it gives up and does not accept. If the graph has no triangle, then it is not possible that one of the k choices will prove to be a triangle, so condition (1) in the definition of RP is met: if the input is not in the language, the probability of acceptance is 0.

Suppose the graph has n nodes and e edges. If the graph has at least one triangle, then the probability that its three nodes will be selected on any one experiment is (3/e)(1/(n-2)). That is, three of the e edges are in the triangle, and if any of these three is picked, then the probability is 1/(n-2) that the third node will also be selected. That probability is small, but we repeat the experiment k times. The probability that at least one of the k experiments will yield the triangle is:

    1 - (1 - 3/(e(n-2)))^k          (11.4)

There is a commonly used approximation that says that for small x, (1 - x)^k is approximately e^-kx, where e = 2.718... is the base of the natural logarithms. Thus, if we pick k such that kx = 1, then e^-kx will be significantly less than 1/2, and 1 - e^-kx, which is about 0.63, will be significantly greater than 1/2. Thus, we can pick k = e(n-2)/3, so that kx = 1 with x = 3/(e(n-2)), to be sure that the probability of acceptance of a graph with a triangle, as given by Equation (11.4), is at least 1/2. Thus, the algorithm described is Monte-Carlo.

Now, we must consider the running time of the TM. Both e and n are no greater than the input length, and k was chosen to be no more than the square of the input length, since it is proportional to the product of e and n. Each experiment, since it scans the input at most four times (to pick the random edge and node, and then to check the presence of two more edges), takes time linear in the input length. Thus, the TM halts after an amount of time that is at most cubic in the input length; i.e., the TM has polynomial running time and therefore satisfies the third and final condition for a language to be in RP.

We conclude that the language of graphs with a triangle is in the class RP. Note that this language is also in P, since one could do a systematic search of all possibilities for triangles. However, as we mentioned at the beginning of Section 11.4, it is actually hard to find examples that appear to be in RP - P. □

11.4.5 Recognizing Languages in RP

Suppose now that we have a polynomial-time, Monte-Carlo Turing machine M to recognize a language L. We are given a string w, and we want to know if


