We think of a randomized algorithm as a machine (or function) $M$ that computes $M(x, r)$, where $x$ is the input and $r$ is the sequence of random bits. The machine model is the usual RAM model, where we have a memory space that is typically polynomial in the size of the input $n$, and in constant time we can read a memory location, write a memory location, or perform arithmetic operations on integers of up to $O(\log n)$ bits. For example, an index into an input of size $n$ can be represented using $O(\log n)$ bits.
In this model, we may find it easier to think of the random bits as supplied on demand by some subroutine, where generating a random integer of $O(\log n)$ bits takes constant time. The justification for this assumption is that it takes constant time to read the next $O(\log n)$-sized value from the random input $r$.
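To make this picture concrete, here is a minimal Python sketch (not from the original notes) of a randomized algorithm viewed as a function of its input $x$ and an explicit stream $r$ of random bits, with a helper that hands out $O(\log n)$-bit values on demand. The names `M`, `random_word`, `random_bit_stream`, and the trivial guessing loop are all invented for illustration.

```python
import random

def random_word(r, bits):
    """Consume the next `bits` random bits from the iterator r and
    return them as one integer (the 'subroutine' view of randomness)."""
    value = 0
    for _ in range(bits):
        value = (value << 1) | next(r)
    return value

def M(x, r):
    """Hypothetical randomized algorithm: repeatedly guess an index into x
    until we hit a position holding 1. Its running time depends on r."""
    n = len(x)
    bits = max(1, n.bit_length())    # O(log n)-bit random indices
    steps = 0
    while True:
        steps += 1
        i = random_word(r, bits) % n
        if x[i] == 1:
            return i, steps          # index found and the cost incurred

def random_bit_stream():
    """Unbounded stream of fair random bits standing in for r."""
    while True:
        yield random.getrandbits(1)

if __name__ == "__main__":
    x = [0, 0, 1, 0, 1, 0, 0, 0]     # input containing some 1s to find
    index, steps = M(x, random_bit_stream())
    print(f"found a 1 at index {index} after {steps} guesses")
```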
Because the number of these constant-time operations, and thus the running time for the algorithm as a whole, may depend on the random bits, it is now a random variable: a function on points in some probability space. The probability space $\Omega$ consists of all possible sequences $r$, each of which is assigned a probability $\Pr[r]$ (typically $2^{-|r|}$), and the running time for $M$ on some input $x$ is generally given as an expected value $E_r[\mathrm{time}(M(x, r))]$, where for any random variable $X$,
$$E_r[X] = \sum_{r \in \Omega} X(r)\,\Pr[r]. \tag{8.1}$$
We can now quote the performance of $M$ in terms of this expected value: where we would say that a deterministic algorithm runs in time $O(f(n))$, where $n = |x|$ is the size of the input, we instead say that our randomized algorithm runs in expected time $O(f(n))$, which means that $E_r[\mathrm{time}(M(x, r))] = O(f(|x|))$ for all inputs $x$.
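Equation (8.1) can be evaluated directly when the random string is short. The following Python sketch (an illustration, not part of the original text) enumerates every random string $r$ of a fixed length $k$, weights the cost $X(r)$ of a toy algorithm by $\Pr[r] = 2^{-k}$, and sums. The cost function `time_of` is an invented stand-in for $\mathrm{time}(M(x, r))$.

```python
from itertools import product

def time_of(x, r):
    """Hypothetical cost: number of random bits examined until the first 1
    in r (capped at len(r)). It stands in for time(M(x, r)) of some
    algorithm whose amount of work depends on the random bits."""
    steps = 0
    for bit in r:
        steps += 1
        if bit == 1:
            break
    return steps

k = 10                                   # length of the random string r
expected = 0.0
for r in product([0, 1], repeat=k):      # enumerate all of Omega
    expected += time_of(None, r) * (0.5 ** k)   # X(r) * Pr[r], as in (8.1)

print(f"E_r[time] over all {2**k} strings of length {k}: {expected:.6f}")
# Exact value for this toy cost: 2 - 2**(1 - k), about 1.998 for k = 10.
```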
This is distinct from traditional worst-case analysis, where there is no expectation, and from average-case analysis, where the expectation is taken over a distribution on the inputs rather than over the algorithm's own random bits. The following trivial example shows the distinction.
Example. What is the expected number of heads obtained when flipping a fair coin? Solution. Let the sample space be $S = \{H, T\}$, with $\Pr[H] = \Pr[T] = 1/2$. Define an indicator random variable $X_H$ for the coin coming up heads, i.e., for the event $H$. So,
$$X_H = I\{H\} = \begin{cases} 1 & \text{if the coin comes up heads,} \\ 0 & \text{if the coin comes up tails.} \end{cases} \tag{8.2}$$
The expected number of heads obtained in one flip of the coin is simply the expected value of the indicator variable $X_H$:
$$E[X_H] = E[I\{H\}] = 1 \cdot \Pr[H] + 0 \cdot \Pr[T] = 1 \cdot \tfrac{1}{2} + 0 \cdot \tfrac{1}{2} = \tfrac{1}{2}.$$
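As a quick check, here is the same computation as a Python sketch (added for illustration): the indicator `X_H` maps each outcome of the two-point sample space to 0 or 1, and its expectation is the probability-weighted sum over $S = \{H, T\}$.

```python
# Sample space for one fair coin flip, with the probability of each outcome.
sample_space = {"H": 0.5, "T": 0.5}

def X_H(outcome):
    """Indicator random variable for the event 'the coin comes up heads',
    as in (8.2): 1 if H occurs, 0 if T occurs."""
    return 1 if outcome == "H" else 0

# E[X_H] = sum over outcomes of X_H(outcome) * Pr[outcome].
expected_heads = sum(X_H(s) * p for s, p in sample_space.items())
print(expected_heads)   # 0.5, i.e., Pr[H]
```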
So the expected value of an indicator random variable associated with an event is the probability that the event occurs. This use of expected values is especially convenient when repeated trials are performed. For example, we could compute the expected number of heads in $n$ coin flips by considering separately the probability of obtaining $0$ heads, $1$ head, $2$ heads, and so on, but indicator variables give a simpler route. Let $X_i$ be the indicator random variable associated with the event in which the $i$-th flip comes up heads, and let $X$ be the random variable denoting the total number of heads in the $n$ coin flips. So,
$$X = \sum_{i=1}^{n} X_i. \tag{8.3}$$
To compute the expected number of heads, we take expectations on both sides of (8.3) and use linearity of expectation:
$$E[X] = E\!\left[\sum_{i=1}^{n} X_i\right] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n} \tfrac{1}{2} = \tfrac{n}{2}.$$
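This argument can also be checked numerically. The following Python sketch (assuming a fair coin; the helper names are invented) simulates $n$ flips via indicator variables and compares the empirical average number of heads against the exact value $n/2$.

```python
import random

def total_heads(n):
    """X = X_1 + ... + X_n, where X_i is the indicator of the event
    'the i-th flip comes up heads' (Eq. (8.3))."""
    return sum(1 if random.random() < 0.5 else 0 for _ in range(n))

n = 100          # number of coin flips per experiment
trials = 10_000  # number of independent experiments to average over

empirical = sum(total_heads(n) for _ in range(trials)) / trials
print(f"empirical E[X] over {trials} trials: {empirical:.2f}")
print(f"exact value n/2:                    {n / 2:.2f}")
```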