The expectation or expected value of a random variable $X$ is given by $\mathbb{E}[X] = \sum_{x} x \cdot \Pr[X = x]$. This is essentially an average of the values of $X$ weighted by probability, and it only makes sense if $X$ takes on values that can be summed in this way (e.g., real or complex values, or vectors in a real- or complex-valued vector space). Even if the expectation makes sense, it may be that a particular random variable does not have an expectation, because the sum fails to converge.
For example, if $X$ and $Y$ are two independent fair six-sided dice, then

$$\mathbb{E}[X] = \mathbb{E}[Y] = \frac{1 + 2 + 3 + 4 + 5 + 6}{6} = \frac{7}{2}, \tag{8.4}$$

$$\mathbb{E}[X + Y] = \sum_{k=2}^{12} k \cdot \Pr[X + Y = k] = 7. \tag{8.5}$$

The fact that $\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]$ here is not a coincidence.
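To make this concrete, here is a minimal Python sketch (not part of the original notes) that estimates these expectations by simulation; the function name `estimate_expectations` and the trial count are illustrative choices.

```python
import random

def estimate_expectations(trials=100_000):
    """Estimate E[X], E[Y], and E[X+Y] for two independent fair dice."""
    sum_x = sum_y = sum_xy = 0
    for _ in range(trials):
        x = random.randint(1, 6)  # fair six-sided die
        y = random.randint(1, 6)  # independent second die
        sum_x += x
        sum_y += y
        sum_xy += x + y
    return sum_x / trials, sum_y / trials, sum_xy / trials

ex, ey, exy = estimate_expectations()
print(f"E[X] ~ {ex:.3f}, E[Y] ~ {ey:.3f}, E[X+Y] ~ {exy:.3f}")
# The estimates should be close to 3.5, 3.5, and 7.0 respectively.
```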
The choice of an appropriate random variable lies at the heart of the analysis of any randomized algorithm. The idea used most often is linearity of expectation. Let $X$ and $Y$ be random variables. Then we have the following equality:
$$\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y]. \tag{8.6}$$
It is important to note that this equation does not require independence between $X$ and $Y$. This fact lets us simplify calculations that would otherwise be quite complex.
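The independence caveat can be checked directly. In the following sketch (illustrative, not from the original notes), $Y = 7 - X$ is a deterministic function of $X$, so the two variables are maximally dependent, yet $\mathbb{E}[X + Y] = \mathbb{E}[X] + \mathbb{E}[Y] = 7$ still holds:

```python
import random

def dependent_demo(trials=100_000):
    """X is a fair die; Y = 7 - X is fully determined by X.
    Linearity still gives E[X + Y] = E[X] + E[Y] = 3.5 + 3.5 = 7."""
    total = 0
    for _ in range(trials):
        x = random.randint(1, 6)
        y = 7 - x       # completely dependent on x
        total += x + y  # always exactly 7
    return total / trials

print(dependent_demo())  # prints 7.0 exactly, despite the dependence
```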
Problem: Suppose $n$ letters are distributed uniformly at random among $n$ people, one letter per person. What is the expected number of people who receive their own letter?

Solution: To answer this question, consider the random variable $X = X_1 + X_2 + \cdots + X_n$, where for each $i$, $1 \le i \le n$, we define $X_i = 1$ if the $i$th person gets the correct letter, and $X_i = 0$ otherwise. Since the letters were distributed randomly, the probability that the $i$th person gets the correct letter is $1/n$. Therefore, we have $\mathbb{E}[X_i] = 1/n$ for all $i$. Finally, by linearity of expectation, the expected number of people who get correct letters is $\mathbb{E}[X] = \sum_{i=1}^{n} \mathbb{E}[X_i] = n \cdot \frac{1}{n} = 1$. Note that the $X_i$ are not independent, yet no independence assumption was needed anywhere in this argument.
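A random assignment of letters to people is just a random permutation, and people who get their own letter are its fixed points. Here is a short Python sketch (illustrative names and trial counts) that checks the analysis by simulation:

```python
import random

def expected_correct_letters(n, trials=100_000):
    """Distribute n letters as a uniformly random permutation and
    count, on average, how many people receive their own letter."""
    total = 0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)  # uniformly random assignment
        total += sum(1 for person, letter in enumerate(perm) if person == letter)
    return total / trials

for n in (5, 10, 100):
    print(n, expected_correct_letters(n))
# Each estimate is close to 1, regardless of n, as the analysis predicts.
```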
Why does randomization work? The reason randomized algorithms so often succeed is that a problem instance typically has many witnesses for a yes or no answer. For example, consider checking whether a multivariate polynomial is the zero polynomial. When the input polynomial is indeed the zero polynomial, it evaluates to zero at every point. When it is not the zero polynomial, there are many points at which it evaluates to a nonzero value. Hence, finding one witness out of the many is easily achieved by random sampling.
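The following Python sketch illustrates this idea in the spirit of the Schwartz-Zippel test. It assumes the polynomial is available only as a black-box callable; the helper name `is_probably_zero`, the sampling range, and the trial count are assumptions made for illustration, not part of the original notes.

```python
import random

def is_probably_zero(poly, num_vars, degree, trials=20):
    """Randomized zero test: evaluate poly at random points drawn from a
    range much larger than its total degree. If poly is nonzero, a random
    point is a nonzero witness with probability at least 1 - degree/range
    per trial (Schwartz-Zippel lemma)."""
    sample_range = 100 * degree  # range >> degree keeps the error probability low
    for _ in range(trials):
        point = [random.randint(0, sample_range) for _ in range(num_vars)]
        if poly(*point) != 0:
            return False  # found a witness: definitely not the zero polynomial
    return True  # zero at every sampled point: probably the zero polynomial

# p(x, y) = (x + y)^2 - x^2 - 2xy - y^2 is identically zero;
# q(x, y) = (x + y)^2 - x^2 - y^2 = 2xy is not.
p = lambda x, y: (x + y) ** 2 - x ** 2 - 2 * x * y - y ** 2
q = lambda x, y: (x + y) ** 2 - x ** 2 - y ** 2
print(is_probably_zero(p, num_vars=2, degree=2))  # True
print(is_probably_zero(q, num_vars=2, degree=2))  # False with high probability
```

Note the one-sided error typical of such tests: a "not zero" answer is always correct, while a "zero" answer is wrong only if every random sample happened to hit a root, which the repeated trials make very unlikely.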