In probability theory and statistics, the expected value of a random variable (commonly denoted E(X) or E[X])[1] is, informally, the long-run average value of the variable: if the underlying experiment is repeated a very large number of times, the mean (or weighted average) of the observed values approaches the expected value.[2]
By definition, the expected value of a discrete random variable X is calculated by the formula E[X] = Σ_i x_i p_i, where p_i is the probability that X takes the value x_i, and the sum ranges over all possible values x_i of X.[3]
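As a minimal sketch of this definition, the weighted sum can be computed directly for a fair six-sided die (a hypothetical example, not taken from the article), where each face 1 through 6 has probability 1/6:

```python
# Expected value of a discrete random variable: E[X] = sum of x_i * p_i.
# Illustrative example: a fair six-sided die, each outcome with probability 1/6.
values = [1, 2, 3, 4, 5, 6]
probabilities = [1 / 6] * 6

expected_value = sum(x * p for x, p in zip(values, probabilities))
print(expected_value)  # 3.5
```

Note that 3.5 is not itself a possible outcome of a die roll; the expected value is a property of the distribution, not necessarily a value the variable can take.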
The law of large numbers makes this precise: as the number of repetitions grows, the sample mean of the observed values converges to the expected value.
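The convergence described by the law of large numbers can be illustrated with a short Monte Carlo simulation (an illustrative sketch, not part of the article): simulated die rolls have expected value 3.5, and the sample mean of larger and larger samples drifts toward it.

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

def sample_mean(n_rolls):
    """Mean of n_rolls simulated rolls of a fair six-sided die."""
    total = 0
    for _ in range(n_rolls):
        total += random.randint(1, 6)
    return total / n_rolls

# The sample mean approaches the expected value 3.5 as n grows.
for n in (10, 1_000, 100_000):
    print(n, sample_mean(n))
```

The exact printed means depend on the random seed; the point is only that the deviation from 3.5 shrinks as the sample size increases.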