Expected value

From Wikipedia, the free encyclopedia
Figure: Rolling a die repeatedly, the sample mean converges to the expected value of 3.5

In probability theory and statistics, the expected value of a random variable is, informally, the long-run average of the values the variable takes over many independent repetitions of an experiment. Formally, it is the weighted average of all possible values of the variable, with each value weighted by its probability of occurring. The law of large numbers describes how, as the number of repetitions grows, the mean of the observed values converges to the expected value.
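As an illustrative sketch of this convergence (function and variable names here are hypothetical, not from the article), the following simulation rolls a fair six-sided die many times and compares the sample mean with the theoretical expected value of 3.5:

```python
import random

def sample_mean_of_die_rolls(n_rolls, seed=0):
    """Simulate n_rolls of a fair six-sided die and return the sample mean."""
    rng = random.Random(seed)
    rolls = [rng.randint(1, 6) for _ in range(n_rolls)]
    return sum(rolls) / n_rolls

# Theoretical expected value: each face 1..6 has probability 1/6,
# so E[X] = (1 + 2 + 3 + 4 + 5 + 6) / 6 = 3.5
expected_value = sum(range(1, 7)) / 6

# As the number of rolls increases, the sample mean drifts toward 3.5,
# as the law of large numbers predicts.
for n in (10, 1_000, 100_000):
    print(n, sample_mean_of_die_rolls(n))
```

With only 10 rolls the sample mean can be far from 3.5, but at 100,000 rolls it is typically within a few hundredths of the expected value.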