Expected value

In probability theory, the expected value (also called expectation, expectancy, expectation operator, mathematical expectation, mean, expectation value, or first moment) is a generalization of the weighted average. Informally, the expected value is the mean of the possible values a random variable can take, weighted by the probability of those outcomes. Because it is computed as a weighted average, the expected value need not be one of the possible outcomes itself; it is not necessarily the value one would "expect" to observe in any single trial.

The expected value of a random variable with a finite number of outcomes is a weighted average of all possible outcomes. In the case of a continuum of possible outcomes, the expectation is defined by integration. In the axiomatic foundation for probability provided by measure theory, the expectation is given by Lebesgue integration.
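The two cases above can be illustrated with a short sketch (the die and uniform-distribution examples are illustrative choices, not from the article): a finite weighted average for a fair six-sided die, and a Riemann-sum approximation of the defining integral for a uniform random variable on [0, 1].

```python
# Discrete case: fair six-sided die, outcomes 1..6 with probability 1/6 each.
outcomes = [1, 2, 3, 4, 5, 6]
probabilities = [1 / 6] * 6

# Expected value as the probability-weighted average of all outcomes.
discrete_mean = sum(x * p for x, p in zip(outcomes, probabilities))
print(discrete_mean)  # 3.5 — note this is not itself a possible roll

# Continuous case: X uniform on [0, 1] with density f(x) = 1, so
# E[X] = integral of x * f(x) dx over [0, 1] = 1/2.
# Approximated here by a midpoint Riemann sum with n subintervals.
n = 100_000
continuous_mean = sum(((i + 0.5) / n) * 1.0 for i in range(n)) / n
print(continuous_mean)  # 0.5
```

The discrete sum and the integral are the same idea: each value is weighted by how much probability mass (or density) sits on it.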

The expected value of a random variable X is often denoted by E(X), E[X], or EX, with E also often stylized as 𝔼 or E.[1][2][3]

  1. ^ "Expectation | Mean | Average". www.probabilitycourse.com. Retrieved 2020-09-11.
  2. ^ Hansen, Bruce. "Probability and Statistics for Economists" (PDF). Archived from the original (PDF) on 2022-01-19. Retrieved 2021-07-20.
  3. ^ Wasserman, Larry (December 2010). All of Statistics: a concise course in statistical inference. Springer texts in statistics. p. 47. ISBN 9781441923226.
