Definition

The expected value (or mean) of a random variable $X$ describes its average value.

If $X$ is discrete:

$$\mathbb{E}[X] = \sum_x x \, P(X = x)$$

If $X$ is continuous with density $f$:

$$\mathbb{E}[X] = \int_{-\infty}^{\infty} x \, f(x) \, dx$$

More generally, for a function $g$:

discrete case:

$$\mathbb{E}[g(X)] = \sum_x g(x) \, P(X = x)$$

continuous case:

$$\mathbb{E}[g(X)] = \int_{-\infty}^{\infty} g(x) \, f(x) \, dx$$
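The discrete formulas above can be sketched in a few lines of Python. This is an illustration, not part of the text: the pmf is represented as a dict mapping outcomes to probabilities, and the names `expectation` and `die` are chosen here for the example.

```python
# Sketch: E[X] and E[g(X)] for a discrete random variable, here a fair die.
# The pmf is a dict {value: probability}; this representation is an
# assumption made for the illustration.

def expectation(pmf, g=lambda x: x):
    """Compute E[g(X)] = sum over x of g(x) * P(X = x)."""
    return sum(g(x) * p for x, p in pmf.items())

die = {k: 1/6 for k in range(1, 7)}

mean = expectation(die)                       # E[X]   = 3.5
second = expectation(die, g=lambda x: x**2)   # E[X^2] = 91/6
```

Passing a different `g` gives $\mathbb{E}[g(X)]$ directly, mirroring the "law of the unconscious statistician" form above.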

Conditional expectation on an event

If $A$ is an event with $P(A) > 0$, then the conditional expectation of $X$ given $A$ is

$$\mathbb{E}[X \mid A] = \sum_x x \, P(X = x \mid A)$$

in the discrete case.

It is the expected value of $X$ under the condition that the event $A$ has occurred.
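The discrete definition can be checked numerically. A minimal sketch, assuming the model is given as a list of `(outcome, value, probability)` triples and the event as a predicate on outcomes (both representations chosen for this illustration):

```python
# Sketch: conditional expectation E[X | A] in the discrete case.
# Uses E[X | A] = sum_x x * P(X = x, A) / P(A).

def conditional_expectation(model, event):
    """model: list of (outcome, x_value, probability); event: predicate."""
    p_a = sum(p for outcome, x, p in model if event(outcome))
    if p_a == 0:
        raise ValueError("conditioning event has probability zero")
    return sum(x * p for outcome, x, p in model if event(outcome)) / p_a

# X = value of a fair die; A = "the roll is even", so E[X | A] = (2+4+6)/3 = 4.
model = [(k, k, 1/6) for k in range(1, 7)]
cond_mean = conditional_expectation(model, lambda k: k % 2 == 0)
```

Dividing by $P(A)$ is exactly the renormalization that turns $P(X = x, A)$ into $P(X = x \mid A)$.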

Law of total expectation for a partition

Let $A_1, A_2, \dots, A_n$ be pairwise disjoint events with $P(A_i) > 0$ such that

$$A_1 \cup A_2 \cup \dots \cup A_n = \Omega.$$

Then

$$\mathbb{E}[X] = \sum_{i=1}^{n} \mathbb{E}[X \mid A_i] \, P(A_i).$$
This is the expectation analogue of the law of total probability: split the sample space into cases, compute the conditional expectation in each case, and weight by the probability of the case.

Example

Suppose:

  • with probability $\tfrac{1}{2}$, we choose a fair coin and let $X$ be the number of heads in one toss
  • with probability $\tfrac{1}{2}$, we choose a fair die and let $X$ be the number shown

Let:

  • $A_1$ = “coin was chosen”
  • $A_2$ = “die was chosen”

Then

$$\mathbb{E}[X \mid A_1] = \frac{1}{2}, \qquad \mathbb{E}[X \mid A_2] = \frac{7}{2},$$

and therefore

$$\mathbb{E}[X] = \frac{1}{2} \cdot \frac{1}{2} + \frac{1}{2} \cdot \frac{7}{2} = \frac{1}{4} + \frac{7}{4} = 2.$$

NOTE

If $X$ is non-negative and integer-valued, then

$$\mathbb{E}[X] = \sum_{k=1}^{\infty} P(X \ge k).$$

Short proof. Since $X$ is integer-valued,

$$\mathbb{E}[X] = \sum_{n=1}^{\infty} n \, P(X = n).$$

Now write each $n$ as $\sum_{k=1}^{n} 1$, so

$$\mathbb{E}[X] = \sum_{n=1}^{\infty} \sum_{k=1}^{n} P(X = n).$$

Exchanging the order of summation gives

$$\mathbb{E}[X] = \sum_{k=1}^{\infty} \sum_{n=k}^{\infty} P(X = n) = \sum_{k=1}^{\infty} P(X \ge k).$$
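The tail-sum identity is easy to sanity-check numerically. A sketch for a fair die (an illustrative choice of non-negative integer-valued $X$; the sums are finite here, so no truncation is needed):

```python
# Check E[X] = sum_{k>=1} P(X >= k) for a non-negative integer-valued X,
# here a fair six-sided die.

pmf = {k: 1/6 for k in range(1, 7)}

mean_direct = sum(n * p for n, p in pmf.items())
mean_tails = sum(sum(p for n, p in pmf.items() if n >= k)
                 for k in range(1, max(pmf) + 1))
# Both sums evaluate to 3.5, up to floating-point rounding.
```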

Linearity of expectation

Expected value is linear. For random variables $X, Y$ and constants $a, b$,

$$\mathbb{E}[aX + bY] = a \, \mathbb{E}[X] + b \, \mathbb{E}[Y].$$

Short proof. In the discrete case,

$$\mathbb{E}[aX + bY] = \sum_{x, y} (ax + by) \, P(X = x, Y = y).$$

Distributing the sum gives

$$a \sum_x x \, P(X = x) + b \sum_y y \, P(Y = y) = a \, \mathbb{E}[X] + b \, \mathbb{E}[Y].$$
This does not require the random variables to be independent.
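The independence-free claim can be tested on a deliberately dependent pair. A sketch, where $Y = X^2$ for a fair die so that $X$ and $Y$ are strongly dependent (an illustrative setup; the joint pmf is a dict keyed by $(x, y)$ pairs):

```python
# Linearity of expectation holds even for dependent X and Y.
# Here Y = X^2 for a fair die, so the joint pmf is concentrated on (k, k^2).

joint = {(k, k**2): 1/6 for k in range(1, 7)}
a, b = 2.0, -3.0

lhs = sum((a * x + b * y) * p for (x, y), p in joint.items())  # E[aX + bY]
ex = sum(x * p for (x, y), p in joint.items())                 # E[X]
ey = sum(y * p for (x, y), p in joint.items())                 # E[Y]
# lhs equals a * ex + b * ey despite the dependence.
```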

Moments

The $k$-th moment of a random variable $X$ is

$$\mathbb{E}[X^k],$$

provided the expectation exists.

  • The first moment is $\mathbb{E}[X]$, the mean.
  • The second moment is $\mathbb{E}[X^2]$.

Often one also uses centered moments, defined by

$$\mathbb{E}\big[(X - \mathbb{E}[X])^k\big].$$

The most important centered moment is the second one, the variance:

$$\operatorname{Var}(X) = \mathbb{E}\big[(X - \mathbb{E}[X])^2\big].$$
So moments describe the shape of a distribution:

  • the first moment gives its location,
  • the second centered moment gives its spread,
  • higher moments capture finer properties such as asymmetry and tail behavior.
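The moments above can be computed with one small helper. A sketch for a fair die, where the function name `moment` and the dict representation of the pmf are choices made for this illustration:

```python
# Sketch: raw and centered moments of a discrete distribution.
# The second centered moment is the variance.

def moment(pmf, k, centered=False):
    """k-th (optionally centered) moment of a pmf given as {value: prob}."""
    mu = sum(x * p for x, p in pmf.items()) if centered else 0.0
    return sum((x - mu) ** k * p for x, p in pmf.items())

die = {k: 1/6 for k in range(1, 7)}
mean = moment(die, 1)                 # first moment (location): 3.5
var = moment(die, 2, centered=True)   # second centered moment (spread): 35/12
```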