expecting

In probability theory, the expected value of a random variable is a generalization of the weighted average and intuitively is the arithmetic mean of a large number of independent realizations of that variable. The expected value is also known as the expectation, mathematical expectation, mean, average, or first moment.
By definition, the expected value of a constant random variable X = c is c. The expected value of a random variable X with equiprobable outcomes {c_1, …, c_n} is defined as the arithmetic mean of the terms c_i. If the probabilities Pr(X = c_i) of the individual outcomes are unequal, then the expected value is defined as the probability-weighted average of the c_i, i.e., the sum of the n products c_i · Pr(X = c_i).
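As a concrete check of the discrete definition above, here is a minimal Python sketch (the die values and probabilities are made-up example data, not from the text) that computes the probability-weighted average sum of c_i · Pr(X = c_i), which reduces to the plain arithmetic mean when all outcomes are equiprobable:

# Expected value of a discrete random variable as the
# probability-weighted average sum(c_i * Pr(X = c_i)).

def expected_value(outcomes, probabilities):
    """Return sum of c_i * Pr(X = c_i); probabilities must sum to 1."""
    if abs(sum(probabilities) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return sum(c * p for c, p in zip(outcomes, probabilities))

# Equiprobable outcomes: the expected value is the arithmetic mean.
fair_die = expected_value([1, 2, 3, 4, 5, 6], [1/6] * 6)
print(fair_die)  # 3.5

# Unequal probabilities: a hypothetical loaded die that favours six.
loaded_die = expected_value([1, 2, 3, 4, 5, 6],
                            [0.1, 0.1, 0.1, 0.1, 0.1, 0.5])
print(loaded_die)  # 4.5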
The expected value of a general random variable involves integration in the sense of Lebesgue.
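For an absolutely continuous random variable, that integral reduces to E[X] = ∫ x f(x) dx over the density f. The sketch below is only an illustration: it assumes NumPy and an exponential density chosen for convenience (neither appears in the text), approximates the integral with a Riemann sum, and compares the result with the arithmetic mean of many independent draws, matching the intuition in the opening paragraph:

import numpy as np

# Illustrative density (an assumption for this sketch): Exponential(rate=1),
# f(x) = exp(-x) for x >= 0, whose exact expected value is 1 / rate = 1.
rate = 1.0

def f(x):
    return rate * np.exp(-rate * x)

# Riemann-sum approximation of E[X] = integral of x * f(x) dx,
# truncated at x = 50 where the density is negligible.
x = np.linspace(0.0, 50.0, 200_000, endpoint=False)
dx = x[1] - x[0]
expectation_by_integration = np.sum(x * f(x)) * dx

# Arithmetic mean of many independent realizations of X.
rng = np.random.default_rng(seed=0)
samples = rng.exponential(scale=1.0 / rate, size=1_000_000)
expectation_by_sampling = samples.mean()

print(expectation_by_integration)  # ~ 1.0
print(expectation_by_sampling)     # ~ 1.0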
