The Chebyshev Inequality is a fundamental result in probability theory that provides a bound on the probability that a random variable deviates from its mean. It states that for any real-valued random variable X with a finite mean μ and a finite non-zero variance σ², the proportion of values that lie within k standard deviations of the mean is at least 1 − 1/k². Mathematically, this can be expressed as:

P(|X − μ| ≥ kσ) ≤ 1/k²

for k > 1. This means that regardless of the distribution of X, at least a fraction 1 − 1/k² of the values will fall within k standard deviations of the mean. The Chebyshev Inequality is particularly useful because it applies to all distributions with finite variance, making it a versatile tool for understanding the spread of data.
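As a sketch of the distribution-free claim, the following Python snippet (illustrative only; the function name and sample distribution are my own choices, not from the source) draws samples from a skewed exponential distribution and checks that the observed fraction within k standard deviations meets or exceeds the Chebyshev lower bound 1 − 1/k²:

```python
import random

def chebyshev_check(samples, k):
    """Compare the empirical fraction of samples within k standard
    deviations of the mean against the Chebyshev lower bound 1 - 1/k**2."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n  # population variance
    std = var ** 0.5
    within = sum(1 for x in samples if abs(x - mean) < k * std) / n
    bound = 1 - 1 / k ** 2
    return within, bound

random.seed(0)
# An exponential distribution is skewed, yet Chebyshev still applies.
data = [random.expovariate(1.0) for _ in range(100_000)]
for k in (2, 3):
    within, bound = chebyshev_check(data, k)
    print(f"k={k}: observed {within:.3f} >= bound {bound:.3f}")
```

Because the bound holds for every distribution with finite variance, the observed fraction should never fall below 1 − 1/k², although for any particular distribution it is usually much larger (here roughly 0.95 versus the bound of 0.75 at k = 2).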