The Shannon entropy formula is a fundamental concept in information theory introduced by Claude Shannon. It quantifies the amount of uncertainty or information content associated with a random variable. The formula is expressed as:

$$H(X) = -\sum_{i=1}^{n} p(x_i) \log_b p(x_i)$$
where $H(X)$ is the entropy of the random variable $X$, $p(x_i)$ is the probability of occurrence of the $i$-th outcome, and $b$ is the base of the logarithm, often chosen as 2 so that entropy is measured in bits. The negative sign ensures that the entropy value is non-negative: since probabilities lie between 0 and 1, their logarithms are non-positive, and negating the sum makes every term non-negative. In essence, Shannon entropy measures the unpredictability of information content; the higher the entropy, the more uncertain or diverse the information, making it a crucial tool in fields such as data compression and cryptography.
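To make the formula concrete, here is a minimal sketch in Python (the function name `shannon_entropy` and the example values are our own illustration, not taken from the original text). It computes the entropy of a known distribution in bits, and then the empirical entropy of a short string from its character frequencies, which connects to the data-compression use mentioned above:

```python
import math
from collections import Counter

def shannon_entropy(probabilities, base=2):
    """H(X) = -sum over i of p(x_i) * log_b p(x_i).

    Terms with p = 0 are skipped, following the convention 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

# A fair coin is maximally unpredictable for two outcomes: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, so its entropy falls below 1 bit.
print(shannon_entropy([0.9, 0.1]))   # ~0.47

# Empirical entropy of a string estimated from character frequencies --
# a rough lower bound on bits per symbol for a lossless compressor.
text = "abracadabra"
counts = Counter(text)
probs = [c / len(text) for c in counts.values()]
print(shannon_entropy(probs))        # ~2.04 bits per character
```

Note how the fair coin attains the maximum entropy for two outcomes (1 bit), while the biased coin, being easier to predict, carries less information per flip.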