Entropy of a Message Using Random Variables
By analogy with entropy in thermodynamics, random variables can be used to calculate the information entropy (or Shannon entropy) of a message, which is a measure of the amount of information the message carries. For a binary message, the probabilities that the symbols 0 and 1 occur are p and 1 - p, giving entropy H(X) = -p log2(p) - (1 - p) log2(1 - p).
H(X) is the entropy of a message X. The random message is generated with the symbols 0 and 1 equally likely, and the message's entropy is computed assuming the value of p from the slider.
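As a sketch of the computation described above (the function names here are illustrative assumptions, not part of the original Demonstration), the binary entropy H(p) = -p log2(p) - (1 - p) log2(1 - p) can be evaluated directly, and a random message with equally likely symbols can be generated for comparison:

```python
import math
import random

def binary_entropy(p):
    """Shannon entropy in bits/symbol of a binary source with symbol
    probabilities p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def random_message(length, seed=0):
    """Generate a random message with the symbols 0 and 1 equally likely."""
    rng = random.Random(seed)
    return [rng.randint(0, 1) for _ in range(length)]

msg = random_message(20)
# Entropy is maximal (1 bit/symbol) when both symbols are equally likely,
# and falls off as the source becomes more predictable.
print(binary_entropy(0.5))  # 1.0
print(binary_entropy(0.1))  # about 0.469
```

Varying p here plays the role of moving the slider: the entropy peaks at p = 0.5 and drops to zero at p = 0 or p = 1.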
C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, Urbana, IL: University of Illinois Press, 1963.