Entropy of a Message Using Random Variables


By analogy with entropy in thermodynamics, random variables can be used to calculate the information entropy (or Shannon entropy) of a message, which is a measure of the average amount of information the message carries. For a binary message, the probabilities that the symbols 0 and 1 occur are p and 1 - p.
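Written out, Shannon's definition for a discrete random variable X with outcome probabilities p(x_i), specialized to the two-symbol case used here, is

H(X) = -\sum_i p(x_i) \log_2 p(x_i) = -p \log_2 p - (1 - p) \log_2 (1 - p),

which reaches its maximum of 1 bit per symbol at p = 1/2 and falls to 0 bits as p approaches 0 or 1, since a certain outcome carries no information.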

H(X) is the entropy of a message X. The random message is generated with the symbols 0 and 1 equally likely, and the message's entropy is computed assuming the value of p set by the slider.
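As a minimal sketch of the same computation (not the Demonstration's actual source code; the name binaryEntropy and the message length of 1000 are illustrative assumptions), in the Wolfram Language:

    (* Shannon entropy, in bits, of a binary source with symbol probabilities p and 1 - p *)
    binaryEntropy[0 | 1] = 0;  (* a certain symbol carries no information *)
    binaryEntropy[p_] := -p Log[2, p] - (1 - p) Log[2, 1 - p]

    binaryEntropy[1/2]       (* fair source: exactly 1 bit per symbol *)
    N[binaryEntropy[9/10]]   (* biased source: about 0.469 bits per symbol *)

    (* a random message with the symbols 0 and 1 equally likely *)
    message = RandomInteger[1, 1000];

    (* empirical entropy from the observed symbol frequencies *)
    N[-Total[# Log[2, #] & /@ (Values[Counts[message]] / Length[message])]]

For a long message with 0 and 1 equally likely, the empirical value computed from the observed symbol frequencies should lie close to the theoretical maximum of 1 bit per symbol.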

Contributed by: Daniel de Souza Carvalho (March 2011)
Open content licensed under CC BY-NC-SA


Details

C. E. Shannon and W. Weaver, The Mathematical Theory of Communication, Urbana, IL: University of Illinois Press, 1963.


