Entropy of a Message Using Random Variables

By analogy with thermodynamic entropy, random variables can be used to calculate the information entropy (or Shannon entropy) of a message, which is a measure of the amount of information in the message. For a binary message, the probabilities that the symbols 0 and 1 occur are 1 − p and p, so the entropy per symbol is H(p) = −p log₂ p − (1 − p) log₂(1 − p).
H(p) is the entropy of a message m. The random message is generated with the symbols 0 and 1 equally likely, and the message's entropy is computed assuming the value of p from the slider.
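The Demonstration itself runs in the Wolfram Language; as an illustrative sketch only, the same calculation can be written in Python. The function names (`binary_entropy`, `empirical_entropy`) are assumptions, not part of the original Demonstration:

```python
import math
import random

def binary_entropy(p):
    """Shannon entropy of a binary source with P(1) = p, in bits per symbol:
    H(p) = -p log2 p - (1 - p) log2 (1 - p)."""
    if p <= 0.0 or p >= 1.0:
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def empirical_entropy(message):
    """Estimate the per-symbol entropy of a 0/1 string from the
    observed frequency of the symbol '1'."""
    p_hat = message.count("1") / len(message)
    return binary_entropy(p_hat)

# Generate a random message with 0 and 1 equally likely, as in the Demonstration.
random.seed(0)
msg = "".join(random.choice("01") for _ in range(10_000))

print(binary_entropy(0.5))    # equally likely symbols give exactly 1 bit/symbol
print(empirical_entropy(msg)) # estimate from the generated message, near 1
```

For an equiprobable binary source the entropy reaches its maximum of 1 bit per symbol; moving p toward 0 or 1 (as with the slider) drives the entropy toward 0.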




