Prediction and Entropy of Languages

Shannon's information entropy [1] is defined by $H = -\sum_i p_i \log_2 p_i$, where $p_i$ is the probability of the $i$th element and the sum runs over all elements of the distribution. Shannon's entropy is a measure of uncertainty.
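
As a minimal illustrative sketch (not this Demonstration's own source code), the entropy of a string's character distribution can be computed in the Wolfram Language roughly as follows:

    (* Shannon entropy (base 2) of the character distribution of a string *)
    shannonEntropy[s_String] := Module[{probs},
      (* estimate p_i as relative character frequencies *)
      probs = N[Tally[Characters[s]][[All, 2]]/StringLength[s]];
      -Total[probs Log[2, probs]]
    ]

    shannonEntropy["abracadabra"]  (* ≈ 2.04 bits per character *)

The built-in Entropy function should give the same result, e.g. Entropy[2, Characters["abracadabra"]], so the explicit definition above is only meant to make the formula concrete.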
Contributed by: Hector Zenil and Elena Villarreal (September 2012)
Open content licensed under CC BY-NC-SA
Reference
[1] C. E. Shannon, "Prediction and Entropy of Printed English," Bell System Technical Journal, 30(1), 1951 pp. 50–64. www.ics.uci.edu/~fowlkes/class/cs177/shannon_51.pdf.