Prediction and Entropy of Languages

Shannon's information entropy [1] is defined by H = -∑ᵢ pᵢ log₂ pᵢ, where pᵢ is the probability of the i-th element and the sum runs over the elements of the distribution. Shannon's entropy is a measure of uncertainty.
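As a minimal illustration of the definition (the helper function and the example distributions below are illustrative, not part of the Demonstration), the entropy of a discrete probability distribution can be computed directly:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits; terms with p = 0 contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

entropy([0.5, 0.5])   # fair coin: 1.0 bit of uncertainty
entropy([1.0])        # certain outcome: 0.0 bits
entropy([0.25] * 4)   # uniform over 4 outcomes: 2.0 bits
```

Entropy is maximal when all outcomes are equally likely and zero when one outcome is certain.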
One application is measuring the redundancy of a language via the frequencies of letter n-grams (single letters, pairs of letters, triplets, etc.). This Demonstration shows the frequency of n-grams calculated from the United Nations' Universal Declaration of Human Rights in 20 languages and illustrates the entropy rate calculated from these n-gram frequency distributions. The entropy of a language estimates the probabilistic information content of each letter in that language and so is also a measure of its predictability and redundancy.
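The n-gram entropy-rate estimate described above can be sketched as follows; this is a simplified illustration (the function name and sample text are hypothetical, not the Demonstration's actual code), assuming n-gram probabilities are estimated by their relative frequencies in the text:

```python
from collections import Counter
import math

def ngram_entropy(text, n=1):
    """Shannon entropy (bits) of the empirical n-gram distribution of text.

    Dividing the result by n gives an estimate of the per-letter entropy rate;
    larger n captures more of the language's letter dependencies.
    """
    grams = [text[i:i + n] for i in range(len(text) - n + 1)]
    counts = Counter(grams)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

sample = "the quick brown fox jumps over the lazy dog"
h1 = ngram_entropy(sample, 1)       # entropy of single-letter frequencies
h2 = ngram_entropy(sample, 2) / 2   # per-letter rate estimated from pairs
```

Because adjacent letters in natural language are correlated, the per-letter rate estimated from longer n-grams is typically lower than the single-letter entropy, which is exactly the redundancy the Demonstration illustrates.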




[1] C. E. Shannon, "Prediction and Entropy of Printed English," Bell System Technical Journal, 30(1), 1951, pp. 50–64.