By analogy with entropy in thermodynamics, random variables can be used to calculate the information entropy (or Shannon entropy) of a message, which is a measure of the amount of information in the message. For a binary message, the probabilities that the symbols 0 and 1 occur in the message are p and 1 - p.

H(X) = -p log2(p) - (1 - p) log2(1 - p) is the entropy of a message X. The random message is generated with the symbols 0 and 1 equally likely, and the message's entropy is computed assuming the value of p from the slider.
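The computation described above can be sketched as follows. This is a minimal illustration, assuming a binary alphabet {0, 1} and a slider value `p`; the function name `binary_entropy` and the example value `p = 0.3` are hypothetical, not from the original.

```python
import math
import random

def binary_entropy(p):
    """Shannon entropy in bits of a binary source whose symbols
    occur with probabilities p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0  # a certain outcome carries no information
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Generate a random message with 0 and 1 equally likely,
# as in the demonstration.
message = [random.randint(0, 1) for _ in range(20)]

# Entropy per symbol assuming a given slider value p
# (0.3 is an arbitrary example value).
p = 0.3
print(binary_entropy(p))    # about 0.881 bits per symbol
print(binary_entropy(0.5))  # the maximum, 1.0 bit per symbol
```

Note that the entropy peaks at p = 0.5, where the two symbols are equally likely, and falls to zero as p approaches 0 or 1.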