The entropy of a list l is defined by H(l) = -Σ p log₂ p, summing over the elements of {p₁, p₀}. p₁ and p₀ are the probabilities of black and white cells in l, respectively. The initial condition is a finite list of random bits.
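This definition can be sketched directly in Python; a minimal version (the generalization to -Σ pᵢ log₂ pᵢ over all distinct cell values, which reduces to the two-term black/white sum here) might look like:

```python
import random
from collections import Counter
from math import log2

def entropy(cells):
    """Shannon entropy (in bits) of a list of cell values,
    computed as -sum(p * log2(p)) over the distinct values."""
    n = len(cells)
    return -sum((c / n) * log2(c / n) for c in Counter(cells).values())

# A finite list of random bits, as in the initial condition described above.
bits = [random.randint(0, 1) for _ in range(1000)]
print(entropy(bits))  # close to 1 bit for unbiased random bits
```

A list with equal numbers of black and white cells gives exactly 1 bit, while a uniform list (all black or all white) gives 0.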

The entropy can be used to study the amount of information in the evolution of a cellular automaton; it is lower in ordered systems and higher in chaotic (disordered) systems.
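The contrast between ordered and chaotic evolution can be illustrated with a small experiment. The sketch below (an assumption for illustration, not the demonstration's own code) evolves an elementary cellular automaton with periodic boundaries under two rules: rule 254, which is ordered and quickly fills the tape with black, and rule 30, a well-known chaotic rule.

```python
import random
from collections import Counter
from math import log2

def entropy(cells):
    """Shannon entropy (in bits) of a list of cell values."""
    n = len(cells)
    return -sum((c / n) * log2(c / n) for c in Counter(cells).values())

def step(cells, rule):
    """One step of an elementary CA (Wolfram rule numbering),
    with periodic boundary conditions."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4
                      + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

random.seed(0)                                   # reproducible random initial bits
row = [random.randint(0, 1) for _ in range(256)]

results = {}
for rule in (254, 30):                           # 254: ordered, 30: chaotic
    r = row[:]
    for _ in range(100):
        r = step(r, rule)
    results[rule] = entropy(r)
    print(rule, round(results[rule], 3))
```

Rule 254 drives the entropy to 0 (the row becomes all black), while rule 30 keeps it close to 1 bit, consistent with the ordered-versus-chaotic distinction above.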