I’m reading a very interesting book by Seth Lloyd called “Programming the Universe: A Quantum Computer Scientist Takes on the Cosmos”. Highly recommended! I particularly enjoyed how entropy is compared to a spreading disease: a disease of ignorance. What is it all about? Let me explain briefly. In statistical mechanics, entropy is a measure of the number of states in which a system (a gas, for instance) can be found. In information theory, instead, entropy is a measure of the uncertainty of a random variable. The link between the two? Entropy is a measure of our ignorance about a physical system (so is entropy an “anthropic” quantity?).

The Second Law of thermodynamics (entropy does not decrease, on average) can be explained by means of the famous H-theorem due to Boltzmann. The theorem proves that a quantity H, whose negative can be interpreted as the entropy of the system, tends to decrease (on average). What happens is that a system, even if governed by reversible laws such as Newtonian mechanics, evolves towards the state that minimizes the H function, i.e. maximizes the entropy. At the minimum, the velocities of the constituents are distributed according to the Maxwell-Boltzmann formula. As the constituents interact with each other, e.g. via scattering, ignorance about their initial state (position and velocity) spreads until the minimum is reached. In principle, the process can be reversed (Loschmidt’s paradox), but only one experimental case is known to work in practice (the spin echo).
Having said this, I thought that a very simple 1-D model could show the increase of entropy. First of all, the information entropy of a string of N bits is given by the formula:

H = − Σ_i p_i log2(p_i)
where the sum runs over the i bins, and p_i is the probability of bin i. If the bits are independent of each other and each is equally likely to be 0 or 1, then p_i = 0.5 for each value. What is the entropy of a bit whose content is certain? Zero! There is no uncertainty about something that we already know.
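To make the formula concrete, here is a minimal sketch in Python/NumPy; the function name shannon_entropy and the example values are my own illustrative choices, not taken from the linked script:

```python
import numpy as np

def shannon_entropy(probabilities):
    """Shannon entropy H = -sum_i p_i * log2(p_i), in bits.
    Zero-probability bins contribute nothing (convention 0 * log 0 = 0)."""
    p = np.asarray(probabilities, dtype=float)
    p = p[p > 0]                        # drop empty bins
    return -np.sum(p * np.log2(p))

# A bit whose value is certain: zero entropy.
print(shannon_entropy([1.0]))           # -> 0.0
# A bit that is 0 or 1 with equal probability: one full bit of entropy.
print(shannon_entropy([0.5, 0.5]))      # -> 1.0
```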
The simple model is inspired by cellular automata and is built as follows: create a string of N bits and initialize it with random values (0 or 1). At each iteration n, the interaction between two adjacent bits i and j is represented by the logical AND: bit[n+1][i] = bit[n][i] && bit[n][j]. If all the bits are known from the start, the entropy is always zero. How can we “mask” a bit? I implemented our ignorance by using the floating-point NaN (not a number). Even if logical operations between an INT and a FLOAT are ill-defined, let’s say that whenever one of the two operands is a NaN, the outcome is a NaN. After all, how do you calculate the result of a binary operation if you don’t know the value of one of the two operands? This way, each time an INT and a NaN become adjacent, the resulting bit becomes a NaN, so “ignorance spreads”.
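Here is a minimal sketch of this rule in Python/NumPy, written from the description above rather than from the linked script; the helper names (and_with_ignorance, step, entropy), the periodic boundary, the choice of the left-hand neighbour, and the 0.5-bit-per-NaN entropy count are my own assumptions, inferred from the printed run shown below:

```python
import numpy as np

def and_with_ignorance(a, b):
    """AND of two bits, with NaN ('unknown') contagious: if either operand
    is unknown, the result is unknown."""
    if np.isnan(a) or np.isnan(b):
        return np.nan
    return float(int(a) & int(b))

def step(bits):
    """One iteration: each cell is ANDed with its left-hand neighbour,
    with periodic wrap-around (the direction in which the NaNs spread
    in the run printed below; an assumption on my part)."""
    return np.array([and_with_ignorance(bits[i], bits[i - 1])
                     for i in range(len(bits))])

def entropy(bits):
    """Count 0.5 bits (a single -p*log2(p) term with p = 0.5) per unknown
    cell. This convention is assumed because the printed run below grows
    in steps of 0.5; the linked script may count it differently."""
    return 0.5 * np.count_nonzero(np.isnan(bits))

# Same initial string as the printed run, with one bit masked as NaN.
state = np.array([0, 1, 1, 1, np.nan, 1, 1, 0, 1, 0], dtype=float)
for _ in range(12):
    print(state, entropy(state))
    state = step(state)
```

Starting from the same initial string as the run below, the unknown region grows by one cell per step until it covers the whole string.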
This is the visual evolution for 10 bits and 10 iterations:
bit string                                entropy
[ 0. 1. 1. 1. nan 1. 1. 0. 1. 0.]         0.0
[ 0. 0. 1. 1. nan nan 1. 0. 0. 0.]        1.0
[ 0. 0. 0. 1. nan nan nan 0. 0. 0.]       1.5
[ 0. 0. 0. 0. nan nan nan nan 0. 0.]      2.0
[ 0. 0. 0. 0. nan nan nan nan nan 0.]     2.5
[ 0. 0. 0. 0. nan nan nan nan nan nan]    3.0
[ nan 0. 0. 0. nan nan nan nan nan nan]   3.5
[ nan nan 0. 0. nan nan nan nan nan nan]  4.0
[ nan nan nan 0. nan nan nan nan nan nan] 4.5
[ nan nan nan nan nan nan nan nan nan nan] 5.0
[ nan nan nan nan nan nan nan nan nan nan] 5.0
[ nan nan nan nan nan nan nan nan nan nan] 5.0
As you can see, the information entropy increases (in this case, linearly) and then reaches a flat top. What does it mean? I think the explanation is straightforward: the system evolves towards the state of maximum entropy. When all the bits are NaNs, the entropy cannot increase any further. When thermodynamics is applied to the Universe, this state is called the “heat death”.
I also made a run with 200 bits and 300 iterations. See the result below.
A final consideration is needed. This simple model lacks a fundamental property: it is not invertible, i.e. from step n+1 one cannot recover step n. It would be interesting to repeat the exercise using an invertible cellular automaton. Reversible CAs were invented by Toffoli and Margolus in the early ’80s; lattice gases are an example.
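As a pointer in that direction, here is a minimal sketch of one standard way to obtain reversibility, the second-order technique (keep two time slices and XOR), which is different from the Margolus block partitioning mentioned above; the specific local rule and the names are my own illustrative choices:

```python
import numpy as np

def f(bits):
    """Any function of the current slice; here, the XOR of the two
    neighbours with periodic boundary. Reversibility does not depend
    on this particular choice."""
    return np.roll(bits, 1) ^ np.roll(bits, -1)

def forward(prev, curr):
    """Second-order update: next = f(curr) XOR prev."""
    return curr, f(curr) ^ prev

def backward(curr, nxt):
    """Exact inverse of forward: prev = f(curr) XOR next."""
    return f(curr) ^ nxt, curr

rng = np.random.default_rng(0)
a, b = rng.integers(0, 2, 16), rng.integers(0, 2, 16)
a0, b0 = a.copy(), b.copy()

for _ in range(50):                 # run forward...
    a, b = forward(a, b)
for _ in range(50):                 # ...then backward the same number of steps
    a, b = backward(a, b)

print(np.array_equal(a, a0) and np.array_equal(b, b0))   # -> True
```

Running backward step by step recovers the initial pair of slices exactly, which is precisely the property the AND model above does not have.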
Download the Python program from this link.