Because there are around 7 billion humans on the planet, the identity of a random, unknown person contains just under 33 bits of entropy (two to the power of 33 is 8 billion). When we learn a new fact about a person, that fact reduces the entropy of their identity by a certain amount. There is a formula to say how much:

ΔS = -log2 Pr(X=x)

where ΔS is the reduction in entropy, measured in bits, and Pr(X=x) is simply the probability that the fact would be true of a random person.
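As a sketch, the formula is straightforward to compute. The birthday figure below is a hypothetical illustration (assuming birthdays are roughly uniform over 365.25 days), not a claim about any real dataset:

```python
import math

def entropy_reduction(probability: float) -> float:
    """ΔS = -log2 Pr(X=x): bits of identity entropy removed by
    learning a fact true of a random person with this probability."""
    return -math.log2(probability)

# Learning someone's birthday (probability ~ 1/365.25 for a random
# person) reveals about 8.51 bits of identifying information.
print(round(entropy_reduction(1 / 365.25), 2))

# Total entropy of an unknown identity among ~7 billion people:
print(round(math.log2(7e9), 1))  # just under 33 bits
```

Note that rarer facts are more identifying: the smaller Pr(X=x), the larger the entropy reduction.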