The following post is a result of a discussion with Imre Ruzsa. Motivated by the following easy inequality in additive combinatorics,

$$|A+A|^2 \le |A| \, |A+A+A|,$$
I asked if the following was true for a finitely valued random variable $X$:

$$2H(X+X) \le H(X) + H(X+X+X). \qquad (1)$$
Here all sums are of independent copies of the random variables. The idea is that one might expect $X+X+X$ to be a bit more uniform than $X+X$.
First Imre provided a counterexample to the two-variable version of the question,

$$2H(X+Y) \le H(Y) + H(X+Y+Y).$$
I find this example particularly elegant. Let $X$ be uniform on $\{0,1\}$ and $Y$ be uniform on $\{0,2\}$. Then $X+Y$ is uniform on $\{0,1,2,3\}$, while the support of $X+Y+Y$ is $\{0,1,2,3,4,5\}$ but the distribution is not uniform (there is concentration in the middle thanks to the distribution of $Y+Y$).
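This can be verified in a few lines (a sketch of my own, taking the two-variable inequality above at face value; entropies are in bits and the helper functions are mine, not from the original post):

```python
from fractions import Fraction
from math import log2

def convolve(d1, d2):
    """Distribution of the sum of two independent discrete random variables."""
    out = {}
    for a, pa in d1.items():
        for b, pb in d2.items():
            out[a + b] = out.get(a + b, Fraction(0)) + pa * pb
    return out

def entropy(dist):
    """Shannon entropy in bits of a {value: probability} distribution."""
    return -sum(float(p) * log2(float(p)) for p in dist.values() if p > 0)

half = Fraction(1, 2)
X = {0: half, 1: half}   # uniform on {0, 1}
Y = {0: half, 2: half}   # uniform on {0, 2}

XY = convolve(X, Y)      # uniform on {0, 1, 2, 3}: entropy 2 bits
XYY = convolve(XY, Y)    # supported on {0, ..., 5}, peaked in the middle

print(entropy(XY), entropy(Y), entropy(XYY))        # 2.0 1.0 2.5
print(2 * entropy(XY) > entropy(Y) + entropy(XYY))  # True
```

The left side is $2 \cdot 2 = 4$ bits while the right side is $1 + 2.5 = 3.5$ bits, so the inequality fails.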
We then seriously questioned the validity of (1). After some discussion, Imre eventually said something about higher dimensional concentration that made me think one should check (1) for the “Gaussian.” The reason Gaussian is in quotes is that it is not finitely valued as assumed in (1), so strictly speaking we cannot check it for the Gaussian. To see if there was hope, I looked at the differential entropy of a real valued random variable $X$ with density $f$, defined via

$$h(X) = -\int_{\mathbb{R}} f(x) \log f(x) \, dx.$$
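As a sanity check on this definition, one can approximate the integral numerically for the standard Gaussian and compare with the closed form $\frac{1}{2}\log(2\pi e \sigma^2)$ used below (a quick sketch; the function names are my own):

```python
from math import e, exp, log, pi, sqrt

def gaussian_density(x, sigma2=1.0):
    """Density of a centered Gaussian with variance sigma2."""
    return exp(-x * x / (2 * sigma2)) / sqrt(2 * pi * sigma2)

def differential_entropy(f, lo=-20.0, hi=20.0, n=200_000):
    """Midpoint-rule approximation of h(X) = -integral of f log f (in nats)."""
    dx = (hi - lo) / n
    total = 0.0
    for i in range(n):
        p = f(lo + (i + 0.5) * dx)
        if p > 0:
            total -= p * log(p) * dx
    return total

h = differential_entropy(gaussian_density)
print(h, 0.5 * log(2 * pi * e))  # both ≈ 1.4189 nats
```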
Let us take $X$ to be the Gaussian with mean zero (this is irrelevant for entropy) and variance 1. Recall some basic properties of variance:

$$\mathrm{Var}(X+X) = 2\,\mathrm{Var}(X), \qquad \mathrm{Var}(X+X+X) = 3\,\mathrm{Var}(X),$$

where $\mathrm{Var}$ denotes the variance and $X+X$ is understood to be the sum of two independent copies of $X$. Thus, since a sum of independent Gaussians is Gaussian with the variances added, and $h(N(0,\sigma^2)) = \frac{1}{2}\log(2\pi e \sigma^2)$,

$$2h(X+X) - h(X) - h(X+X+X) = \log(4\pi e) - \tfrac{1}{2}\log(2\pi e) - \tfrac{1}{2}\log(6\pi e) = \tfrac{1}{2}\log\tfrac{4}{3} > 0.$$
So we indeed see that (1) is not true for the Gaussian. To construct a finitely valued random variable that does not satisfy (1), we can convolve a Bernoulli random variable with itself until (1) is not satisfied (assuming that going from discrete to continuous does not destroy (1), which is not obvious without checking, as differential entropy behaves strangely in this respect; for instance, the same argument would prove $h(2X) = h(X) + \log 2$, which is clearly not true for discrete random variables, where $H(2X) = H(X)$). Anyways, I wrote some quick python code to check this and found that for $X = B + \cdots + B$, a sum of nine independent copies of $B$, where $B$ is the random variable of a fair coin flip, we have

$$2H(X+X) > H(X) + H(X+X+X).$$
Here $X$ is supported on $\{0, 1, \ldots, 9\}$, so its entropy is bounded by the entropy of the uniform distribution on 10 elements, which is $\log_2 10 \approx 3.32$ bits.
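The quick check described above can be reproduced along these lines (a sketch rather than the original script; it convolves a fair coin with itself nine times and then tests (1) in bits):

```python
from math import log2

def convolve(d1, d2):
    """Distribution of the sum of two independent integer-valued variables."""
    out = {}
    for a, pa in d1.items():
        for b, pb in d2.items():
            out[a + b] = out.get(a + b, 0.0) + pa * pb
    return out

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * log2(p) for p in dist.values() if p > 0)

B = {0: 0.5, 1: 0.5}  # fair coin flip

X = B                 # X = B + ... + B, nine independent copies in total
for _ in range(8):
    X = convolve(X, B)

X2 = convolve(X, X)   # X + X
X3 = convolve(X2, X)  # X + X + X

print(entropy(X), entropy(X2), entropy(X3))
print(2 * entropy(X2) > entropy(X) + entropy(X3))  # True: (1) fails here
```

Exact rational arithmetic is unnecessary here: the gap is roughly $\frac{1}{2}\log_2\frac{4}{3} \approx 0.2$ bits, far above floating-point error.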