9

Suppose we have two independent random variables $X$ (with distribution $p_X$) and $Y$ (with distribution $p_Y$) which take values in the cyclic group $\mathbb{Z}_n$. Let $Z = X + Y$, where the addition is done modulo $n$. The distribution of $Z$ is given by $$p_Z = p_X \circledast p_Y$$ where $\circledast$ stands for cyclic convolution. Suppose we were interested in minimising $H(Z)$ while keeping $H(X)$ and $H(Y)$ fixed. I also impose the additional constraint that $p_X(0) > 0$ and $p_Y(0) > 0$. The conjecture is as follows:
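
For concreteness, here is a minimal sketch of the two ingredients above (Python with numpy; the helper names are my own):

```python
import numpy as np

def cyclic_convolve(p, q):
    """Distribution of Z = X + Y (mod n) for independent X ~ p, Y ~ q."""
    n = len(p)
    return np.array([sum(p[j] * q[(k - j) % n] for j in range(n))
                     for k in range(n)])

def entropy(p):
    """Shannon entropy in bits, with the convention 0 log 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

# Example on Z_4: X and Y both uniform on the subgroup {0, 2}.
p = np.array([0.5, 0.0, 0.5, 0.0])
z = cyclic_convolve(p, p)
print(z, entropy(z))  # Z is again uniform on {0, 2}, so H(Z) = 1
```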

The $p_X$ and $p_Y$ which minimise $H(Z)$ have to be supported on the smallest subgroup $G$ of $\mathbb{Z}_n$ that satisfies $$\log_2(|G|) \geq \max(H(X),H(Y)).$$

Some Remarks:

  1. The constraint that $p_X(0) > 0$ and $p_Y(0) > 0$ is essentially to rule out $X$ and $Y$ being supported on some coset of $G$. One can even assume $p_X(0)$ and $p_Y(0)$ are the maximum probabilities in the distributions $p_X$ and $p_Y$ since we can always "rotate" the distributions and still keep $H(Z)$ the same.

  2. The intuition behind this is that we have the trivial lower bound $$H(Z) \geq \max(H(X),H(Y)),$$ and when $H(Y) \leq H(X) = \log_2 |G|$, this lower bound is actually attained (take $X$ uniform on $G$ and $Y$ supported on $G$; then $Z$ is uniform on $G$ as well). Also, intuitively, the more concentrated a distribution is, the lower its entropy. So forcing $X$ and $Y$ to be supported in $G$ also ensures $Z$ is supported only on $G$, which is "small".

  3. I'm not sure, but I think one can also pose this as a norm-extremisation problem instead of an entropy one, using the fact that $$H(p_X) = \lim_{p \to 1}\frac{1}{1-p}\log_2 \sum_i p_X(i)^p$$ (the Rényi entropies converge to the Shannon entropy).

  4. I unfortunately have very little empirical evidence. An exhaustive search over all entropy-fixing distributions of $X$ and $Y$ gets out of hand very fast: my code runs in reasonable time only up to $n=5$, and since $4$ is the only composite number in that range, I have verified the conjecture only for $n=4$. Since things are much cleaner for $2^n$ than for an arbitrary composite number, maybe it's true only for $\mathbb{Z}_{2^n}$.
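
Here is a rough sketch of the kind of search I mean (Python with numpy; the grid resolution and entropy tolerance are illustrative choices of mine, not the exact code I ran):

```python
import itertools
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def cyclic_convolve(p, q):
    n = len(p)
    return np.array([sum(p[j] * q[(k - j) % n] for j in range(n))
                     for k in range(n)])

def grid_distributions(n, steps):
    """All distributions on Z_n whose entries are multiples of 1/steps."""
    for c in itertools.product(range(steps + 1), repeat=n - 1):
        if sum(c) <= steps:
            yield np.array(c + (steps - sum(c),), dtype=float) / steps

def min_entropy_of_sum(n, target_H, steps=20, tol=0.02):
    # Keep only distributions with p(0) > 0 and entropy near the target.
    cand = [p for p in grid_distributions(n, steps)
            if p[0] > 0 and abs(entropy(p) - target_H) < tol]
    p, q = min(itertools.product(cand, repeat=2),
               key=lambda pq: entropy(cyclic_convolve(pq[0], pq[1])))
    return entropy(cyclic_convolve(p, q)), p, q

print(min_entropy_of_sum(4, target_H=1.0))  # fine for n = 4; blows up fast
```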

2
  • I am not an expert in the field, but the recent activity in additive number theory seems directly relevant. Look up e.g. the "sum-product theorem" (Bourgain-Katz-Tao). You can find out about that and related stuff e.g. on Terry Tao's blog.
    – Alex Eskin
    Mar 5, 2012 at 14:30
  • I did look at that some time ago, but they seem to deal with what happens as $n$ becomes large.
    – VSJ
    Mar 5, 2012 at 17:15

2 Answers

2

The conjecture is wrong! It wasn't as complicated as I thought it was.

A simple counterexample is over $\mathbb{Z}_6$. Consider $H(X) = 1$ and $H(Y) = 1 + \epsilon$ where $\epsilon$ is very small. The conjecture would imply that $X$ and $Y$ should both be supported on $\{0,2,4\}$, and the best achievable there is $H(X+Y) \approx 1.3326$. But consider $$p_X = [0.5, 0, 0, 0.5, 0, 0]$$ and $$p_Y = [0.5 - \delta, \tfrac{\delta}{2}, \tfrac{\delta}{2}, 0.5 - \delta, \tfrac{\delta}{2}, \tfrac{\delta}{2}]$$ for the $\delta$ which adjusts the entropy to be $1 + \epsilon$. Since $p_Y$ has period $3$, $$p_X \circledast p_Y = p_Y,$$ so we have $H(X+Y) = 1 + \epsilon < 1.3326$. Thus disproved.
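
A quick numerical check (Python with numpy; the bisection is just one convenient way of pinning down the $\delta$ with $H(Y) = 1 + \epsilon$):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def p_Y(delta):
    return np.array([0.5 - delta, delta / 2, delta / 2,
                     0.5 - delta, delta / 2, delta / 2])

eps = 0.01
# H(p_Y(delta)) increases with delta on [0, 1/3]; bisect for H = 1 + eps.
lo, hi = 0.0, 1.0 / 3.0
for _ in range(60):
    mid = (lo + hi) / 2
    if entropy(p_Y(mid)) < 1 + eps:
        lo = mid
    else:
        hi = mid

py = p_Y((lo + hi) / 2)
p_X = np.array([0.5, 0, 0, 0.5, 0, 0])
z = np.array([sum(p_X[j] * py[(k - j) % 6] for j in range(6))
              for k in range(6)])
print(np.allclose(z, py))       # True: py has period 3, so p_X * p_Y = p_Y
print(entropy(py), entropy(z))  # both about 1.01, well below 1.3326
```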

0

I now think that it is quite possibly true. I explored the case that $H(X)=H(Y)=1.$

When $n=2m$ is even, we should (I'd think) take $X$ and $Y$ uniform on $\lbrace 0,m\rbrace$ and then get $Z=X+Y$ with $H(Z)=1.$ I had mistakenly suggested that otherwise we should take $X=Y$ uniform on $\lbrace 0,g\rbrace$ to get $H(Z)=1.5.$ However, it was pointed out to me that taking $X=Y$ supported on $\lbrace -g,g,0\rbrace$ with probabilities $0.1135460976, 0.1135460976, 0.7729078048$ will yield $H(Z)=1.332599058$ in the case that $g=3$ and $n=9$ (or, more generally, whenever $3g=n$). By my calculations, that is optimal for a concentration on a subgroup of order $3$. Even when $Z$ has the $5$ distinct members $\lbrace -2g,-g,0,g,2g \rbrace$, I get $H(Z)=1.468267753$, which is a bit better than $1.5$.
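
A quick check of both cases (a Python sketch; the closed forms for $p_Z$ come from convolving the weight vector $(a,a,b)$ with itself):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

a, b = 0.1135460976, 0.7729078048  # P(X = -g) = P(X = g) = a, P(X = 0) = b

# Case 3g = n (e.g. g = 3, n = 9): Z lives on {0, g, 2g} and the
# mass at +-2g wraps back onto the subgroup.
z_wrap = [b * b + 2 * a * a, 2 * a * b + a * a, 2 * a * b + a * a]
print(entropy([a, a, b]), entropy(z_wrap))  # ~1.000000 and ~1.332599

# Case 3g != n: the five sums -2g, -g, 0, g, 2g remain distinct.
z_free = [a * a, 2 * a * b, b * b + 2 * a * a, 2 * a * b, a * a]
print(entropy(z_free))                       # ~1.468268
```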

In the case that $n=9$, it seems that taking $0$ with probability $0.8607538$ and each of the other $8$ values with probability $0.017405778$ is best if we want only two distinct probability values, but this gives (once the calculations are done correctly...) an entropy of about $1.5917$.
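
Checking that figure (again a Python sketch):

```python
import numpy as np

def entropy(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

a, b = 0.8607538, 0.017405778
p = np.array([a] + [b] * 8)   # P(X = 0) = a, every other residue gets b
z = np.array([sum(p[j] * p[(k - j) % 9] for j in range(9)) for k in range(9)])
print(entropy(p), entropy(z))  # ~1.0000 and ~1.5917
```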

I still wonder what is best for $n=121$ (or some other composite $n$ with no small factors).

(See the edit history for an earlier, faulty answer.)

3
  • 1
    I have a specific example to counter the intuition that one would want to take $X$ and $Y$ to be supported on $\{0,g\}$. Take $n=9$ (odd and composite) and $H(X)=H(Y)=1$. If we support $X$ and $Y$ on $\{0,1\}$, we get $H(Z) = H(1/4, 1/2, 1/4) = 1.5$. However, if we support $X$ and $Y$ on $\{0,3,6\}$ we can take $p_X = p_Y = (0.1135, 0.1135, 0.7730)$ to get $H(Z) = 1.3326 < 1.5$.
    – VSJ
    Mar 6, 2012 at 5:37
  • 1
    I just checked your example: say $a = 0.8607538$ and $b = 0.017405778$. I took $p = [a,b,b,b,b,b,b,b,b]$ and $q = p \circledast p$. This turned out to be $q = [0.7433, 0.0321, 0.0321, 0.0321, 0.0321, 0.0321, 0.0321, 0.0321, 0.0321]$, whose entropy is pretty big, $1.5917$ to be exact. I hope I'm wrong; perhaps you could recheck your answer? Thanks for your comments.
    – VSJ
    Mar 6, 2012 at 9:32
  • You are correct. OK, I am convinced. For what it is worth, I don't think that you can do better than that $1.332599$ if you try to put values $(a,b,b,c,c,c,c,c,c)$ on $(0,3,6,1,2,4,5,7,8)$ (i.e. the best is to put $c=0$, $b=0.1135$, $a=0.7730$).
    Mar 6, 2012 at 10:02
