Suppose we have two independent random variables $X$ (with distribution $p_X$) and $Y$ (with distribution $p_Y$) taking values in the cyclic group $\mathbb{Z}_n$. Let $Z = X + Y$, where the addition is done modulo $n$. The distribution of $Z$ is given by $$p_Z = p_X \circledast p_Y$$ where $\circledast$ stands for cyclic convolution. Suppose we are interested in minimising $H(Z)$ while keeping $H(X)$ and $H(Y)$ fixed. I also impose the additional constraint that $p_X(0) > 0$ and $p_Y(0) > 0$. The conjecture is as follows:
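For concreteness, here is a minimal Python sketch (assuming NumPy; the helper names are mine) of the cyclic convolution and the entropies involved:

```python
import numpy as np

def cyclic_convolve(p_x, p_y):
    """Distribution of Z = X + Y (mod n) for independent X, Y on Z_n."""
    n = len(p_x)
    p_z = np.zeros(n)
    for i in range(n):
        for j in range(n):
            p_z[(i + j) % n] += p_x[i] * p_y[j]
    return p_z

def entropy(p):
    """Shannon entropy in bits, skipping zero-probability atoms."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# an arbitrary pair of distributions on Z_4
p_x = np.array([0.5, 0.25, 0.125, 0.125])
p_y = np.array([0.7, 0.1, 0.1, 0.1])
p_z = cyclic_convolve(p_x, p_y)
print(entropy(p_x), entropy(p_y), entropy(p_z))
```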
The $p_X$ and $p_Y$ which minimise $H(Z)$ have to be supported on the smallest subgroup $G$ of $\mathbb{Z}_n$ that satisfies $$\log_2(|G|) \geq \max(H(X),H(Y)).$$
Some Remarks:
The constraint that $p_X(0) > 0$ and $p_Y(0) > 0$ is essentially to rule out $X$ and $Y$ being supported on some coset of $G$. One can even assume $p_X(0)$ and $p_Y(0)$ are the maximum probabilities in the distributions $p_X$ and $p_Y$ since we can always "rotate" the distributions and still keep $H(Z)$ the same.
The intuition behind this is that we have the trivial lower bound $$H(Z) \geq \max(H(X),H(Y)),$$ which follows from $H(Z) \geq H(Z \mid Y) = H(X)$ (and symmetrically for $Y$), and when $H(Y) \leq H(X) = \log_2 |G|$ this lower bound is actually attained. Also, intuitively, the more concentrated a distribution, the lower its entropy. So forcing $X$ and $Y$ to be supported in $G$ also ensures $Z$ is supported only on $G$, which is "small".
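For instance, on $\mathbb{Z}_6$ with $G = \{0,2,4\}$: taking $X$ uniform on $G$ and $Y$ any distribution supported on $G$ makes $Z$ uniform on $G$, so the bound is met with equality. A quick numerical check (Python with NumPy; the helpers are the obvious ones):

```python
import numpy as np

def cyclic_convolve(p_x, p_y):
    """Distribution of Z = X + Y (mod n)."""
    n = len(p_x)
    p_z = np.zeros(n)
    for i in range(n):
        for j in range(n):
            p_z[(i + j) % n] += p_x[i] * p_y[j]
    return p_z

def entropy(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Z_6 with subgroup G = {0, 2, 4}: X uniform on G, Y supported on G
p_x = np.array([1/3, 0, 1/3, 0, 1/3, 0])
p_y = np.array([0.6, 0, 0.3, 0, 0.1, 0])
p_z = cyclic_convolve(p_x, p_y)
print(entropy(p_z))  # equals log2(3) = H(X) = max(H(X), H(Y))
```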
I'm not sure, but I think one can also pose this as a norm-extremisation problem instead of an entropy one, using the fact that $$H(p_X) = \lim_{\alpha \to 1}\frac{1}{1-\alpha}\log_2 \sum_i p_X(i)^\alpha$$ (the Rényi entropy of order $\alpha$ recovers the Shannon entropy as $\alpha \to 1$).
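As a sanity check on that limit (this is just the standard Rényi-entropy family, nothing specific to the conjecture):

```python
import numpy as np

def shannon(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def renyi(p, alpha):
    """Renyi entropy of order alpha (alpha != 1), in bits."""
    return np.log2(np.sum(p ** alpha)) / (1 - alpha)

p = np.array([0.5, 0.25, 0.125, 0.125])
for alpha in [2.0, 1.1, 1.01, 1.001]:
    print(alpha, renyi(p, alpha))
print("Shannon:", shannon(p))  # 1.75; the Renyi values approach this as alpha -> 1
```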
I unfortunately have very little empirical evidence. An exhaustive search over all possible "entropy fixing" distributions of $X$ and $Y$ gets out of hand very fast. My code runs only up to $n=5$ in reasonable time, and since $4$ is the only composite number in that range, I've verified the conjecture only for $n=4$. Since things are much cleaner for $2^n$ than for an arbitrary composite number, maybe it's true only for $\mathbb{Z}_{2^n}$.
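For what it's worth, one can at least probe the conjecture by random search rather than exhaustive search: fix a target entropy $h$, build entropy-matched full-support competitors by tempering (normalising $p_0^\beta$ and bisecting on $\beta$), and compare their $H(Z)$ against the subgroup-supported candidate. A rough sketch for $n = 4$ (the tempering trick and all names here are my own, not from the question, and this is evidence-gathering, not a proof):

```python
import numpy as np

def cyclic_convolve(p_x, p_y):
    """Distribution of Z = X + Y (mod n)."""
    n = len(p_x)
    p_z = np.zeros(n)
    for i in range(n):
        for j in range(n):
            p_z[(i + j) % n] += p_x[i] * p_y[j]
    return p_z

def entropy(p):
    """Shannon entropy in bits."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def temper_to_entropy(p0, h):
    """Return p0**beta (normalised) with entropy h; entropy decreases in beta."""
    p0 = p0 / p0.max()  # rescale so large powers never underflow to all-zero
    lo, hi = 0.0, 1.0
    while entropy(p0 ** hi / np.sum(p0 ** hi)) > h:
        hi *= 2  # grow until the tempered entropy drops below h
    for _ in range(80):
        mid = (lo + hi) / 2
        q = p0 ** mid
        q /= q.sum()
        if entropy(q) > h:
            lo = mid
        else:
            hi = mid
    q = p0 ** hi
    return q / q.sum()

rng = np.random.default_rng(0)
n, h = 4, 0.8  # target H(X) = H(Y) = h <= log2|{0,2}| = 1

# subgroup-supported candidate: two atoms on {0, 2} with binary entropy h
q = temper_to_entropy(np.array([0.9, 0.1]), h)
p_sub = np.array([q[0], 0.0, q[1], 0.0])
h_sub = entropy(cyclic_convolve(p_sub, p_sub))

# random entropy-matched full-support competitors
best = np.inf
for _ in range(2000):
    p_x = temper_to_entropy(rng.dirichlet(np.ones(n)), h)
    p_y = temper_to_entropy(rng.dirichlet(np.ones(n)), h)
    best = min(best, entropy(cyclic_convolve(p_x, p_y)))

print(h_sub, best)  # conjecturally, best should never drop below h_sub
```

This only samples pairs with $H(X) = H(Y) = h$, but the same scheme works for unequal target entropies.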