Search Results
4 votes, Accepted
Convergence of Markov chains in terms of relative entropy
It turns out that $D(p_t \| \pi)$ need not always be convex. The following paper demonstrates a counterexample in Section 4.2: http://arxiv.org/abs/0712.2578
6 votes, 1 answer, 1k views
Convergence of Markov chains in terms of relative entropy
Suppose the chain starts with initial distribution $p$ at time $0$; then at time $t$ the distribution is given by $$p_t = p\,e^{Qt}.$$
The relative entropy of $p_t$ with respect to $\pi$ (also known …
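The setup in this excerpt can be sketched numerically. The 3-state rate matrix `Q` below is an illustrative assumption (the original post does not specify a chain); the code evolves $p_t = p\,e^{Qt}$ with `scipy.linalg.expm` and tracks the relative entropy $D(p_t \| \pi)$ against the stationary distribution:

```python
import numpy as np
from scipy.linalg import expm

# Illustrative generator matrix (rows sum to zero); not from the post.
Q = np.array([[-1.0, 0.5, 0.5],
              [0.3, -0.6, 0.3],
              [0.2, 0.2, -0.4]])

# Stationary distribution pi: left null vector of Q, normalized.
w, v = np.linalg.eig(Q.T)
pi = np.real(v[:, np.argmin(np.abs(w))])
pi /= pi.sum()

def relative_entropy(p, q):
    """D(p || q) in nats, skipping zero-probability states of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

p0 = np.array([1.0, 0.0, 0.0])       # start in state 0
for t in [0.0, 1.0, 5.0, 20.0]:
    pt = p0 @ expm(Q * t)            # p_t = p e^{Qt}
    print(t, relative_entropy(pt, pi))
```

For an irreducible chain $D(p_t \| \pi)$ decreases to $0$ as $t \to \infty$; the excerpt's point is that this decay, while monotone, need not be convex in $t$.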
12 votes, 2 answers, 2k views
Proving a messy inequality
Let $H(x) = -x\log(x) - (1-x)\log(1-x)$ be the binary entropy function. …
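The binary entropy function defined in this excerpt translates directly to code; a minimal sketch (natural logarithm, with the usual convention $H(0) = H(1) = 0$):

```python
import math

def binary_entropy(x):
    """H(x) = -x log(x) - (1-x) log(1-x), in nats; H(0) = H(1) = 0."""
    if x in (0.0, 1.0):
        return 0.0
    return -x * math.log(x) - (1 - x) * math.log(1 - x)
```

$H$ is symmetric about $x = 1/2$, where it attains its maximum $\log 2$.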
5 votes, Accepted
Proving a messy inequality
I think I managed to prove the entire inequality analytically. The whole proof is a bit long to post here (about 7 pages) and involves ugly-looking expressions. I'll outline the general strategy I use …
9 votes, 2 answers, 455 views
Entropy conjecture for distributions over $\mathbb{Z}_n$
Also, intuitively, the more concentrated a distribution is, the lower its entropy. So forcing $X$ and $Y$ to be supported in $G$ also ensures $Z$ is supported only on $G$, which is "small". … An exhaustive search over all possible "entropy-fixing" distributions of $X$ and $Y$ gets out of hand very fast. …
2 votes, Accepted
Entropy conjecture for distributions over $\mathbb{Z}_n$
But consider $$p_X = [0.5, 0, 0, 0.5, 0, 0]$$ and $$p_Y = [0.5 - \delta, \frac{\delta}{2}, \frac{\delta}{2},0.5 - \delta, \frac{\delta}{2}, \frac{\delta}{2}]$$ for a suitable $\delta$ which adjusts the entropy …
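The distributions in this excerpt can be checked numerically. The concrete value of $\delta$ below is an arbitrary illustrative choice (the post leaves $\delta$ to be tuned); `cyclic_convolve` computes the law of $Z = X + Y \pmod 6$ for independent $X \sim p_X$, $Y \sim p_Y$:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in nats, skipping zero entries."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def cyclic_convolve(p, q):
    """Distribution of Z = X + Y (mod n) for independent X ~ p, Y ~ q."""
    n = len(p)
    return np.array([sum(p[i] * q[(k - i) % n] for i in range(n))
                     for k in range(n)])

delta = 0.1  # illustrative; the post only requires a "suitable" delta
pX = np.array([0.5, 0, 0, 0.5, 0, 0])
pY = np.array([0.5 - delta, delta / 2, delta / 2,
               0.5 - delta, delta / 2, delta / 2])

pZ = cyclic_convolve(pX, pY)
print(entropy(pX), entropy(pY), entropy(pZ))
```

For this choice $p_Y$ is invariant under a shift by $3$, and $p_X$ puts mass $1/2$ on each of $0$ and $3$, so the convolution returns $p_Y$ itself and $H(Z) = H(Y)$.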