
Consider a finite-state, irreducible Markov chain with rate matrix $Q$ and stationary distribution $\pi$. Suppose the chain starts with initial distribution $p$ at time $0$; then at time $t$ the distribution is given by $$p_t = p\,e^{Qt}.$$ The relative entropy of $p_t$ with respect to $\pi$ (also known as the Kullback-Leibler divergence $D(p_t \| \pi)$) is given by $$D(p_t \| \pi) = \sum_i p_t(i) \log \left(\frac{p_t(i)}{\pi(i)}\right),$$ where $i$ ranges over all states.
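For concreteness, here is a minimal numerical sketch of these quantities in Python/NumPy/SciPy (not from the original post; the 3-state rate matrix $Q$ and the initial distribution below are arbitrary illustrative choices):

```python
import numpy as np
from scipy.linalg import expm

# An arbitrary 3-state rate matrix: nonnegative off-diagonal rates, rows summing to 0.
Q = np.array([[-1.0,  0.4,  0.6],
              [ 0.5, -0.9,  0.4],
              [ 0.3,  0.7, -1.0]])

# Stationary distribution: left null vector of Q (pi Q = 0), normalized to sum to 1.
w, v = np.linalg.eig(Q.T)
pi = np.real(v[:, np.argmin(np.abs(w))])
pi /= pi.sum()

def kl(p, q):
    """Kullback-Leibler divergence D(p || q), with the convention 0*log(0) = 0."""
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

p0 = np.array([1.0, 0.0, 0.0])       # start in state 0
for t in [0.0, 0.5, 1.0, 2.0, 4.0]:
    pt = p0 @ expm(Q * t)            # p_t = p e^{Qt}, row-vector convention
    print(t, kl(pt, pi))
```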

It is well known that $D(p_t \| \pi)$ decreases monotonically as $t$ increases; this follows from the data-processing inequality, since $p_{t+s} = p_t e^{Qs}$ while $\pi e^{Qs} = \pi$. (See, for instance, Section 2.9 of Cover and Thomas's book: https://web.cse.msu.edu/~cse842/Papers/CoverThomas-Ch2.pdf.)

My question is:

Is $D(p_t || \pi)$ a convex function of $t$?

I simulated many random rate matrices in MATLAB and did not find a counterexample. Any references would be appreciated!
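A sketch of such an experiment (in Python rather than MATLAB; the random-rate-matrix construction and the grid-based second-difference test are one arbitrary way to probe convexity numerically):

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)

def random_rate_matrix(n):
    """Random rate matrix: positive off-diagonal rates (hence irreducible), rows summing to 0."""
    Q = rng.random((n, n))
    np.fill_diagonal(Q, 0.0)
    np.fill_diagonal(Q, -Q.sum(axis=1))
    return Q

def kl_curve(Q, p0, ts):
    """D(p_t || pi) evaluated on the time grid ts."""
    w, v = np.linalg.eig(Q.T)
    pi = np.real(v[:, np.argmin(np.abs(w))])
    pi /= pi.sum()
    out = []
    for t in ts:
        pt = p0 @ expm(Q * t)
        mask = pt > 0
        out.append(np.sum(pt[mask] * np.log(pt[mask] / pi[mask])))
    return np.array(out)

# Convexity check via second differences on a uniform grid:
# for a convex function these are nonnegative, so a clearly negative
# value flags a possible counterexample.
ts = np.linspace(0.0, 5.0, 200)
for trial in range(100):
    n = int(rng.integers(2, 6))
    Q = random_rate_matrix(n)
    p0 = rng.dirichlet(np.ones(n))
    d = kl_curve(Q, p0, ts)
    second_diff = d[2:] - 2 * d[1:-1] + d[:-2]
    if (second_diff < -1e-9).any():
        print("possible non-convexity at trial", trial)
```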


1 Answer


It turns out that $D(p_t \| \pi)$ need not be convex in $t$. The following paper gives a counterexample in Section 4.2: http://arxiv.org/abs/0712.2578
