Let $A$ be a $k \times n$ matrix with orthonormal rows, i.e., $AA^T = I_{k \times k}$. For $1 \leq j \leq n$, let $\alpha_j^2$ denote the squared norm of the $j$-th column of $A$; i.e., $$\sum_{i=1}^k a_{ij}^2 = \alpha_j^2.$$ Let $\Lambda = \operatorname{Diag}(\lambda_1, \lambda_2, \dotsc, \lambda_n)$ be a positive definite diagonal matrix. Define a function $F$ from the space of positive definite diagonal matrices to $\mathbb R$ by $$ F(\Lambda) = \log \lvert A \Lambda A^T\rvert-\sum_{j=1}^n \alpha_j^2 \log \lambda_j,$$ where $\lvert\cdot\rvert$ denotes the determinant.
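Note that the $\alpha_j^2$ sum to $k$: since the rows of $A$ are orthonormal, $$\sum_{j=1}^n \alpha_j^2 = \operatorname{tr}(A^T A) = \operatorname{tr}(A A^T) = k.$$ (This identity is used below.)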
To show: $F(\Lambda) \geq 0$, with equality only when $\Lambda$ is a multiple of the identity.
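For example, if $\Lambda = cI_n$ with $c > 0$, then $A\Lambda A^T = cI_k$, and using $\sum_{j=1}^n \alpha_j^2 = k$ we get $$F(cI_n) = \log\lvert cI_k\rvert - \sum_{j=1}^n \alpha_j^2 \log c = k\log c - k\log c = 0,$$ so multiples of the identity do attain equality.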
Background: This result is true, and it follows from an inequality known in information theory as the Zamir–Feder entropy power inequality: one applies that inequality to independent Gaussian random variables with variances $\lambda_j$. (A statement of the inequality can be found in Rioul, "Information-theoretic proofs of entropy power inequalities," Proposition 9, (65c).)
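For concreteness, here is the routine entropy computation behind that reduction (this is not the cited inequality itself, just the bookkeeping): if $X_1, \dotsc, X_n$ are independent with $X_j \sim \mathcal N(0, \lambda_j)$, then $AX \sim \mathcal N(0, A\Lambda A^T)$, so $$h(AX) = \tfrac12 \log\big((2\pi e)^k\,\lvert A\Lambda A^T\rvert\big), \qquad h(X_j) = \tfrac12 \log(2\pi e\,\lambda_j),$$ and, since $\sum_{j=1}^n \alpha_j^2 = k$, $$F(\Lambda) = 2\Big(h(AX) - \sum_{j=1}^n \alpha_j^2\, h(X_j)\Big).$$ Thus the claim $F(\Lambda) \geq 0$ is equivalent to $h(AX) \geq \sum_{j=1}^n \alpha_j^2\, h(X_j)$ for this Gaussian choice.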
I am interested in knowing whether there is an alternative proof of this result that does not rely on entropy inequalities and instead uses linear-algebraic tools.