Let $a_{ij}$ be the elements of an $n \times n$ covariance matrix. Can we prove the following?
$$ 1-\sum_{k=1}^n a_{ik} \lambda_k + \sum_{j=1}^n \sum_{k=1}^n \lambda_j a_{jk} \lambda_k > 0, \qquad i \in \{1, \dots, n\} $$
where the $\lambda_k$ are constrained by
$$ \sum_{k=1}^n \lambda_k = 1, \quad \lambda_k > 0 \text{ for } k \in \{1, \dots, n\}. $$
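For concreteness, this is just the $n=2$, $i=1$ case of the displayed inequality written out (using $a_{12}=a_{21}$ by symmetry; nothing new is assumed):
$$ 1 - (a_{11}\lambda_1 + a_{12}\lambda_2) + a_{11}\lambda_1^2 + 2 a_{12}\lambda_1\lambda_2 + a_{22}\lambda_2^2 > 0. $$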
Or, more generally: what conditions must the elements of the covariance matrix satisfy so that the above set of inequalities holds?
NOTE: in matrix form, if $A$ is a covariance matrix and $a_i$ is the row vector given by the $i$-th row of $A$, the question is: does $1 - a_i \lambda + \lambda^T A \lambda > 0$ hold, where $\lambda = [\lambda_1 \ \ldots \ \lambda_n]^T$, $\lambda^T e_n = 1$ with $\lambda_k > 0$, and $e_n$ is a column vector of ones? Equivalently: find the conditions on the elements of the matrix $A$ so that the inequality holds for $i = 1, \ldots, n$.
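One can at least probe the inequality numerically. Below is a minimal sketch (assuming NumPy; generating a covariance matrix as $B B^T$ with random $B$ is purely illustrative) that evaluates the left-hand side $1 - a_i\lambda + \lambda^T A \lambda$ for all $i$, for a random $A$ and a random $\lambda$ on the simplex:

```python
import numpy as np

rng = np.random.default_rng(0)

def lhs(A, lam):
    """Vector of left-hand sides 1 - a_i^T lam + lam^T A lam, one entry per i."""
    quad = lam @ A @ lam           # scalar lam^T A lam
    return 1.0 - A @ lam + quad    # A @ lam has i-th entry a_i^T lam

# Illustrative covariance matrix: B B^T is symmetric positive semidefinite.
n = 4
B = rng.standard_normal((n, n))
A = B @ B.T

# Random lambda with positive entries summing to 1.
lam = rng.random(n)
lam /= lam.sum()

vals = lhs(A, lam)
print(vals, (vals > 0).all())
```

Running this over many random draws either supports the conjecture or produces a violating pair $(A, \lambda)$, which would indicate that additional conditions on the elements of $A$ are indeed needed.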