My approach, via hyperbolic polynomials, falls a bit short of proving the conjecture. First of all, the bound from my inequality
is $$k^n G(k)^{n-k} \frac{k!}{k^k},$$
where $G(k) = \left(\frac{k-1}{k}\right)^{k-1}$; this bound is an integer only if $k = 1, 2, n$, so it cannot be the minimum of the permanents of integer matrices.
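As a quick numerical sanity check, one can evaluate this bound directly (a minimal sketch; `gurvits_bound` is a name I am introducing here, not from the text):

```python
from math import factorial

def gurvits_bound(n: int, k: int) -> float:
    # The lower bound k^n * G(k)^(n-k) * k!/k^k, with G(k) = ((k-1)/k)^(k-1).
    G = ((k - 1) / k) ** (k - 1)
    return k ** n * G ** (n - k) * factorial(k) / k ** k

print(gurvits_bound(5, 3))   # 32/3 = 10.666..., not an integer
print(gurvits_bound(5, 5))   # 120.0 = 5!, an integer (the k = n case)
```

Since a permanent of an integer matrix is an integer, a non-integer value like the $k=3$, $n=5$ case above already shows the bound is not tight there.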
Nobody knows the exact value of the minimum for a given $(k,n)$; I have not even seen a conjecture on it. This is why the sparse problem is so much more interesting than, say,
the van der Waerden Conjecture. More seriously, my approach actually
needs the degrees of the variables $x_i$ in the polynomials $(\partial_n \cdots \partial_{i+1}) \prod_{1 \leq m \leq n} \sum_{1 \leq j \leq n} A(m,j) x_j$. There is a simple upper bound
on those degrees in terms of the sparsity, but it is not sharp.
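To make the degree condition concrete, here is a small sympy sketch (the $3 \times 3$ matrix with row and column sums $k = 2$ is my illustrative choice, not from the text) that builds the product polynomial and records the degree of $x_i$ after applying $\partial_n \cdots \partial_{i+1}$:

```python
import sympy as sp
from functools import reduce

n = 3
# An illustrative 0/1 matrix with all row and column sums equal to k = 2.
A = [[1, 1, 0],
     [0, 1, 1],
     [1, 0, 1]]
x = sp.symbols(f'x1:{n + 1}')

# The product polynomial prod_m sum_j A(m,j) x_j.
p = reduce(lambda u, v: u * v,
           [sum(A[m][j] * x[j] for j in range(n)) for m in range(n)])

# For each i, apply d/dx_n ... d/dx_{i+1} and record the degree of x_i.
degrees = {}
for i in range(n):
    q = p
    for m in range(n - 1, i, -1):
        q = sp.diff(q, x[m])
    degrees[i + 1] = int(sp.degree(sp.expand(q), x[i]))

print(degrees)   # {1: 1, 2: 2, 3: 2} for this matrix
```

For this matrix the degrees hit the simple bound $\min(i, k)$, but in general that bound need not be attained.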
Actually $k^n G(k)^{n-k} \frac{k!}{k^k}$ is attainable in this general setting; see my latest paper on ECCC.
Now, back to the Ryser conjecture. Let $A$ be an $n \times n$ minimizer, and let
$a(n)$ be the number of its boolean rows (the same holds for columns).
It follows from my approach plus the known upper bound
due to Schrijver that $$\lim_{n \rightarrow \infty} \frac{\min_{\text{over minimizers}} a(n)}{n} = 1.$$
BTW, the same applies to mixed discriminants.
An important new improvement (amazingly, suggested by Belief Propagation / the Bethe Approximation): $per(P) \geq \prod_{i,j} (1-p(i,j))^{1-p(i,j)}$, where $P = [p(i,j)]$ is doubly stochastic; see http://arxiv.org/abs/1106.2844. Let $A$ be an integer matrix with all row and column sums equal to $k$, and let $n(l)$ be the number of entries of $A$ equal to $l$. Applying this new (entropic) lower bound to $P = \frac{1}{k}A$ gives the following lower bound:
$$per(A) \geq k^n \prod_{1 \leq l \leq k} \left(\frac{k-l}{k}\right)^{\frac{(k-l)\, n(l)}{k}}.$$
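A small sketch checking this final inequality on a toy example (the helper names `permanent` and `entropic_bound`, and the circulant matrix, are my illustration, not from the paper): it computes the exact permanent by brute force and compares it with the entropic bound.

```python
from itertools import permutations
from math import prod

def permanent(M):
    # Exact permanent by brute force; fine for small n.
    n = len(M)
    return sum(prod(M[i][s[i]] for i in range(n))
               for s in permutations(range(n)))

def entropic_bound(A, k):
    # k^n times, for each entry a > 0, a factor ((k-a)/k)^((k-a)/k);
    # grouping entries by value recovers the product over l above.
    n = len(A)
    bound = float(k) ** n
    for row in A:
        for a in row:
            if a > 0:
                bound *= ((k - a) / k) ** ((k - a) / k)
    return bound

# A 4x4 0/1 circulant matrix with all row and column sums equal to k = 2.
A = [[1, 1, 0, 0],
     [0, 1, 1, 0],
     [0, 0, 1, 1],
     [1, 0, 0, 1]]

# The bound (about 1.0) sits below the exact permanent (2).
print(permanent(A), entropic_bound(A, 2))
```

Here $n(1) = 8$, so the bound is $2^4 \cdot (1/2)^{8/2} = 1$, while the exact permanent is $2$ (only the identity and the full cycle survive).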