$n$ players indexed $1,2,...,n$ play a game of mock duel. The rules are simple: starting from player $1$, the players take turns to act in the order $1,2,...,n,1,2,...$. On his turn, a player randomly chooses one of the other remaining players as his target and fires one shot at him. If hit, the target is removed from the game. Whoever survives to the end is the winner. All players have the same marksmanship $0\lt p\le 1$, meaning that each shot hits its target with probability $p$.
An example: if $p=1$ and $n=3$, player $1$ randomly chooses $2$ or $3$ as his target and fires one shot. The player who was not targeted (and thus survives) shoots next, firing at the only other remaining player, player $1$. Hence in this case the three players have winning probabilities $0,0.5,0.5$ respectively.
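The rules can also be checked by direct simulation. Below is a quick Monte Carlo sketch (my own code, not part of the question; the name `simulate_game` is made up) that plays the game as described above and estimates the winning probabilities empirically.

```python
import random

def simulate_game(n, p, rng=random):
    """Play one game with players 1..n and return the winner's index."""
    alive = list(range(1, n + 1))
    turn = 0                                # position in `alive` of the next shooter
    while len(alive) > 1:
        shooter = alive[turn]
        target = rng.choice([x for x in alive if x != shooter])
        if rng.random() < p:                # the shot hits: remove the target
            t = alive.index(target)
            alive.pop(t)
            if t < turn:
                turn -= 1                   # the shooter's position shifted left
        turn = (turn + 1) % len(alive)      # the next surviving player acts
    return alive[0]

# With p = 1 and n = 3 the estimates come out close to (0, 0.5, 0.5),
# matching the example above.
wins = [0] * 4
for _ in range(100_000):
    wins[simulate_game(3, 1.0)] += 1
print([w / 100_000 for w in wins[1:]])
```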
Question: which player has the highest winning probability for large $n$'s?
To be more specific, define $B(n)$ to be the player with the highest winning probability in the $n$-player game. Is there any regularity in where $B(n)$ tends to occur in $(1,2,...,n)$? In particular:
Does $\lim_{n \to \infty} B(n)/n$ exist for any $p$?
Is $\lim_{n \to \infty} B(n)=1$ if $p$ is sufficiently small?
Edit: Clarified the rules to address possible ambiguity suggested in the comments.
If my calculation is correct, the winning probability of player $k$ in the $n$-player game, $s(n,k)$, is given by the recursive formula:
$$ \begin{align} & s(n,1)=(1-p)\,s(n,n)+p\,s(n-1,n-1)\\ & s(n,2)=(1-p)\,s(n,1)+\frac{n-2}{n-1}p\,s(n-1,1)\\ & s(n,k)=(1-p)\,s(n,k-1)+\frac{n-k}{n-1}p\,s(n-1,k-1)+\frac{k-2}{n-1}p\,s(n-1,k-2) \end{align} $$ for $k\gt 2$, with base case $s(1,1)=1$. Here the indices are relative to the next shooter: after each shot the remaining players are relabeled so that player $1$ is always the one about to fire.
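A minimal sketch (my own code, not from the question; the name `duel_probabilities` is made up) of how the table and $B(n)$ can be computed from this recursion: since $s(n,1)$ depends on $s(n,n)$ within the same level, each level is solved by unrolling the miss chain $s(n,k)=(1-p)s(n,k-1)+c_k$ and solving the resulting fixed point for $s(n,1)$; the whole table up to $N$ then costs $O(N^2)$ arithmetic operations.

```python
def duel_probabilities(N, p):
    """Tabulate s(n, k) level by level; return (s_N, B), where s_N[k] is
    player k's winning probability in the N-player game (1-indexed) and
    B[n] is the argmax of s(n, .) for n = 1..N."""
    q = 1.0 - p
    prev = [None, 1.0]                      # base case: s(1, 1) = 1
    B = [None, 1]
    for n in range(2, N + 1):
        # "Hit"-branch contributions c[k], which only involve level n-1.
        c = [None] * (n + 1)
        c[1] = p * prev[n - 1]
        c[2] = p * (n - 2) / (n - 1) * prev[1]
        for k in range(3, n + 1):
            c[k] = p * ((n - k) * prev[k - 1] + (k - 2) * prev[k - 2]) / (n - 1)
        # s(n, k) = q*s(n, k-1) + c[k] for k >= 2 and s(n, 1) = q*s(n, n) + c[1],
        # so unroll the chain and solve the resulting fixed point for s(n, 1).
        tail = sum(q ** (n - k + 1) * c[k] for k in range(2, n + 1))
        row = [None] * (n + 1)
        row[1] = (c[1] + tail) / (1.0 - q ** n)
        for k in range(2, n + 1):
            row[k] = q * row[k - 1] + c[k]
        B.append(max(range(1, n + 1), key=row.__getitem__))
        prev = row
    return prev, B
```

For $p=1$ and $N=3$ this returns the winning probabilities $(0,0.5,0.5)$ from the example above.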
For large $n$ and $p=1$, a plot of $s(n,k)$ against $k$ looks like a sinusoid:
If you plot the peaks of these curves (i.e. $B(n)$) against $n$, you get this:
At first glance it seems that $B(n)$ just oscillates between $1$ and $n$ in a self-similar fashion. But in fact the tips of the hills drift further and further away from the diagonal line $B(n)=n$ as $n$ increases, and the valleys also grow flatter:
One might wonder whether the hills will eventually shrink to small bumps, or at least be flattened relative to the diagonal line, which would mean $\lim_{n \to \infty} B(n)/n=0$.
On the other hand, $B(n)=1$ occurs more and more often as $p$ gets smaller. The blue curves below are $B(n)$ for $p=0.5,0.48,0.46$ respectively, for $n$ up to $5000$ (larger $n$ becomes increasingly expensive to compute):
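(As an aside, a hypothetical driver along these lines, reusing `duel_probabilities` from the sketch further above together with standard matplotlib calls, could be used to reproduce such curves; the details here are mine, not the asker's.)

```python
import matplotlib.pyplot as plt

for p in (0.5, 0.48, 0.46):
    # The post goes up to n = 5000; 2000 is used here just to keep it quick.
    _, B = duel_probabilities(2000, p)
    plt.plot(range(1, len(B)), B[1:], label=f"p = {p}")
plt.xlabel("n")
plt.ylabel("B(n)")
plt.legend()
plt.show()
```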
Can we claim that $\lim_{n \to \infty} B(n)=1$ for sufficiently small $p$? Or are there always exceptions?