Murder game


When playing the murder game with colleagues, what are the odds you have to kill all of them to win? Let's do the math.

During our company retreat, we played a game with the name "moordspel", or "murder game". Everyone wrote their name on a card and put it in a hat. After that, everyone drew one card out of the hat. This would be their victim. When someone was alone in a room with their victim, they would "murder" them (don't worry, we're all fine) and take their victim's card. The person on that card would be their new victim. The game ends when only one person is left.

Or does it?

It is possible for someone to murder people until they get their own card, only to find out there are still other people alive. A very easy example occurs when there are four people playing, with the following draw:

  • A drew B
  • B drew A
  • C drew D
  • D drew C

With this draw, once A and C have committed their murders, the game will be over with two people left. With another draw, however, there will always be exactly one person left at the end. For example:

  • A drew B
  • B drew C
  • C drew D
  • D drew A

We can represent these draws by directed graphs, drawing arrows from murderers to victims. The first draw becomes:
[Figure: the first draw as two separate cycles: A → B → A and C → D → C]

While the second becomes
[Figure: the second draw as one full cycle: A → B → C → D → A]

These graphs make it clear that whether the game ends with a single survivor depends only on the draw, not on what happens afterwards. To see the "path" a murderer will take, just follow the arrows; eventually, we end up where we began. The question is whether all players are part of the same "cycle".
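To make the cycle idea concrete, here is a small Python sketch (the function name is mine, not from the post) that follows the arrows of a draw and checks whether everyone sits in one cycle:

```python
def is_full_cycle(draw):
    """Return True if the draw (murderer -> victim) forms one single cycle."""
    start = next(iter(draw))
    current, visited = start, set()
    while current not in visited:
        visited.add(current)
        current = draw[current]
    # we are back where we began; a full cycle must have visited every player
    return current == start and len(visited) == len(draw)

# the two four-player draws from the text
print(is_full_cycle({"A": "B", "B": "A", "C": "D", "D": "C"}))  # False
print(is_full_cycle({"A": "B", "B": "C", "C": "D", "D": "A"}))  # True
```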

So, what is the chance that everyone is part of the same cycle?

Let's assume we play with $n$ players, and go step by step. If we get one cycle everyone is part of, we will call it a success, otherwise, a fail.
Person A is the first one to draw a card. After this, are we still ok? We are if A did not draw their own card, so in this step, there is an ${n-1} \over n$ chance we can move on without failing. We will call the person on A's card B.
Since it does not really matter in what order we draw the cards, let's say A's victim, B, is the next one to draw a card. After this, are we still ok? Well, we don't have to worry about B drawing their own card (since it was already drawn by A), but it will go wrong if B draws A. So, of the remaining $n-1$ cards, B can draw $n-2$ without problems. We have an ${n-2} \over {n-1}$ chance of not failing in this step.
With the same logic, B's victim C can draw $n-3$ cards out of $n-2$, and so on.
This goes on until we arrive at the second-to-last person, who has a $1 \over 2$ chance of drawing correctly. The last person has only one card left to draw, and if everything has gone correctly, this card will say "A".
So, multiplying this, the probability of everything going correctly is
$$\frac{n-1}{n} \cdot \frac{n-2}{n-1} \cdot \frac{n-3}{n-2} \cdots \frac{2}{3} \cdot \frac{1}{2} = \frac{(n-1)!}{n!} = \frac{1}{n}$$

Another way of looking at this: after A draws their card (with a ${n-1} \over n$ chance of going correctly), there is a $1 \over {n-1}$ chance that A's own card is the last one to be drawn, and if (and only if) this happens, we have a full cycle. So, we have a chance of ${{n-1} \over {n}} * {{1} \over {n-1}} = {1 \over n}$.
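For small $n$, the $1 \over n$ result can also be verified by brute force, enumerating all $n!$ possible draws (a Python sketch; the function name is mine):

```python
from fractions import Fraction
from itertools import permutations

def full_cycle_probability(n):
    """Exact probability that a uniform random draw is one full cycle."""
    full = total = 0
    for perm in permutations(range(n)):  # perm[i] is the victim of player i
        total += 1
        # follow the arrows from player 0 and measure the cycle length
        length, current = 0, 0
        while True:
            current = perm[current]
            length += 1
            if current == 0:
                break
        if length == n:
            full += 1
    return Fraction(full, total)

for n in range(2, 7):
    print(n, full_cycle_probability(n))  # prints 1/n for each n
```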

However, it gets more complicated.

Of course, a situation in which one of the players has their own card is not desirable, since this will be a less than interesting game for them. So, we adopt a new rule: if someone draws their own card, they will speak up immediately and we will start the whole process from the beginning. How does this change the chance of a full cycle?

At first, this seems simple. Following the above steps, the only step where this really makes a difference is the first step. A can draw their own card, but B can't, since their card is already out of the hat: A has it. The same goes for C, D, etc. So, can we just take the first term out of the formula? This would give a $1 \over {n-1}$ chance with the new rules.

Actually, it is not that simple.

A quick example with three players immediately shows there is something wrong here. Without the possibility of someone drawing their own card, a full cycle is the only outcome with three players, so there is a chance of 1. However, evaluating our new formula $1 \over {n-1}$ for $n=3$ gives a chance of $1 \over 2$. What is wrong here?

Well, with the new rules, there are not two, but three kinds of draws:

  1. full cycle
  2. not full cycle, and no one has their own card
  3. someone has their own card, we have to start over

Let's call the probabilities of ending up in these piles $P_1$, $P_2$ and $P_3$ respectively. We know that $P_1+P_2+P_3=1$. As said before, ${P_1}={1 \over n}$. We are interested in $P_1 \over {P_1+P_2}$.

Now let's look back at our step-by-step process again, and let's say someone other than the last person draws A's card, thereby closing the cycle too soon. Does this draw end up in pile 2? Not necessarily. There are still some players left who have not drawn a card, and one of them might still draw their own, sending this draw to pile 3, where it will not count for our calculation. If there are $m$ players left (so we have an $(n-m)$-cycle), what is the probability of this draw ending up in pile 3? It is the probability of someone drawing their own card in a draw among $m$ players, which depends only on $m$. Let's call this probability $P_{sd}(m)$.

It is easy to derive that:
$P_{sd}(1) = 1$
$P_{sd}(2) = 1/2$
$P_{sd}(3) = 2/3$
From there, it gets more complicated, but the number converges to $1-{1 \over e}$ (approximately 0.632) as $m$ goes to infinity (source: Wikipedia).
This means that, even when the step-by-step process fails before the end, there is still a chance of approximately $2 \over 3$ (depending on when it fails exactly) that this draw will not count and the process needs to be repeated.
So, $P_2$ is not ${n-1} \over n$, but roughly a third of that (more precisely, a fraction $1 \over e$ of it), meaning $P_1 \over {P_1+P_2}$ roughly triples for large $n$ (or rather, gets multiplied by $e$). But can we find an exact expression for this probability?
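For anyone who wants to check the $P_{sd}(m)$ values above, they can be computed by brute force over all $m!$ possible draws among the remaining $m$ players (a Python sketch; the function name is my own):

```python
from itertools import permutations

def p_self_draw(m):
    """Probability that at least one of m players draws their own card,
    computed by checking every one of the m! possible draws."""
    perms = list(permutations(range(m)))
    with_self_draw = sum(any(p[i] == i for i in range(m)) for p in perms)
    return with_self_draw / len(perms)

# matches the values in the text, and m=8 is already close to 1 - 1/e
print([round(p_self_draw(m), 3) for m in (1, 2, 3, 8)])
```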

When you are not smart enough to figure out the right calculation by yourself, there are only two things left to do. One of them: Monte Carlo simulation. A Monte Carlo simulation confirms the hypothesis: if self-draws are allowed, a full cycle occurs approximately $1 \over n$ of the time, and if self-draws are not allowed, this probability roughly triples. The Monte Carlo does not give an exact expression though, and we still want one. So, we do the other thing left: look it up on Wikipedia.
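As a sketch, such a Monte Carlo simulation in Python (function name and trial count are my choices) could look like this:

```python
import random

def simulate(n, forbid_self_draws, trials=100_000):
    """Estimate the probability of a full cycle over many random draws."""
    full = 0
    for _ in range(trials):
        while True:
            draw = list(range(n))
            random.shuffle(draw)  # draw[i] is the victim of player i
            # under the new rule, a draw with a self-card is thrown out
            if not forbid_self_draws or all(draw[i] != i for i in range(n)):
                break
        # follow the arrows from player 0 and measure the cycle length
        length, current = 1, draw[0]
        while current != 0:
            current = draw[current]
            length += 1
        if length == n:
            full += 1
    return full / trials

print(simulate(18, forbid_self_draws=False))  # close to 1/18, about 0.056
print(simulate(18, forbid_self_draws=True))   # close to e/18, about 0.151
```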

Permutations of the sequence $1,...,n$ with no fixed points (in other words: draws in which no one has their own card) are called derangements. $P_1+P_2$ is equal to $!n \over n!$, in which $!n$ is the number of derangements of $n$ elements, and $n!$ is $n$ factorial (or $1 * 2 * ... * n$). The formula for $!n$ is actually quite complicated (see Wikipedia), but for people who like inelegant solutions, it is also given by $!n = round({n! \over e})$.

So, now we have the formula we want: for $n$ people, the chance of a full cycle is given by:
$$\frac{1/n}{!n/n!} = \frac{1/n}{\operatorname{round}(n!/e)/n!} = \frac{(n-1)!}{\operatorname{round}(n!/e)}$$
Which is approximately equal to $e \over n$. The Monte Carlo simulation confirms this.
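The closed form is easy to evaluate directly (a Python sketch, using the $round({n! \over e})$ identity above; the function name is mine):

```python
from math import e, factorial

def full_cycle_chance(n):
    """Chance of a full cycle when self-draws force a complete re-draw."""
    subfactorial = round(factorial(n) / e)  # !n = round(n!/e)
    return factorial(n - 1) / subfactorial

for n in (3, 4, 18):
    print(n, round(full_cycle_chance(n), 3), round(e / n, 3))
```

For $n=3$ this gives exactly 1, matching the three-player example, and for $n=18$ it gives 0.151.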

In our case, there were 18 people, which gives us a probability of 0.151 of getting a full cycle. We did not. We turned out to have three cycles. What are the odds, right? At least we can say that murdering people puts the fun into fundamental mathematics.

Jan-Willem Tel, Data Scientist