Flippin' coins...and quantum mechanics

A measurement of the $z$-component of the spin of an electron always returns one of two values: spin up or spin down.

Tossing coins

Let a coin represent a particle:

  • Its two possible states after a toss are heads "H" and tails "T".
  • These can represent the possible states of an electron.
  • In a uniform magnetic field, the two spin states each have a different energy. Call these $E_H$ and $E_T$.
  • The energy of a collection of $N$ electrons (coins) is $$E=N_H E_H+N_T E_T$$ where $N_H+N_T=N$.
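
    For example, four coins (electrons) showing three heads and one tail have $N_H=3$, $N_T=1$, and total energy $E = 3E_H + E_T$.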

    A 'fair' coin has an equal probability (0.5) to be found in either state. (This might *not* be true of electron spins, if there is some external field present.)

    Even with many, many coins, there are only two energy levels available for any individual particle, and these energy levels are the same for each particle. We can indicate the two single-coin states as $H$ and $T$ or $1$ and $2$.

    Therefore, knowing how many heads, $N_H=N_1$, and how many tails, $N_T=N_2$, are present is enough to uniquely specify the macrostate (the state with a given total energy) of the system. [We don't need to know exactly which coins are heads and which are tails.]

    We can write the arrangement of particles in the two levels as $\left\{N_H, N_T\right\}$, or number the energy levels and write $\left\{N_1, N_2\right\}$. As we flip more and more coins, there are more and more possible ways of arranging heads and tails....

    $N=1$

    {1,0}, {0,1} : 2 macrostates

    $N=2$

    {2,0}, {1,1}, {0,2}: 3 macrostates

    $N=3$

    {3,0}, {2,1}, {1,2}, {0,3}: 4 macrostates

    ...
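
    The counting pattern above is easy to reproduce by brute force. Below is a minimal Python sketch (standard library only; the variable names are just illustrative) that lists the macrostates $\left\{N_1, N_2\right\}$ for a few small $N$:

```python
# List the macrostates {N_1, N_2} for N = 1, 2, 3 coins.
# A macrostate is just a pair (heads, tails) with N_1 + N_2 = N,
# so there are N + 1 of them.
for N in range(1, 4):
    macrostates = [(N_1, N - N_1) for N_1 in range(N, -1, -1)]
    print(f"N = {N}: {macrostates}  ->  {len(macrostates)} macrostates")
```

    Running it reproduces the counts 2, 3, 4 listed above.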

    For $N=2$ coins, number the macrostates with $k$...

    macrostate                        microstate            probability
    $k$    $\left\{N_H,N_T\right\}$   Coin 1    Coin 2      $w_k$    $P_k$
    1      {2,0}                      H         H           1        1/4
    2      {1,1}                      H         T           2        2/4
                                      T         H
    3      {0,2}                      T         T           1        1/4

    For 2 coins the total number of microstates is 4.
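
    That little table can also be generated by brute force. Here's a minimal Python sketch (standard library only) that enumerates the four microstates of two coins and groups them into macrostates:

```python
from itertools import product
from collections import Counter

# Enumerate every microstate of 2 coins and group them by
# macrostate (number of heads, number of tails).
microstates = list(product("HT", repeat=2))   # ('H','H'), ('H','T'), ('T','H'), ('T','T')
w = Counter((m.count("H"), m.count("T")) for m in microstates)

Omega = len(microstates)                      # total number of microstates: 4
for macrostate, w_k in w.items():
    print(macrostate, " w_k =", w_k, " P_k =", w_k / Omega)
```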




    Relative probability

    The relative probabilities of the different macrostates, indexed by $k$, are captured by:

    $w_k \equiv$ the number of ways of arranging particles to get macrostate $k$.

    Also called the thermodynamic probability, these $w_k$ do not need to add up to 1.

    The total number of microstates is... $$\Omega = \sum_k w_k.$$

    Since all the microstates are equally likely, the probability of a particular macrostate $k$ is: $$P_k = \frac{w_k}{\Omega}.$$
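
    For the 2-coin table above this reads $\Omega = 1 + 2 + 1 = 4$, so $P_{\{2,0\}} = P_{\{0,2\}} = 1/4$ and $P_{\{1,1\}} = 2/4 = 1/2$, matching the last column of the table.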

    We can calculate something like the average number of heads. Let $N_{jk}$ be the number of particles in state $j$ (e.g. $j=1$ means heads, $j=2$ means tails) in macrostate $k$. Then... $$\overline{N_j} = \sum_k N_{jk}P_k=\frac{\sum_k N_{jk}w_k}{\Omega}.$$

    So for 2 coins, the average number of heads is... $$\overline{N_1} = \left( 2\times 1+1\times 2+0\times 1\right)/4 = 1.$$

    And, in general, we'd guess that $\overline{N_1}=N/2$.
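
    That guess is easy to check by brute force. A short Python sketch in the same spirit as the ones above (standard library only):

```python
from itertools import product

# Average the number of heads over all 2**N equally likely microstates.
for N in (2, 3, 6):
    microstates = list(product("HT", repeat=N))
    avg_heads = sum(m.count("H") for m in microstates) / len(microstates)
    print(f"N = {N}: average number of heads = {avg_heads}")   # comes out to N/2
```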

    To do for $N=6$ coins:

    1. What is the total number of macrostates?

    2. Can you generalize this? What's the total number of macrostates for $N$ coins?

    3. What's the total possible number of microstates (arrangements) of the six coins? (This is $\Omega$.)

    4. List all the microstates for the macrostate with 2 heads (and 4 tails).

    5. Count up the microstates--this is $w_k$--and then figure out $P_k$ for this macrostate.

    6. Hmmm, it would be useful to have a general formula for the number of microstates for a given number $N_1$ of heads out of a total number of coins $N$... That's what we'll do next. Do come back here and check that formula against your result for $N_1=2$ and $N=6$. (A brute-force check is sketched just below.)
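
    If you'd like to check your answers afterwards, a brute-force sketch like this one (plain Python, standard library only) enumerates all $2^6$ microstates and does the counting:

```python
from itertools import product
from collections import Counter

# Brute-force check for the N = 6 exercise: enumerate all microstates,
# group them into macrostates by the number of heads, and compute w_k and P_k.
N = 6
microstates = list(product("HT", repeat=N))
Omega = len(microstates)                          # total number of microstates, 2**N
w = Counter(m.count("H") for m in microstates)    # w_k, indexed by the number of heads

print("number of macrostates:", len(w))
print("Omega =", Omega)
print("w_k for 2 heads =", w[2], "  P_k =", w[2] / Omega)
```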

     

    General formula for $w_k$

    We'd like a general formula for $w_k$, the number of microstates for the macrostate $k$ that has $N_1$ heads and $N_2$ tails. That is $\left\{N_1,N_2\right\}=\left\{N_1, (N-N_1)\right\}$.

    E.g., concretely, how many ways are there of getting 4 heads when throwing 10 coins?

    Let us consider listing off the index numbers of the coins that were heads. E.g. for the microstate H,T,H,T,H,H,T,T,T,T, we list the coins which were heads as: $\left\{1,3,5,6\right\}$.

    We'd like to count how many different combinations of 4 numbers we can make out of 10 different numbers: $$\begineq \left\{1,3,5,6\right\}\text{ and } \left\{3,1,5,6\right\} &\text{ : }& \text{same combination}\\ \left\{1,3,5,6\right\}\text{ and } \left\{1,3,5,7\right\} &\text{ : }& \text{different combination}\endeq$$

    1. It's not what we want, but we *can* count how many ordered sequences of 4 numbers we can make out of 10 different numbers:

      There are 10 choices for the first number in the sequence, then 9 for the second, and so on, such that for 4 numbers we have $$10\times 9\times 8\times 7 = 5040$$ different ordered sequences.

    2. Generalizing to ordered sequences of length $N_1$ chosen from $N$ different numbers: $$\begineq N(N-1)\cdots(N-N_1+1) &=& \frac{N(N-1)\cdots(N-N_1+1)(N-N_1)\cdots(2)(1)}{(N-N_1)\cdots(2)(1)}\\ &=& \frac{N!}{(N-N_1)!}.\endeq$$

    3. But some of these sequences have the same coins coming up heads, just listed in a different order. E.g. {1,3,5,6} and {3,1,5,6}. In fact, the number of ways of ordering 4 different numbers is 4 choices for the first number, 3 for the second, 2 for the third, and then just 1 choice left for the fourth, so $$4\times 3\times 2\times 1=4!$$ So for any particular combination of $N_1$ numbers, there are $N_1!$ ordered sequences containing exactly those numbers.

    4. So to get the total number of unique combinations, we want $$\begineq \frac{\text{# of ordered sequences of }N_1\text{ numbers}}{\text{# of ways of ordering }N_1\text{ numbers}} &=& \frac{N!/(N-N_1)!}{N_1!}\\ &=& \frac{N!}{N_1!\,(N-N_1)!} \equiv {N \choose N_1}.\endeq$$

    This number is the binomial coefficient from probability theory.

    Binomial coefficient: The number of ways of choosing $N_1$ objects from a collection of $N$ distinguishable objects. $$w_k = \frac{N!}{N_1!(N-N_1)!}\equiv {N \choose N_1}.$$

    You may hear this read aloud as the shorthand "$N$ choose $N_1$".
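
    As a quick sanity check, here's a short Python sketch (standard library only; `math.comb` needs Python 3.8+) that compares the formula against a direct enumeration for the 10-coin, 4-heads example above:

```python
import math
from itertools import combinations

# How many ways of getting N_1 = 4 heads out of N = 10 coins?
N, N_1 = 10, 4

by_formula  = math.factorial(N) // (math.factorial(N_1) * math.factorial(N - N_1))
by_counting = len(list(combinations(range(1, N + 1), N_1)))  # sets of coin labels that came up heads

print(by_formula, by_counting, math.comb(N, N_1))  # all three agree: 210
```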

    For $N$ coins (take $N$ even, so that equal numbers are possible), the macrostate with the most microstates is the one with equal numbers of heads and tails. So, the maximum number of microstates in any macrostate is $$w_{max}=\frac{N!}{(N/2)!\,(N/2)!}.$$
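
    For example, with $N=6$ coins, $w_{max} = \frac{6!}{3!\,3!} = 20$, so the most likely macrostate $\left\{3,3\right\}$ occurs with probability $20/2^6 = 20/64 \approx 0.31$.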
