Boltzmann’s entropy formula is possibly one of the most difficult equations in Physics. Not because the equation itself is that confusing (it isn’t, it is just two variables, one constant, and a logarithm), but because it relates two things, entropy and the probability of being in different energy states, that are both difficult to really understand. So, as I usually do, I looked into the history of this law: who made it and why was it made? And I found that it was created by a scientist named Max Planck in 1900 and written almost identically to the modern form in 1901. But wait, why is it called the Boltzmann equation, why is the constant in it Boltzmann’s constant, and why is it even on Boltzmann’s freaking gravestone if Planck created it? Good questions. Ready for the answers? Let’s go!
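For reference, here is the formula in its modern form:

```latex
S = k \log W
```

where S is the entropy, k is Boltzmann’s constant, and W is the number of ways the molecules of a system can be arranged while giving the same overall state.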
I would like to start with the origin of the idea of entropy. In 1854, a German scientist named Rudolf Clausius noted that absorbing less heat at a lower temperature was equivalent to absorbing more heat at a higher temperature, so he called the heat divided by the temperature the “equivalence value” (he would later rename the equivalence value the entropy). By 1862, Clausius found that an increase in the equivalence value of a process would increase the separation of the molecules or disorganize their relationship. He also found that any decrease in the “equivalence value” of one object would necessitate an increase of equal or greater value in other objects. In other words, the “equivalence value” (entropy) of a closed system can only increase. Finally, in 1865, Clausius renamed the “equivalence value” the entropy, gave it the letter S, and defined the second law of thermodynamics as the statement that the entropy of the universe can only increase, which is still a modern definition.
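Clausius’s “equivalence value” is easy to compute, and a tiny sketch shows what he meant by “equivalent” (the function name here is mine, not Clausius’s): absorbing 100 J at 300 K has the same equivalence value as absorbing 200 J at 600 K.

```python
def equivalence_value(heat_joules, temperature_kelvin):
    # Clausius's "equivalence value": heat divided by absolute temperature,
    # the quantity he would later rename entropy (units: J/K)
    return heat_joules / temperature_kelvin

# less heat at a lower temperature == more heat at a higher temperature
low = equivalence_value(100, 300)
high = equivalence_value(200, 600)
```
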
Now I want to take a moment to talk about the history of probability in thermodynamics, and I am going to also start with… ready for it… a German man named Rudolf Clausius! In 1857, three years after Clausius’s first paper introducing the “equivalence value”, Clausius wrote a paper on what the temperature meant about the motion of molecules1. In this paper, Clausius became the first person to include the rotational and vibrational motion of molecules as well as their linear motion. Even with some of the energy going into rotation and vibration, Clausius found that molecules move at very fast speeds. For example, Clausius found that hydrogen gas at 0 °C should move at slightly more than 5 times the speed of sound! After reading Clausius, another scientist published an objection: if gas molecules are really moving so fast, how come cigar smoke doesn’t fill the room faster than the speed of sound? Clausius felt this was an interesting objection (writing that he “rejoice[s] at the discussion2”), but instead of invalidating his theories, Clausius decided that it would all work out if the molecules are moving very fast but not very far. In other words, in cigar smoke (or in any gas) there are a ridiculous number of molecules moving in all directions, but they don’t get very far before bouncing off of another molecule and changing direction, so that even though the individual gas molecules are moving very quickly, the gas itself diffuses slowly. He then came up with a term he called the “mean length of the path”, which was “how far on an average can the molecule move before its center of gravity comes into the sphere of action of another molecule3.” Conveniently, an English scientist named Frederick Guthrie was a fan of Clausius and was also bilingual (he received his Ph.D. in Germany with one of my favorite scientists, Robert Bunsen), so he translated Clausius’s work into English and published it in February of 1859.
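We can roughly check Clausius’s number using the modern root-mean-square-speed formula, v_rms = sqrt(3kT/m). To be clear, this is a sanity check with modern constants, not Clausius’s own 1857 method (he used a slightly different measure of average speed):

```python
from math import sqrt

K_BOLTZMANN = 1.380649e-23           # Boltzmann's constant, J/K
M_H2 = 2 * 1.00784 * 1.66054e-27     # mass of one H2 molecule, kg

def rms_speed(temperature_kelvin, molecule_mass_kg):
    """Root-mean-square speed of a gas molecule, v_rms = sqrt(3kT/m)."""
    return sqrt(3 * K_BOLTZMANN * temperature_kelvin / molecule_mass_kg)

v = rms_speed(273.15, M_H2)   # roughly 1840 m/s for hydrogen at 0 °C
ratio = v / 343.0             # vs. the speed of sound in air, ~343 m/s
```

Sure enough, the ratio comes out at slightly more than 5, just as Clausius claimed.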
Three months later, a 27-year-old Scottish scientist named James Clerk Maxwell wrote to a friend that Clausius’s paper had inspired him to determine the mean path by “comparing it with phenomena which seem to depend on this ‘mean path’… [like] internal friction of gases, diffusion of gases, and conduction of heat through a gas4”. Between 1860 and 1866, Maxwell produced a series of papers on what he called “the Dynamical Theory of Gases”, although he admitted that it was “Professor Clausius, to whom we owe the most extensive developments of the dynamical theory of gases5.” During the same time, the busy Maxwell also published a couple of papers on the equations of electricity and magnetism, the results of which are known as Maxwell’s equations! Funny story: in 1861, Maxwell was once stuck in a crowd and Michael Faraday saw him and, referring to his work in statistics, shouted, “Ho, Maxwell, cannot you get out? If any man can find his way through a crowd it should be you6.”
Clausius had some minor issues with Maxwell’s theories, but there was an Austrian scientist who was entranced, and his name was Ludwig Boltzmann (yes, he is involved). Boltzmann was 13 years younger than Maxwell and went to college in 1863. While Boltzmann was still an undergraduate, one of his professors instantly realized his brilliance and, according to Boltzmann, immediately decided “to hand me a copy of Maxwell’s papers [on gases]”; however, “at that time I did not understand a word of English, [so] he also gave me an English grammar [book]7.” Boltzmann managed to translate Maxwell and started to publish his own papers on gases. Three years later, in 1866, Boltzmann had earned his Ph.D. in the kinetic theory of gases, and by 1869 the 25-year-old Boltzmann was made a full professor! Maxwell and Boltzmann’s work on statistics in thermodynamics led to the Maxwell-Boltzmann distribution, equations for the probability of different speeds of gas molecules that are still used today. Aside from his eerie intelligence, Boltzmann was also known for his excellent teaching prowess (Lise Meitner described him as the best teacher she had ever had), his humor (like when he said he was “tolerably” good at mathematics except “when I am counting beer glasses”), and his oddly naive view of reality (like when he consulted a professor of zoology to determine how to milk a cow).
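Here is a small sketch of the Maxwell-Boltzmann speed distribution for hydrogen at 0 °C, written with the modern form of the density (the constants and function names are mine, for illustration). The density gives the probability per unit speed of finding a molecule at speed v, so summing it over all speeds should give 1:

```python
from math import sqrt, pi, exp

K_B = 1.380649e-23    # Boltzmann's constant, J/K
M_H2 = 3.347e-27      # approximate mass of one H2 molecule, kg

def mb_speed_pdf(v, temperature, mass):
    """Maxwell-Boltzmann probability density for molecular speed v:
    f(v) = 4*pi * (m / (2*pi*k*T))^(3/2) * v^2 * exp(-m*v^2 / (2*k*T))."""
    a = mass / (2 * K_B * temperature)
    return 4 * pi * (a / pi) ** 1.5 * v * v * exp(-a * v * v)

T = 273.15  # 0 °C in kelvin

# the most probable speed, sqrt(2kT/m) -- around 1500 m/s for H2 at 0 °C
v_peak = sqrt(2 * K_B * T / M_H2)

# crude numerical check that the density integrates to ~1 (rectangle rule)
dv = 1.0
total = sum(mb_speed_pdf(v, T, M_H2) * dv for v in range(0, 10000))
```
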
Maxwell was initially confused about entropy; however, once his confusion was cleared up (by an American named Josiah Gibbs), Maxwell started to combine his theories of using statistics to study molecules with entropy. This led Maxwell to decide that “the truth of the second law is, therefore, a statistical, not a mathematical truth, for it depends on the fact that the bodies we deal with consist of millions of molecules, and that we never can get hold of single molecules.” Or, as Maxwell amusingly put it to a friend, “The second law of Thermodynamics has the same degree of truth as the statement that if you throw a cup of water into the sea, you cannot get the same cup of water out again8.” Not surprisingly, Clausius was not pleased with Maxwell for downgrading “his” second law like that. However, Clausius had other issues: first, in 1870, Clausius had been injured in the Franco-Prussian War, where he had organized an ambulance corps. Second, in 1875, Clausius’s wife died in childbirth, and Clausius took time off from his research to focus on raising his six children and teaching (Clausius always taught, even on his deathbed).
Young Boltzmann, however, was motivated once again by Maxwell and now dove into the relationship between statistics and entropy. In 1872, Boltzmann wrote that “the molecules of [a] body are indeed so numerous, and their motion is so rapid, that we can perceive nothing more than average values… Hence, the problems of the mechanical theory of heat are also problems of probability theory9.” By 1877, Boltzmann started to work out the relationship between probability and entropy. He started with the idea that “in the game of Lotto any individual set of five numbers is just as improbable as the set 1,2,3,4,5. It is only because there are many more uniform distributions than non-uniform ones that the distribution of states will become uniform in the course of time10.” In other words, entropy increases because there are “infinitely many more uniform than non-uniform distributions11”. Boltzmann even declared that one “could even calculate, from the relative numbers of the different state distributions, their probabilities12.” So, in October of 1877, Boltzmann set out to find the actual equation for the different state distributions, to determine the entropy from the number of arrangements of molecules. He ended up writing over 50 pages of dense, equation-rich material in which he formulated how one could determine the probability of having molecules in different states. Boltzmann broke up the energies of molecules into discrete sections and calculated the probability of ending up with different scenarios. He wrote, “The initial state will, in most cases, be a very unlikely one, and the system will move from it to more and more probable states until at last it becomes the most probable, so that the heat equilibrium has [been] reached. Applying this to the second law, we can identify the quantity which is usually called the entropy, with the probability of the condition in question13.” However, Boltzmann did not write an equation relating entropy to his newly defined probability.
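Boltzmann’s counting argument can be illustrated with a toy version of his discrete-energy model (this tiny example is mine, not from his 1877 paper): give 3 molecules a total of 3 energy quanta and count how many microscopic assignments produce each overall distribution. The spread-out distribution (2,1,0) can happen 6 ways, while piling everything on one molecule, (3,0,0), can happen only 3 ways, so the spread-out state is the more probable one.

```python
from itertools import product

def macrostate_counts(n_molecules, total_quanta):
    """For each macrostate (the sorted multiset of molecular energies),
    count the microstates (assignments of quanta to labeled molecules)
    that realize it."""
    counts = {}
    for micro in product(range(total_quanta + 1), repeat=n_molecules):
        if sum(micro) == total_quanta:
            macro = tuple(sorted(micro))
            counts[macro] = counts.get(macro, 0) + 1
    return counts

# 3 molecules sharing 3 quanta: (0,1,2) happens 6 ways,
# (0,0,3) happens 3 ways, and (1,1,1) happens only 1 way
counts = macrostate_counts(3, 3)
```

Boltzmann’s insight is that a system drifts toward the macrostates with the most arrangements, which is exactly why the entropy he identified with probability tends to increase.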