Mark B. Schneider
Over the past 12 years, I have developed and taught at Grinnell a two-semester general physics course that attempts to merge the techniques of Priscilla Laws' Workshop Physics [1] (no lecture, hands-on, discovery-based group learning) with the goals of the Introductory University Physics Project [2] (a more contemporary and manageably sized set of topics). The result is a full set of activity guides that include a number of modern physics topics. Roughly a third of the first semester is devoted to quantum and statistical physics. This material is approached theoretically, rather than through the historical approach (e.g., the Bohr atom) usually used for quantum topics or the phenomenological approach (e.g., thermal expansion) typically used for thermal physics.
At Grinnell, we do not teach an algebra-based physics course. A couple of decades ago, we dropped the algebra-based course when it was discovered that, even among the dwindling enrollment in that class, fewer than a handful of students lacked the minimal calculus co-requisite. I don't think Grinnell's faculty is unusual in being pleased with that change; algebra-based physics is forced to rely on the ad hoc introduction of too many under-motivated formulas, whereas calculus-based physics has the power to explain many complex phenomena with a few simple principles. I see the goal of our approach to statistical physics in a similar vein: present a few reasonable fundamental principles that allow us to derive a range of concrete and practical results. I also hope to surprise students with the realization that the definitions of temperature and thermal equilibrium are not as simple as they have been led to believe.
Our study of statistical physics comes at the end of the first semester, where it builds on the foundation of quantum physics established just previously in the course. Students already know about quantized energy levels, the quantum particle in a box, and the basic features of the hydrogen atom. In the four lab-based sessions that follow these topics, we introduce the fundamentals of statistical physics: microstates, entropy and temperature, the Boltzmann factor, and ensemble averages.
The first session deals with a two-state system, couched in the language of choices among equal-probability configurations. The students are confronted with the question of what keeps the air so uniformly distributed in a balloon if each molecule "chooses" randomly whether to be on the left side or the right side. We model this by rolling large numbers of dice (flipping coins would work as well); each die represents a molecule, and a result of 1, 2, or 3 corresponds to the left side, while 4, 5, or 6 corresponds to the right. Each group has ten dice that they roll ten times; they then histogram the results to give a frequency distribution of left/right splits for a ten-molecule system. Combining data from all eight groups gives the equivalent data for an eighty-molecule system, or the data can be combined differently to give 80 samples of the ten-molecule system. Students readily observe that the distribution becomes smoother with more data and narrower for more molecules. Students then analytically describe all possible combinations and use a spreadsheet program to model those distributions. In particular, they verify that the relative width of the distribution is inversely proportional to the square root of the number of molecules, explaining the lack of fluctuations in macroscopic samples.
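For readers without the classroom apparatus, the dice experiment is easy to simulate. The sketch below is a Python stand-in for the spreadsheet modeling described above; the trial counts and molecule numbers are illustrative choices, not the ones used in class. It shows the relative width of the left/right distribution shrinking as one over the square root of the number of molecules.

```python
import random
import statistics

def left_counts(n_molecules, n_trials, seed=0):
    """Roll n_molecules dice n_trials times; a result of 1-3 puts a
    molecule on the left side of the balloon, 4-6 on the right.
    Return the number of molecules on the left for each trial."""
    rng = random.Random(seed)
    return [sum(1 for _ in range(n_molecules) if rng.randint(1, 6) <= 3)
            for _ in range(n_trials)]

for n in (10, 80, 640):
    counts = left_counts(n, 20000)
    mean = statistics.mean(counts)
    width = statistics.pstdev(counts)
    # Relative width should scale as 1/sqrt(n):
    # (sqrt(n)/2) / (n/2) = 1/sqrt(n)
    print(f"N = {n:4d}  mean = {mean:6.1f}  relative width = {width / mean:.3f}")
```

Each factor-of-eight increase in the number of "molecules" cuts the relative width by roughly a factor of √8 ≈ 2.8, mirroring what students see when they pool the group data.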
The second session extends the two-state system to define macrostates and microstates. The concrete analog used involves coins: the macrostate reflects only the number of heads and tails, whereas the microstate identifies the state of each coin (a penny, nickel, dime, and quarter serve as convenient markers for this, although one could also use minting dates, etc.). We revisit the probability discussions of the previous session in this new language: the probability of a macrostate is proportional to its number of microstates. I use a short diatribe on the difficulty of very large numbers to motivate the use of the logarithm to tame them, leading to the definition of the entropy. We calculate entropies of two-state systems using the Stirling approximation. At this point we touch base with quantum physics, note that different quantum states in general have different energies, and adopt energy as our default macrostate marker. We then examine, using spreadsheet modeling, what happens to two systems that are allowed to exchange energy. With the assumption that equilibrium occurs when probability (and therefore entropy) is maximized, we arrive at the standard statistical definition of temperature as the inverse of the derivative of entropy with respect to energy. I take pains to point out that this is not the same as the average kinetic energy per atom.
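The two-system energy-exchange model lends itself to a short computational sketch. The version below (a Python stand-in for the spreadsheet; the system sizes and total energy are illustrative choices, not the course's values) counts microstates of two two-state systems sharing a fixed number of excited units, finds the most probable split, and checks that dS/dE agrees for the two systems there, which is the statistical statement that they have reached a common temperature.

```python
import math

def multiplicity(n_particles, n_units):
    """Number of microstates of a two-state system of n_particles
    with n_units of them in the excited state."""
    return math.comb(n_particles, n_units)

def entropy(n_microstates):
    """Entropy in units of Boltzmann's constant: S = ln(multiplicity)."""
    return math.log(n_microstates)

def dS_dE(n_particles, n_units):
    """Centered finite-difference estimate of dS/dE, with energy in
    units of the level spacing; its inverse is the temperature."""
    return (entropy(multiplicity(n_particles, n_units + 1))
            - entropy(multiplicity(n_particles, n_units - 1))) / 2

# Two systems sharing a fixed total of q excited units.
N_A, N_B, q = 60, 40, 50
table = [(q_A, multiplicity(N_A, q_A) * multiplicity(N_B, q - q_A))
         for q_A in range(q + 1)]
q_star, omega_max = max(table, key=lambda row: row[1])

print("most probable split:", q_star, "/", q - q_star)
# At the most probable split, dS/dE is the same for both systems:
# they are at the same statistical temperature.
print(dS_dE(N_A, q_star), dS_dE(N_B, q - q_star))
```

The most probable split divides the energy in proportion to system size, exactly the equilibrium condition the entropy-maximization argument predicts.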
The third session takes the definition of entropy and the assumption that probability is proportional to the number of microstates, and derives the Boltzmann factor as the relative probability of two quantum states. This is found by taking the two-system model of the previous session, shrinking one of the systems to a single two-state atom, and expanding the other to be so large that the average energy changes become negligible. We then do several examples of converting relative probabilities for two-state and few-state systems into absolute probabilities, and name the normalizing factor the partition function.
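The relative-to-absolute-probability step can be sketched in a few lines. This is not the course's worked example, just a minimal illustration: the level energies and the choice of kT = 1 are assumptions made for the sake of the sketch.

```python
import math

def probabilities(energies, kT):
    """Convert the relative probabilities given by Boltzmann factors
    into absolute probabilities; the normalizing sum is the
    partition function Z."""
    factors = [math.exp(-E / kT) for E in energies]
    Z = sum(factors)
    return [f / Z for f in factors]

# A two-state atom with level splitting equal to kT (illustrative units).
p = probabilities([0.0, 1.0], kT=1.0)
print(p)  # the ratio p[0]/p[1] is the Boltzmann factor e^(dE/kT)
```

The same function handles few-state systems unchanged; only the list of energies grows.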
The final session takes the notion of absolute probability and uses it to calculate an ensemble average energy. First this is done analytically for a simple two-state system, and then computationally (using a spreadsheet) for the energies of a quantum particle in a one-dimensional box. This energy is convincingly close to (1/2)NkT and avoids unpleasant integrations that most introductory-level students are unprepared to appreciate. It is argued that extension to three dimensions involves summing three identical columns of weighted energies instead of just one, so the energy of the ideal monatomic gas is found. We relate energy changes to pressure through ∆E = −P∆V, and the ideal gas law is derived. Students then experimentally verify this law (or, equivalently, measure absolute zero) with commercially available apparatus for measuring pressure as a function of temperature.
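The particle-in-a-box average can also be done in a few lines of Python rather than a spreadsheet. The sketch below assumes levels E_n = n²E₁ and a level spacing far smaller than kT (the ratio 0.001 is an illustrative choice); the Boltzmann-weighted sum then lands close to (1/2)kT per particle, as the students' spreadsheets show.

```python
import math

def average_energy(E1_over_kT, n_max=5000):
    """Ensemble average energy, in units of kT, for a quantum particle
    in a one-dimensional box with levels E_n = n^2 * E1, using
    Boltzmann-factor weights."""
    ns = range(1, n_max + 1)
    weights = [math.exp(-E1_over_kT * n * n) for n in ns]
    Z = sum(weights)  # the partition function
    return sum(E1_over_kT * n * n * w for n, w in zip(ns, weights)) / Z

# Ground-state energy much smaller than kT, as for a gas molecule in
# a macroscopic box (assumed ratio for illustration).
print(average_energy(0.001))  # close to 1/2, i.e. <E> ≈ (1/2) kT
```

Tripling the calculation for three independent directions gives (3/2)kT per particle, the ideal monatomic gas result quoted above.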
Results and Future Directions
The question is naturally raised: do the students "get it"? Do they leave the course with an appreciation for the conceptual underpinnings of statistical physics? And do they learn a few practical results? Our experience leads us to answer with a qualified yes. There is no question that the students seem at least as capable of dealing with the abstract concepts of statistical physics as with other abstract concepts in introductory physics (electric fields and potentials come to mind). Certainly the effort to show that large numbers of particles convert scattered, random results (like rolling a die) into virtual certainty (like the uniformity of density of the air in a balloon) is successful. Students do seem to have an understanding of concepts such as the increase of entropy, the ideal gas law, and the Boltzmann factor, but many of our students have been exposed to these in other classes, either in high school or in introductory chemistry (which most of our students take prior to this course).
Nevertheless, I believe the approach to these perhaps already familiar topics taken in this course is so fundamentally different from that provided in introductory chemistry as to provide useful insights into the underpinnings even for students who are quite familiar with the use of these concepts. Moreover, roughly half of our students will go on to see a more advanced treatment of statistical mechanics presented either in a physical chemistry or statistical physics class, and this approach gives an “F=ma” style introduction to the basic principles that leads usefully into a more advanced course. Our course also presents the unusual situation of having introduced the quantum particle in a box energy levels before approaching thermal physics. This is essential for the derivation of the ideal gas results.
Where might we go from here? It would be easy to lead from the results we develop in the four sessions to another session on macroscopic effects such as the efficiencies of heat engines, incorporating ideal gas results and inferring work from PV diagrams. One could also take a more fundamental (if less practical) step and connect the three major themes of the course (Newtonian mechanics, quantum mechanics, and statistical mechanics), showing how a combination of the latter two predicts the statistical behavior of the former: mechanical systems spend a larger fraction of the time in higher-energy configurations (e.g., the harmonic oscillator spends more time at the extremes of motion, where the velocity is low), even as dissipative processes cause them to settle into lower energy states. We are investigating each of these possibilities, although the already tight scheduling of the course makes an extensive expansion of the statistical portion difficult without cutting other elements.
The activity guides (workbook-like materials) for the statistical physics section of the course, as well as all activity guides for both semesters, are available as PDF files on the web [3] or directly from the author. Early development of activity-based statistical physics at Grinnell benefited from contributions by Dr. Andrew McDowell and Prof. Paul Tjossem.
1. Priscilla W. Laws, "Workshop Physics," Physics Today 44, 24-31 (Dec 1991). The Dickinson group also maintains a website related to the course: http://physics.dickinson.edu/~wp_web/WP_homepage.html
2. A good retrospective is available from L. A. Coleman, D. F. Holcomb, and J. S. Rigden, "The Introductory University Physics Project, 1987-1995: What has it accomplished?" Am. J. Phys. 66, 124-137 (1998).
3. All of the activity guides are available at the Grinnell College Physics Department curriculum development web site: http://www.grinnell.edu/academic/physics/curricdev

Mark B. Schneider is Associate Professor of Physics at Grinnell College.