Over the past 12 years, I have developed and taught at Grinnell a two-semester general physics course that has attempted to merge the techniques of Priscilla Laws' Workshop Physics [1] …

At Grinnell, we do not teach an algebra-based physics course. A couple of decades ago, we dropped the algebra-based course when it was discovered that even among the dwindling enrollment in that class, fewer than a handful of students lacked the minimal calculus co-requisite. I don't think Grinnell's faculty is unusual in being pleased with that change; algebra-based physics is forced to rely on …

Our study of statistical physics comes at the end of the first semester, where it builds on the foundation of quantum physics established just previously in the course. Students already know about quantized energy levels, the quantum particle in a box, and the basic features of the hydrogen atom. In the four lab-based sessions that follow these topics, we introduce the fundamentals of statistical physics: microstates, entropy and temperature, the Boltzmann factor, and ensemble averages.
The first session deals with a two-state system, couched in the language of choices among equal-probability configurations. The students are confronted with the question of what keeps the air so uniformly distributed in a balloon if the molecules "choose" randomly whether to be on the left side or the right side. We model this by rolling large numbers of dice (flipping coins would work as well); each die represents a molecule, and a result of 1, 2, or 3 corresponds to the left side and 4, 5, or 6 to the right. Each group has ten dice that they roll ten times; they then histogram the results to give a frequency distribution of left/right splits for a ten-molecule system. Combining data from all eight groups gives the equivalent data for an eighty-molecule system, or the data can be combined differently to give eighty samples of the ten-molecule system. Students readily observe that the distribution becomes smoother with more data and narrower for more molecules. Students then analytically describe all possible combinations and use a spreadsheet program to model those distributions. In particular, they verify that the relative width of the distribution is inversely proportional to the square root of the number of molecules, explaining the lack of fluctuations in macroscopic samples.

The second session extends the two-state system to define macrostates and microstates. The concrete analog used involves coins; the macrostate property reflects only the number of heads and tails, whereas the microstate identifies the state of each coin (penny, nickel, dime, and quarter serve as convenient markers for this, although one could also use minting date, etc.). We revisit the probability discussions of the previous session in this new language: the probability of a macrostate is proportional to its number of microstates. I use a short diatribe on the difficulty of very large numbers to motivate the use of the logarithm to tame them, leading to the definition of entropy.
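The spreadsheet analysis of the two-state distribution can also be sketched in a few lines of code. The Python sketch below (the function names `two_state_distribution` and `relative_width` are my own, not part of the course materials) computes the exact binomial probabilities for n molecules independently choosing left or right with equal probability, and verifies the key result of the first session: the relative width of the distribution scales as 1/√n.

```python
from math import comb, sqrt

def two_state_distribution(n):
    """Probability of finding k of n molecules on the left side,
    each molecule choosing left or right with equal probability."""
    total = 2 ** n
    return [comb(n, k) / total for k in range(n + 1)]

def relative_width(n):
    """Standard deviation of the left-side count divided by its mean."""
    p = two_state_distribution(n)
    mean = sum(k * pk for k, pk in enumerate(p))               # = n/2
    var = sum((k - mean) ** 2 * pk for k, pk in enumerate(p))  # = n/4
    return sqrt(var) / mean                                    # = 1/sqrt(n)

for n in (10, 80, 1000):
    # relative width shrinks as 1/sqrt(n): quadrupling n halves it
    print(n, relative_width(n), 1 / sqrt(n))
```

For the ten- and eighty-molecule systems of the dice activity the predicted relative widths are about 0.32 and 0.11; extrapolating to macroscopic particle numbers makes the absence of observable fluctuations plausible.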
We calculate entropies of two-state systems using the Stirling approximation. At this point we touch base with quantum physics, note that different quantum states in general have different energies, and adopt energy as our default macrostate marker. We then examine what happens to two systems that are allowed to exchange energy, using spreadsheet modeling. With the assumption that equilibrium occurs when probability (and therefore entropy) is maximized, we arrive at the standard statistical definition of temperature as the inverse of the derivative of entropy with respect to energy. I take pains to point out that this is …

The third session takes the definition of entropy and the assumption that probability is proportional to the number of microstates to derive the Boltzmann factor as the relative probability of two quantum states. This is found by taking the two-system model of the previous session, shrinking one system to a single two-state atom, and expanding the other to be so large that the average energy changes are negligible. We then do several examples of converting relative probabilities for two-state and few-state systems into absolute probabilities, and name the normalizing factor the partition function.

The final session takes the notion of absolute probability and uses it to calculate an ensemble-average energy. First this is done analytically for a simple two-state system, and then computationally (using a spreadsheet) for the energies of a quantum particle in a one-dimensional box. This energy is convincingly close to …
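The final session's spreadsheet calculation can be sketched computationally as well. In the Python sketch below, the function name `average_energy` and the choice of units (the ground-state energy `e1` sets the energy scale, and `kT` is expressed in the same units) are my own assumptions, not the course's notation. It weights each particle-in-a-box level E_n = n²·e1 by its Boltzmann factor, normalizes by the partition function, and returns the ensemble-average energy.

```python
from math import exp

def average_energy(kT, e1=1.0, n_max=2000):
    """Ensemble-average energy of a 1-D particle in a box with levels
    E_n = n^2 * e1, n = 1, 2, ..., n_max; levels beyond n_max contribute
    negligibly once exp(-E_n/kT) has died away."""
    levels = [n * n * e1 for n in range(1, n_max + 1)]
    weights = [exp(-E / kT) for E in levels]   # Boltzmann factors
    Z = sum(weights)                           # the partition function
    return sum(E * w for E, w in zip(levels, weights)) / Z

# At low temperature the average energy is pinned at the ground state...
print(average_energy(kT=0.1))    # ≈ 1.0 (= e1)
# ...while at high temperature it grows in proportion to kT.
print(average_energy(kT=100.0))
```

At temperatures large compared with e1, the sum approaches the classical equipartition value kT/2, presumably the comparison the final session invites students to make.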
The question is naturally raised: do the students "get it"? Do they leave the course with an appreciation for the conceptual underpinnings of statistical physics? And do they learn a few practical results? Our experience leads us to answer with a qualified yes. There is no question that the students seem at least as capable of dealing with the abstract concepts of statistical physics as with other abstract concepts in introductory physics (electric fields and potentials come to mind). Certainly the effort to show that large numbers of particles convert scattered, random results (like rolling a die) into virtual certainty (like the uniformity of the density of the air in a balloon) is successful. Students do seem to understand concepts such as the increase of entropy, the ideal gas law, and the Boltzmann factor, but many of our students have been exposed to these in other classes, either in high school or in introductory chemistry (which most of our students take prior to this course). Nevertheless, I believe the approach taken in this course to these perhaps already familiar topics is so fundamentally different from that of introductory chemistry as to provide useful insights into the underpinnings even for students who are quite familiar with the use of these concepts. Moreover, roughly half of our students will go on to see a more advanced treatment of statistical mechanics in either a physical chemistry or a statistical physics class, and this approach gives an …

Where might we go from here? It would be easy to lead from the results we develop in the four sessions to another session dealing with macroscopic effects such as the efficiencies of heat engines, incorporating ideal gas results and inferring work from …
The activity guides (workbook-like materials) for the statistical physics section of the course, as well as all activity guides for both semesters of the course, are available as PDF files on the web [3].
[1] Priscilla W. Laws, "Workshop Physics," Physics Today.
[2] A good retrospective is available from L. A. Coleman, D. F. Holcomb, and J. S. Rigden, "The Introductory University Physics Project, 1987–1995: What has it accomplished?" Am. J. Phys.
[3] All of the activity guides are available at the Grinnell College Physics Department curriculum development web site: http://www.grinnell.edu/academic/physics/curricdev

Mark B. Schneider is Associate Professor of Physics at Grinnell College.