Statistical mechanics provides the framework for studying large collections of molecules and tells us how to average over positions and velocities to properly simulate the laboratory distribution of molecules. When dealing with a sample containing a large number (e.g., billions) of molecules, it is impractical to attempt to monitor the time evolution of the coordinates and momenta (or analogous quantum state populations) of the individual constituent molecules. Especially for systems at or near equilibrium, these properties of individual molecules may vary irregularly on very short time scales (as the molecules undergo frequent collisions with neighbors), but the average behavior (e.g., average translational energy or average population of a particular vibrational energy level) changes much more gently with time. Such average properties can be viewed as averages for any one molecule over a time interval during which many collisions occur or as averages over all the molecules in the sample at any specific time. By focusing on the most probable distribution of the total energy available to a sample containing many molecules and by asking questions that relate to the average properties of the molecules, the discipline of statistical mechanics provides a very efficient and powerful set of tools (i.e., equations) for predicting and simulating the properties of such systems. In a sense, the machinery of statistical mechanics allows one to describe the "most frequent" behavior of large molecular systems; that is, how the molecules are moving and interacting most of the time. Fluctuations away from this most probable behavior can also be handled as long as these fluctuations are small.

1. The framework of statistical mechanics provides efficient equations for computing thermodynamic properties from molecular properties

a. The Boltzmann population equation

The primary outcome of asking what is the most probable distribution of energy among a large number N of molecules within a container of volume V that is maintained at a specified temperature T is the most important equation in statistical mechanics, the Boltzmann population formula:

P_j = Ω_j exp(-E_j/kT)/Q,

where E_j is the energy of the jth quantum state of the system (which is the whole collection of N molecules), T is the temperature in K, Ω_j is the degeneracy of the jth state, and the denominator Q is the so-called partition function:

Q = Σ_j Ω_j exp(-E_j/kT).

The classical mechanical equivalent of the above quantum Boltzmann population formula for a system with M coordinates (collectively denoted q) and M momenta (denoted p) is:

P(q,p) = h^(-M) exp(-H(q,p)/kT)/Q,

where H is the classical Hamiltonian and the partition function Q is

Q = h^(-M) ∫ exp(-H(q,p)/kT) dq dp.

b. The limit for systems containing many molecules

Notice that the Boltzmann formula does not say that only those states of a given energy can be populated; it gives non-zero probabilities for populating all states from the lowest to the highest. However, it does say that states of higher energy E_j are disfavored by the exp(-E_j/kT) factor; but if states of higher energy have larger degeneracies Ω_j (which they usually do), the overall population of such states may not be low. That is, there is a competition between state degeneracy, which tends to grow as the state's energy grows, and exp(-E_j/kT), which decreases with increasing energy.
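To make this competition concrete, here is a small numerical sketch (added for illustration, not part of the original discussion) that evaluates the quantum Boltzmann populations P_j = Ω_j exp(-E_j/kT)/Q for a handful of model levels whose degeneracies grow with energy; the specific level energies and degeneracies are invented purely for this example.

    import numpy as np

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def boltzmann_populations(energies, degeneracies, T):
        """Return P_j = Omega_j exp(-E_j/kT)/Q for the given levels."""
        energies = np.asarray(energies, dtype=float)
        degeneracies = np.asarray(degeneracies, dtype=float)
        # Subtract the lowest energy before exponentiating to avoid underflow;
        # the shift cancels between numerator and denominator.
        weights = degeneracies * np.exp(-(energies - energies.min()) / (k_B * T))
        Q = weights.sum()
        return weights / Q

    # Model levels (hypothetical): energies in J, degeneracies growing with energy.
    E = np.array([0.0, 1.0, 2.0, 3.0]) * 1.0e-20
    Omega = np.array([1.0, 5.0, 25.0, 125.0])

    for T in (100.0, 300.0, 1000.0):
        print(T, boltzmann_populations(E, Omega, T))

For these model numbers, the ground level dominates at low T, while at higher T the more degenerate upper levels carry most of the population, illustrating the trade-off described above.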
If the number of particles N is huge, the degeneracy Ω grows as a high power (say M) of E because the degeneracy is related to the number of ways the energy can be distributed among the N molecules; in fact, M grows at least as fast as N. As a result of Ω growing as E^M, the product function P(E) = E^M exp(-E/kT) rises rapidly at low E, passes through a maximum, and then decays; a plot for M = 10 already shows this sharply peaked shape. By taking the derivative of this function P(E) with respect to E, and finding the energy at which this derivative vanishes, one can show that this probability function has a peak at E* = MkT, and that at this energy value, P(E*) = (MkT)^M exp(-M). By then asking at what energy E' the function P(E) drops to exp(-1) of this maximum value P(E*), that is, P(E') = exp(-1) P(E*), one finds (for large M) E' ≅ MkT (1 + (2/M)^1/2). So the width of the P(E) graph, measured as the change in energy needed to cause P(E) to drop to exp(-1) of its maximum value divided by the value of the energy at which P(E) assumes this maximum value, is (E'-E*)/E* ≅ (2/M)^1/2. This width gets smaller and smaller as M increases. So, as the number N of molecules in the sample grows, which causes M to grow as discussed earlier, the energy probability function becomes more and more sharply peaked about the most probable energy E*. It is because, for so-called macroscopic systems, in which N (and hence M) is extremely large (i.e., 10^L with L being ca. 10-24), only the most probable distribution of the total energy among the N molecules need be considered, that the equations of statistical mechanics are so useful. Certainly, there are fluctuations (as evidenced by the finite width of the P(E) function) in the energy content of the molecular system about its most probable value. However, these fluctuations become less and less important as the system size (i.e., N) becomes larger and larger.
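As a quick numerical check of this sharpening, the sketch below (an illustration added here, with E measured in units of kT for convenience) locates the peak of P(E) = E^M exp(-E/kT) and the energy E' at which P drops to exp(-1) of its maximum, confirming that the relative width (E'-E*)/E* shrinks roughly as (2/M)^1/2 as M grows.

    import numpy as np

    def relative_width(M, n_grid=2_000_000):
        """Peak position and e^-1 relative width of P(E) = E^M exp(-E), E in units of kT."""
        E = np.linspace(1e-6, 10.0 * M, n_grid)
        logP = M * np.log(E) - E            # work with log P to avoid overflow at large M
        i_max = np.argmax(logP)
        E_star = E[i_max]                   # should lie close to M (i.e., MkT)
        # Find the energy above the peak where P falls to exp(-1) of its maximum.
        above = E > E_star
        j = np.argmin(np.abs(logP[above] - (logP[i_max] - 1.0)))
        E_prime = E[above][j]
        return E_star, (E_prime - E_star) / E_star

    for M in (10, 100, 1000, 10000):
        E_star, width = relative_width(M)
        print(M, E_star, width, np.sqrt(2.0 / M))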

c. The connection with thermodynamics

What are some of these equations? The first is the fundamental Boltzmann population formula shown earlier:

P_j = Ω_j exp(-E_j/kT)/Q.

Using this result, it is possible to compute the average energy of a system,

⟨E⟩ = Σ_j P_j E_j,

and to show that this quantity can be recast (try to do this derivation; it is not that difficult) as

⟨E⟩ = kT² (∂lnQ/∂T)_N,V.

If the average pressure ⟨p⟩ is defined as the pressure of each quantum state,

p_j = -(∂E_j/∂V)_N,

multiplied by the probability P_j for accessing that quantum state, summed over all such states, one can show that

⟨p⟩ = -Σ_j (∂E_j/∂V)_N Ω_j exp(-E_j/kT)/Q = kT (∂lnQ/∂V)_N,T.

Without belaboring the point much further, it is possible to express all of the usual thermodynamic quantities in terms of the partition function Q. The average energy and average pressure are given above; the average entropy is given as

⟨S⟩ = k lnQ + kT (∂lnQ/∂T)_N,V.

So, if one were able to evaluate the partition function Q for N molecules in a volume V at a temperature T, either by summing the quantum-state degeneracy and exp(-E_j/kT) factors,

Q = Σ_j Ω_j exp(-E_j/kT),

or by evaluating the classical phase-space integral (phase space is the collection of coordinates and conjugate momenta),

Q = h^(-M) ∫ exp(-H(q,p)/kT) dq dp,

one could then compute all thermodynamic properties of the system. This is the essence of how statistical mechanics provides the tools for connecting the molecule-level properties, which ultimately determine the E_j and the Ω_j, to the macroscopic properties such as ⟨E⟩, ⟨p⟩, ⟨S⟩, etc.
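As an illustration of this program, the sketch below (with invented level energies and degeneracies, used only for this example) evaluates Q(T) for a small model system and obtains ⟨E⟩ and ⟨S⟩ by numerically differentiating lnQ with respect to T, following the formulas above.

    import numpy as np

    k_B = 1.380649e-23  # J/K

    # Hypothetical system energy levels (J) and degeneracies, for illustration only.
    E_levels = np.array([0.0, 0.8, 1.7, 2.9, 4.2]) * 1.0e-20
    Omega = np.array([1.0, 3.0, 9.0, 27.0, 81.0])

    def ln_Q(T):
        """ln of the partition function Q = sum_j Omega_j exp(-E_j/kT)."""
        return np.log(np.sum(Omega * np.exp(-E_levels / (k_B * T))))

    def thermo(T, dT=1.0e-3):
        """Average energy and entropy from finite-difference derivatives of lnQ."""
        dlnQ_dT = (ln_Q(T + dT) - ln_Q(T - dT)) / (2.0 * dT)
        E_avg = k_B * T**2 * dlnQ_dT                  # <E> = kT^2 (d lnQ / dT)_N,V
        S_avg = k_B * ln_Q(T) + k_B * T * dlnQ_dT     # <S> = k lnQ + kT (d lnQ / dT)_N,V
        return E_avg, S_avg

    for T in (200.0, 300.0, 500.0):
        print(T, thermo(T))

For a real N-molecule system the sum over states is, of course, enormous, which is why approximate evaluations of Q (including the classical phase-space integral) are needed in practice.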

2. Statistical mechanics gives equations for probability densities in coordinate and momentum space

Not only is statistical mechanics useful for relating thermodynamic properties to molecular behavior, but it is also needed when carrying out molecular dynamics simulations. When one attempts, for example, to simulate the reactive collisions of an A atom with a BC molecule to produce AB + C, it is not appropriate to consider a single classical or quantal collision between A and BC. Why? Because in any laboratory setting,

1. The A atoms are moving toward the BC molecules with a distribution of relative speeds. That is, within the sample of molecules (which likely contains 10^10 or more molecules), some A + BC pairs have low relative kinetic energies when they collide, and others have high kinetic energies. There is a probability distribution P(E_KE) for this relative kinetic energy.

2. The BC molecules are not all in the same rotational (J) or vibrational (v) state. There is a probability distribution function P(J,v) describing the fraction of BC molecules that are in a particular J state and a particular v state.

3. When the A and BC molecules collide with a relative motion velocity vector v, they do not all hit "head on". Some collisions have small impact parameter b (the closest distance from A to the center of mass of BC if the collision were to occur with no attractive or repulsive forces), and some have large b-values. The probability function for these impact parameters is P(b) = 2πb db, which is simply a statement of the geometrical fact that larger b-values have more geometrical volume element than smaller b-values.

So, to simulate the entire ensemble of collisions that occur between A atoms and BC molecules in various J, v states and having various relative kinetic energies E_KE and impact parameters b, one must:

1. run classical trajectories (or quantum propagations) for a large number of J, v, E_KE, and b values,

2. with each such trajectory assigned an overall weighting (or importance) of P_total = P(E_KE) P(J,v) 2πb db.

3. Present Day Challenges in Statistical Mechanics

In addition to the scientists whose work is discussed explicitly below or later in this section, the reader is encouraged to visit the web sites of other leaders in the area of statistical mechanics. I should stress that the "dividing line" between chemical dynamics and statistical mechanics has become less clear in recent years, especially as dynamics theory has become more often applied to condensed-media systems containing large numbers of molecules. For this reason, one often finds researchers who are active in the chemical dynamics community producing important work in statistical mechanics and vice versa.

One of the most active research areas in statistical mechanics involves the evaluation of so-called time correlation functions. The correlation function C(t) is defined in terms of two physical operators A and B, a time dependence that is carried by a Hamiltonian H via exp(-iHt/ħ), and an equilibrium average over a Boltzmann population exp(-βH)/Q. The quantum mechanical expression for C(t) is

C(t) = Σ_j exp(-βE_j) ⟨ψ_j| A exp(iHt/ħ) B exp(-iHt/ħ) |ψ_j⟩ / Q,

while the classical mechanical expression is

C(t) = ∫ dq ∫ dp A(q(0),p(0)) B(q(t),p(t)) exp(-βH(q(0),p(0)))/Q,

where q(0) and p(0) are the values of all the coordinates and momenta of the system at t = 0 and q(t) and p(t) are their values, according to Newtonian mechanics, at time t.
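To make the classical expression concrete, the sketch below estimates C(t) = ⟨A(q(0),p(0)) B(q(t),p(t))⟩ by sampling initial conditions from the Boltzmann distribution and propagating each trajectory; the one-dimensional harmonic oscillator, the reduced units, and the choice A = B = q are assumptions introduced only for this illustration.

    import numpy as np

    # Classical C(t) = <A(q(0),p(0)) B(q(t),p(t))> over a Boltzmann ensemble.
    # Model assumptions (for illustration only): 1-D harmonic oscillator, A = B = q.
    m, omega, kT = 1.0, 1.0, 1.0          # reduced units
    rng = np.random.default_rng(0)
    n_traj = 100_000

    # Sample initial conditions from exp(-H/kT); for a harmonic oscillator both
    # q(0) and p(0) are Gaussian distributed.
    q0 = rng.normal(0.0, np.sqrt(kT / (m * omega**2)), n_traj)
    p0 = rng.normal(0.0, np.sqrt(m * kT), n_traj)

    times = np.linspace(0.0, 20.0, 200)
    C = np.empty_like(times)
    for i, t in enumerate(times):
        # Exact Newtonian propagation for the harmonic oscillator.
        qt = q0 * np.cos(omega * t) + (p0 / (m * omega)) * np.sin(omega * t)
        C[i] = np.mean(q0 * qt)           # ensemble average of A(0) B(t)

    print(C[:5])

For this particular model the estimate should approach (kT/mω²) cos(ωt), which provides a simple check on the sampling and the averaging.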
An example of a time correlation function that relates to molecular spectroscopy is the dipole-dipole correlation function:

C(t) = Σ_j exp(-βE_j) ⟨ψ_j| (E·μ) exp(iHt/ħ) (E·μ) exp(-iHt/ħ) |ψ_j⟩ / Q,

for which A and B are both the electric dipole interaction E·μ between the photon's electric field and the molecule's dipole operator. The Fourier transform of this particular C(t) relates to the absorption intensity for light of frequency ω:

I(ω) ∝ ∫ dt C(t) exp(iωt)

(a small numerical sketch of this relation appears at the end of this section). The computation of correlation functions involves propagating either wavefunctions or classical trajectories. In the quantum case, one faces many of the same difficulties discussed earlier in Sec. III.B.2. However, one aspect of the equilibrium averaging that is built into C(t) makes this case somewhat easier than the real-time quantum dynamics case. To illustrate, consider the time propagation issue contained in the quantum definition of C(t). One is faced with

1. propagating |ψ_j⟩ from t = 0 up to time t, using exp(-iHt/ħ)|ψ_j⟩, and then multiplying by B;

2. propagating A†|ψ_j⟩ from t = 0 up to time t, using exp(-iHt/ħ)A†|ψ_j⟩.

The exp(-βH) operator can be combined with the first time propagation step as follows:

exp(-iHt/ħ)|ψ_j⟩ exp(-βE_j)/Q = exp(-iHt/ħ) exp(-βH)|ψ_j⟩/Q = exp(-(i/ħ)[t + βħ/i]H)|ψ_j⟩/Q.

Doing so introduces a propagation in complex time from t = 0 + βħ/i to t = t + βħ/i. The Feynman path integral techniques can now be used to carry out this propagation. One begins, as before, by dividing the time interval into N discrete steps,

exp[-iHτ/ħ] = {exp[-iHτ/Nħ]}^N,

where τ = t + βħ/i, and then utilizing the same kind of short-time split propagator methodology described earlier in our discussion of quantum dynamics and Feynman path integrals. Unlike the real-time propagation case, which is plagued by having to evaluate sums of oscillatory functions, the complex-time propagations that arise in statistical mechanics (through the introduction of τ = t + βħ/i) are less problematic. That is, the quantity

exp[-iE_jτ/Nħ] = exp[-iE_jt/Nħ] exp[-E_jβ/N]

contains an exponential "damping factor" exp[-E_jβ/N] in the complex-time case that causes the evaluation of correlation functions to be less difficult than the evaluation of real-time wavefunction propagation. For a good overview of how time correlation functions relate to various molecular properties, I urge you to look at McQuarrie's textbook on Statistical Mechanics.

Other areas of active research in statistical mechanics tend to involve systems for which the deviations from "ideal behavior" (e.g., dilute gases whose constituents have weak intermolecular forces, nearly harmonic highly ordered solids, weakly interacting dilute liquid mixtures, species adsorbed to surfaces but not interacting with other adsorbed species, highly ordered polymeric materials) are large. Such systems include glasses, disordered solids, polymers that can exist in folded or disentangled arrangements, concentrated electrolyte solutions, solvated ions near electrode surfaces, and many more. In addition to those scientists already mentioned above, many other researchers are pursuing work aimed at applying statistical mechanics to such difficult problems.
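Returning to the dipole correlation function discussed above, here is the numerical sketch promised earlier: a model C(t) for a single damped oscillating transition dipole (an assumption made purely for illustration, standing in for dephasing and rotational averaging) is Fourier transformed to give an absorption profile peaked at the oscillation frequency.

    import numpy as np

    # Illustrative (model) dipole correlation function: one oscillation frequency
    # with an exponential decay; reduced units chosen only for this example.
    omega0, gamma = 5.0, 0.2
    dt, n_steps = 0.01, 8192
    t = np.arange(n_steps) * dt
    C = np.exp(-gamma * t) * np.cos(omega0 * t)

    # I(omega) is proportional to the Fourier transform of C(t).
    I = np.abs(np.fft.rfft(C)) * dt
    omega = 2.0 * np.pi * np.fft.rfftfreq(n_steps, d=dt)

    print("peak absorption near omega =", omega[np.argmax(I)])  # expect ~ omega0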