Let $(M,\omega)$ be a symplectic manifold of dimension $2n$, and let $H\colon M\to\mathbb{R}$ be a classical Hamiltonian. The symplectic form $\omega$ allows us to define a measure on $M$, given by integration against the top form $\omega^n/n!$. We will denote this measure by $d\mu$.
We imagine that $(M,\omega,H)$ represents some classical mechanical system. We suppose that the dynamics of this system are very complicated, e.g. a system of $10^{23}$ particles. The system is so complicated that not only can we not solve the equations of motion exactly, but even if we could, their solutions might be so complicated that we couldn't expect to learn very much from them.
So instead, we ask statistical questions. Imagine that we cannot measure the state of the system exactly (e.g. particles in a box), so we try to guess a probability distribution $\rho(x,p,t)$ on $M$, indicating that at time $t$ the system has probability $\rho(x,p,t)\,d\mu$ of being in the state $(x,p)$. Obviously, $\rho$ should satisfy the constraint $\int_M \rho\, d\mu = 1$.
How does ρ evolve in time? We know that the system obeys Hamilton's equations,
\[
(\dot{x},\dot{p}) = X_H = \left(\frac{\partial H}{\partial p},\, -\frac{\partial H}{\partial x}\right)
\]
in local Darboux coordinates. A particle located at $(x,p)$ in phase space at time $t$ will therefore be located at $(x,p) + X_H\,dt$ at time $t+dt$. Consequently, the probability that a particle is at the point $(x,p)$ at time $t+dt$ should equal the probability that it was at the point $(x,p) - X_H\,dt$ at time $t$. Therefore, we have
\[
\frac{\partial \rho}{\partial t} = \frac{\partial H}{\partial x}\frac{\partial \rho}{\partial p} - \frac{\partial H}{\partial p}\frac{\partial \rho}{\partial x} = \{H,\rho\}
\]
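To spell out the step behind this, here is the first-order Taylor expansion, using only Hamilton's equations above:
\[
\rho(x,p,t+dt) = \rho\big((x,p) - X_H\,dt,\, t\big)
= \rho(x,p,t) - \frac{\partial \rho}{\partial x}\,\dot{x}\,dt - \frac{\partial \rho}{\partial p}\,\dot{p}\,dt + O(dt^2),
\]
so that, substituting $\dot{x} = \partial H/\partial p$ and $\dot{p} = -\partial H/\partial x$,
\[
\frac{\partial \rho}{\partial t} = -\frac{\partial \rho}{\partial x}\frac{\partial H}{\partial p} + \frac{\partial \rho}{\partial p}\frac{\partial H}{\partial x} = \{H,\rho\}.
\]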
Given a probability distribution ρ, the entropy is defined to be
\[
S[\rho] = -\int_M \rho \log \rho \, d\mu.
\]
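As a quick illustrative example (a standard computation, added here for orientation): if $\rho$ is the uniform distribution on a region $A \subset M$ of finite measure $\mu(A) = V$, i.e. $\rho = \mathbf{1}_A/V$, then
\[
S[\rho] = -\int_A \frac{1}{V}\log\frac{1}{V}\, d\mu = \log V,
\]
so the entropy grows with the phase-space volume over which the distribution is spread.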
(A version of) the second law of thermodynamics. For a given average energy U, the system assumes a distribution of maximal possible entropy at thermodynamic equilibrium.
The goal now is to determine what distribution $\rho$ will maximize the entropy, subject to the constraints (for fixed $U$)
\[
\int_M H\rho \, d\mu = U, \qquad \int_M \rho \, d\mu = 1.
\]
Setting aside technical issues of convergence, etc., this variational problem is easily solved using the method of Lagrange multipliers. Introducing parameters $\lambda_1, \lambda_2$, we consider the modified functional
\[
S[\rho,\lambda_1,\lambda_2,U] = \int_M \bigl(-\rho\log\rho + \lambda_1\rho + \lambda_2 H\rho\bigr)\, d\mu - \lambda_1 - \lambda_2 U.
\]
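As a sanity check on the setup (not spelled out in the original): requiring the modified functional to be stationary in the multipliers recovers the two constraints,
\[
\frac{\partial S}{\partial \lambda_1} = \int_M \rho\, d\mu - 1 = 0, \qquad
\frac{\partial S}{\partial \lambda_2} = \int_M H\rho\, d\mu - U = 0.
\]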
Note that $\partial S/\partial U = -\lambda_2$, so $\lambda_2$ is conventionally identified with minus the inverse temperature.
Taking the variation with respect to ρ, we find
\[
0 = \frac{\delta S}{\delta \rho} = -\log\rho - 1 + \lambda_1 + \lambda_2 H.
\]
Therefore $\rho$ is proportional to $e^{-\beta H}$, where we have set $\beta = -\lambda_2$. Define the partition function $Z$ to be
\[
Z = \int_M e^{-\beta H}\, d\mu.
\]
We therefore have proved (formally and heuristically only!):
Theorem. The probability distribution ρ assumed by the system at thermodynamic equilibrium is given by
\[
\rho = \frac{e^{-\beta H}}{Z},
\]
where β>0 is a real parameter, called the inverse temperature.
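As a concrete illustration (an example added here, not part of the original derivation), take $M = \mathbb{R}^2$ with $\omega = dx \wedge dp$ and the harmonic-oscillator Hamiltonian $H = \frac{p^2}{2m} + \frac{1}{2} m \omega_0^2 x^2$. The partition function is a Gaussian integral,
\[
Z = \int_{\mathbb{R}^2} e^{-\beta H}\, dx\, dp
= \sqrt{\frac{2\pi m}{\beta}} \cdot \sqrt{\frac{2\pi}{\beta m \omega_0^2}}
= \frac{2\pi}{\beta \omega_0},
\]
and the equilibrium distribution $\rho = e^{-\beta H}/Z$ is a Gaussian in both $x$ and $p$.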
Corollary. At thermodynamic equilibrium, the average energy is given by
\[
U = -\frac{\partial \log Z}{\partial \beta},
\]
and the entropy is given by
\[
S = \beta U + \log Z.
\]
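These identities follow by a short computation from the theorem; the following sketch fills in the steps, using only the definitions above:
\[
-\frac{\partial \log Z}{\partial \beta}
= -\frac{1}{Z}\frac{\partial Z}{\partial \beta}
= \frac{1}{Z}\int_M H\, e^{-\beta H}\, d\mu
= \int_M H \rho \, d\mu = U,
\]
and, since $\log\rho = -\beta H - \log Z$,
\[
S = -\int_M \rho \log\rho\, d\mu
= \int_M \rho\,(\beta H + \log Z)\, d\mu
= \beta U + \log Z.
\]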