Comments
1. $S$, $V$, $E$, $N$ are all extensive and occur only in the combinations $s = S/N$, $v = V/N$, $e = E/N$ ('specific quantities').
2. The exponent $3/2$ reflects the fact that each particle has 3 degrees of freedom.
3. With $\lambda := \left(\frac{3h^2}{4\pi m e}\right)^{1/2}$ ('thermal de Broglie wavelength') we get:
\[
s = k_B \left[ \ln\frac{v}{\lambda^3} + \frac{5}{2} \right]
\]
4. This first non-trivial result of statistical physics was already known in thermodynamics as the 'Sackur–Tetrode equation'. It has been impressively verified in experiments.
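As a quick plausibility check, the Sackur–Tetrode formula can be evaluated numerically. The sketch below is an illustration, not part of the original text: it uses the equilibrium relation $e = \frac{3}{2} k_B T$, so that $\lambda$ becomes the familiar $h/\sqrt{2\pi m k_B T}$, and assumes helium at room temperature and atmospheric pressure. The resulting $s/k_B \approx 15.2$ agrees with the measured entropy of helium.

```python
import math

# Illustrative check of the Sackur-Tetrode equation for helium
# at T = 300 K and p = 1 atm (values assumed for illustration).
h  = 6.62607015e-34          # Planck constant [J s]
kB = 1.380649e-23            # Boltzmann constant [J/K]
m  = 4.0026 * 1.66054e-27    # mass of a helium atom [kg]
T, p = 300.0, 101325.0

lam = h / math.sqrt(2 * math.pi * m * kB * T)  # thermal de Broglie wavelength
v = kB * T / p                                 # volume per particle (ideal gas)
s_over_kB = math.log(v / lam**3) + 2.5         # entropy per particle in units of kB

print(s_over_kB)  # ~15.2, close to the measured value for helium
```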
In equilibrium, $S = k_B \ln \Omega$ must be maximal:
\[
dS = \frac{\partial S_1}{\partial E_1^0}\, dE_1^0 + \frac{\partial S_2}{\partial E_2^0} \underbrace{dE_2^0}_{=-dE_1^0}
= \underbrace{\left( \frac{\partial S_1}{\partial E_1^0} - \frac{\partial S_2}{\partial E_2^0} \right)}_{=0} dE_1^0 \overset{!}{=} 0
\]
We define a new state variable $T$ ('temperature') by:
\[
\left.\frac{\partial S(E,V,N)}{\partial E}\right|_{V,N} := \frac{1}{T(E,V,N)}
\]
\[
\Rightarrow\quad \left( \frac{1}{T_1} - \frac{1}{T_2} \right) dE_1^0 = 0
\quad\Rightarrow\quad T_1 = T_2
\]
The two systems exchange energy until their temperatures are the same.
Usually the number of states $\Omega$, and therefore the entropy $S = k_B \ln \Omega$, increases with energy $E$; therefore $1/T$, and with it $T$, will be positive (e.g. $S \sim \ln E^{3/2}$ for the ideal gas, see above). There are, however, completely reasonable models in statistical physics in which $\Omega$ decreases with $E$, namely when the number of states has an upper limit, e.g. in a finite-sized spin system or the two-state system discussed below. Then we formally get a negative temperature. Although this does not agree with our everyday intuition about temperature, there is nothing wrong with such a result.
$S(E)$ usually flattens with increasing energy (e.g. $S \sim \ln E^{3/2}$ for the ideal gas, see above). This implies that high energy corresponds to high temperature, in agreement with our everyday intuition about temperature.
In general, temperature $T$ describes the coupling between energy and entropy: inverse temperature is the cost in entropy when buying a unit of energy from the environment.
Due to the equipartition theorem (compare the chapter on the canonical ensemble), temperature is often identified with kinetic energy; however, the temperature definition of statistical physics given above is much more general.
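For the ideal gas, this definition can be checked symbolically. The following sketch (an illustration, not part of the original text) differentiates the Sackur–Tetrode entropy with sympy and recovers $E = \frac{3}{2} N k_B T$, consistent with equipartition:

```python
import sympy as sp

N, kB, V, m, h, E = sp.symbols('N k_B V m h E', positive=True)
# Sackur-Tetrode entropy S(E, V, N) of the ideal gas
S = N*kB*(sp.log(V/N * (4*sp.pi*m*E/(3*N*h**2))**sp.Rational(3, 2))
          + sp.Rational(5, 2))
inv_T = sp.simplify(sp.diff(S, E))  # 1/T = dS/dE at fixed V, N
print(inv_T)                        # equals 3*N*k_B/(2*E), i.e. E = (3/2) N kB T
```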
Before equilibrium is reached, entropy grows:
\[
dS = \left( \frac{1}{T_1} - \frac{1}{T_2} \right) dE_1 > 0, \qquad T_1 > T_2 \;\Rightarrow\; dE_1 < 0
\]
Hence we see that energy flows to the cooler system. Temperature defined this way agrees with our intuitive understanding of temperature.
If the two systems only exchange energy, then:
\[
dE_i = T_i\, dS_i
\]
\[
\Rightarrow\quad dS_2 = \frac{dE_2}{T_2} = -\frac{dE_1}{T_2} = -\frac{T_1}{T_2}\, dS_1
\]
\[
\Rightarrow\quad dS = dS_1 + dS_2 = dS_1 \left( 1 - \frac{T_1}{T_2} \right) > 0 \quad \text{before equilibrium is reached}
\]
\[
T_1 > T_2 \;\Rightarrow\; dS_1 < 0,\quad dS_2 > 0,\quad |dS_2| > |dS_1|
\]
The warmer system loses entropy, the cooler system gains entropy. Overall more entropy is generated.
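The approach to equilibrium can be illustrated with a toy model (an illustrative sketch, not from the original text): two ideal-gas-like systems with $S_i = C_i \ln E_i$, so that $T_i = E_i/C_i$, repeatedly exchange a small amount of energy flowing from hot to cold. The total entropy grows monotonically until the temperatures agree:

```python
import math

# Toy model: S_i = C_i * ln(E_i)  =>  1/T_i = dS_i/dE_i = C_i/E_i, i.e. T_i = E_i/C_i.
C1, C2 = 1.5, 3.0        # heat capacities (arbitrary illustrative units)
E1, E2 = 9.0, 3.0        # initial energies: T1 = 6 > T2 = 1

def total_S(E1, E2):
    return C1 * math.log(E1) + C2 * math.log(E2)

S_prev = total_S(E1, E2)
for _ in range(10_000):
    T1, T2 = E1 / C1, E2 / C2
    dE = 1e-3 * (T1 - T2)            # energy flows out of the hotter system
    E1, E2 = E1 - dE, E2 + dE
    S_now = total_S(E1, E2)
    assert S_now >= S_prev - 1e-12   # entropy never decreases
    S_prev = S_now

print(E1 / C1, E2 / C2)  # both close to the common final temperature 8/3
```

Energy conservation fixes the final temperature: $(C_1 + C_2)T = E_1 + E_2 = 12$, hence $T = 8/3$.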
$\Rightarrow$ entropy is not conserved (unlike energy $E$)

Contact with volume exchange
We now assume that the wall is also mobile, so that volume can be exchanged:
\[
dS = \frac{\partial S_1}{\partial E_1^0}\, dE_1^0 + \frac{\partial S_2}{\partial E_2^0} \underbrace{dE_2^0}_{=-dE_1^0}
+ \frac{\partial S_1}{\partial V_1}\, dV_1 + \frac{\partial S_2}{\partial V_2} \underbrace{dV_2}_{=-dV_1}
\]
We define another new state variable $p$ ('pressure') by:
\[
\left.\frac{\partial S(E,V,N)}{\partial V}\right|_{E,N} = \frac{p(E,V,N)}{T(E,V,N)}
\]
\[
\Rightarrow\quad dS = \underbrace{\left( \frac{1}{T_1} - \frac{1}{T_2} \right)}_{=0} dE_1
+ \underbrace{\left( \frac{p_1}{T_1} - \frac{p_2}{T_2} \right)}_{=0} dV_1 \overset{!}{=} 0
\]
\[
\Rightarrow\quad T_1 = T_2, \quad p_1 = p_2
\]
Volume is exchanged until the pressures are the same.
If temperatures are equal:
\[
dS = \frac{p_1 - p_2}{T}\, dV_1 > 0
\]
The system with the larger pressure increases its volume.
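As a consistency check (an illustrative sketch, not in the original text), applying the pressure definition to the Sackur–Tetrode entropy immediately yields the ideal gas law:

```python
import sympy as sp

N, kB, V, m, h, E = sp.symbols('N k_B V m h E', positive=True)
# Sackur-Tetrode entropy S(E, V, N) of the ideal gas
S = N*kB*(sp.log(V/N * (4*sp.pi*m*E/(3*N*h**2))**sp.Rational(3, 2))
          + sp.Rational(5, 2))
p_over_T = sp.simplify(sp.diff(S, V))  # p/T = dS/dV at fixed E, N
print(p_over_T)                        # equals N*k_B/V, i.e. p V = N kB T
```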
This definition of pressure might seem a bit odd to you because it relates to entropy and not to energy, as you might have expected. Therefore we aim to rewrite it in terms of energy. We start with a mathematical identity for any smooth function $f(x,y)$ that is explained in the appendix:
\[
\left.\frac{\partial f}{\partial x}\right|_{y} \left.\frac{\partial x}{\partial y}\right|_{f} \left.\frac{\partial y}{\partial f}\right|_{x} = -1
\]
Note that this result might look odd if you like to think of differentials as real numbers that you can cancel as in fractions. This result teaches you not to do this, and reflects the fact that for two positive changes you need one negative change to close a loop in $(f,x,y)$-space. We now apply this formula to $S(E,V,N)$:
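The identity can be verified for a concrete example. The sketch below (illustrative; the sample function $f(x,y) = xy$ is an assumption, any smooth invertible choice works) solves the relation for each variable in turn and multiplies the three partial derivatives:

```python
import sympy as sp

x, y, fv = sp.symbols('x y f', positive=True)
f = x * y                               # sample smooth function f(x, y)
df_dx = sp.diff(f, x)                   # (df/dx)|_y = y
x_of = sp.solve(sp.Eq(fv, f), x)[0]     # x(f, y) = f/y
dx_dy = sp.diff(x_of, y)                # (dx/dy)|_f = -f/y**2
y_of = sp.solve(sp.Eq(fv, f), y)[0]     # y(f, x) = f/x
dy_df = sp.diff(y_of, fv)               # (dy/df)|_x = 1/x
product = df_dx * dx_dy.subs(fv, f) * dy_df.subs(fv, f)
print(sp.simplify(product))             # -1
```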
\[
\left.\frac{\partial S}{\partial V}\right|_{E,N} \left.\frac{\partial V}{\partial E}\right|_{S,N} \left.\frac{\partial E}{\partial S}\right|_{V,N} = -1
\]
\[
\Rightarrow\quad \frac{p}{T} \left( \left.\frac{\partial E}{\partial V}\right|_{S,N} \right)^{-1} \left( \left.\frac{\partial S}{\partial E}\right|_{V,N} \right)^{-1} = -1
\]
Noting that the last term simply gives $T$, we finally have
\[
p = -\left.\frac{\partial E}{\partial V}\right|_{S,N}
\]
Therefore pressure $p$ can also be interpreted as the increase in energy when reducing volume. This is closer to our intuition about pressure, but note that this should be done at constant entropy, which basically means without heat flux.
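This can again be made concrete for the ideal gas (an illustrative sketch, not in the original text): solving the Sackur–Tetrode equation for $E(S,V,N)$ and differentiating at constant entropy gives $p = \frac{2}{3} E/V$, which is equivalent to $pV = N k_B T$ once $E = \frac{3}{2} N k_B T$ is used:

```python
import sympy as sp

S, N, kB, V, m, h = sp.symbols('S N k_B V m h', positive=True)
# Sackur-Tetrode equation solved for E(S, V, N):
E = (3*N*h**2/(4*sp.pi*m)) * (N/V)**sp.Rational(2, 3) \
    * sp.exp(2*S/(3*N*kB) - sp.Rational(5, 3))
p = -sp.diff(E, V)             # p = -dE/dV at fixed S, N
print(sp.simplify(p * V / E))  # 2/3, i.e. p = (2/3) E / V
```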
Contact with exchange of particle number
Finally, let's assume a permeable membrane and define a new state variable $\mu$ ('chemical potential') by:
\[
\left.\frac{\partial S(E,V,N)}{\partial N}\right|_{E,V} = -\frac{\mu(E,V,N)}{T(E,V,N)}
\]
The equilibrium condition becomes:
\[
\mu_1(E_1,V_1,N_1) = \mu_2(E_2,V_2,N_2)
\]
Assume $T_1 = T_2$, but $\mu_2 > \mu_1$:
\[
\Rightarrow\quad dS = \frac{(\mu_2 - \mu_1)\, dN_1}{T} > 0 \quad\Rightarrow\quad dN_1 > 0
\]
$\Rightarrow$ particles flow from large to small chemical potential.
Again it might be more intuitive to rewrite the definition of the chemical potential in terms of energy rather than entropy. We now apply our mathematical relation to $S(E,N,V)$:
\[
\left.\frac{\partial S}{\partial N}\right|_{E,V} \left.\frac{\partial N}{\partial E}\right|_{S,V} \left.\frac{\partial E}{\partial S}\right|_{N,V} = -1
\]
\[
\Rightarrow\quad -\frac{\mu}{T} \left( \left.\frac{\partial E}{\partial N}\right|_{S,V} \right)^{-1} \left( \left.\frac{\partial S}{\partial E}\right|_{V,N} \right)^{-1} = -1
\]
The last term again cancels the $T$ in the first term and we thus get the final result
\[
\mu = \left.\frac{\partial E}{\partial N}\right|_{S,V}
\]
This means that the chemical potential is the energy cost of increasing the particle number (at constant $S$ and $V$, that is, without heat flux and at constant volume).
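For the ideal gas, the definition yields an explicit expression (an illustrative sketch, not in the original text): $\mu = -T\,\partial S/\partial N = k_B T \ln(n \lambda^3)$ with density $n = N/V$, so the chemical potential is negative as long as the gas is in the classical regime $n\lambda^3 \ll 1$:

```python
import sympy as sp

N, kB, V, m, h, E, T = sp.symbols('N k_B V m h E T', positive=True)
# Sackur-Tetrode entropy S(E, V, N) of the ideal gas
S = N*kB*(sp.log(V/N * (4*sp.pi*m*E/(3*N*h**2))**sp.Rational(3, 2))
          + sp.Rational(5, 2))
mu = sp.simplify(-T * sp.diff(S, N))  # mu = -T dS/dN at fixed E, V
print(mu)  # equals -kB*T*ln(V/N * (4 pi m E/(3 N h^2))^(3/2)) = kB*T*ln(n*lambda^3)
```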
Equations of state
We note that the three newly introduced variables
\[
T = T(E,V,N), \quad p = p(E,V,N), \quad \mu = \mu(E,V,N)
\]
defined by
\[
dS = \frac{1}{T}\, dE + \frac{p}{T}\, dV - \frac{\mu}{T}\, dN
\]
are intensive, that is, their values do not change if the system is doubled, because otherwise $S$ could not be extensive.
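This scaling behavior can be verified directly (an illustrative sketch, not in the original text): replacing $(E,V,N) \to (aE, aV, aN)$ in the Sackur–Tetrode entropy, $S$ picks up exactly one factor of $a$ (extensive), while $1/T = \partial S/\partial E$ and $p/T = \partial S/\partial V$ stay unchanged (intensive):

```python
import sympy as sp

a, N, kB, V, m, h, E = sp.symbols('a N k_B V m h E', positive=True)
# Sackur-Tetrode entropy S(E, V, N) of the ideal gas
S = N*kB*(sp.log(V/N * (4*sp.pi*m*E/(3*N*h**2))**sp.Rational(3, 2))
          + sp.Rational(5, 2))
scale = {E: a*E, V: a*V, N: a*N}  # rescale the whole system by a factor a
print(sp.simplify(S.subs(scale) - a*S))                        # 0: S extensive
print(sp.simplify(sp.diff(S, E).subs(scale) - sp.diff(S, E)))  # 0: 1/T intensive
print(sp.simplify(sp.diff(S, V).subs(scale) - sp.diff(S, V)))  # 0: p/T intensive
```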
Rearranging the equation above for $dE$ gives:
\[
dE = T\, dS - p\, dV + \mu\, dN
\]
The pairs $(T,S)$, $(-p,V)$ and $(\mu,N)$ are 'conjugate' variables with regard to energy. We identify the three types of energy as heat, mechanical energy and chemical energy.
$S = S(E,V,N)$ is the 'fundamental equation', containing the complete information on the system. The three equations for $T$, $p$ and $\mu$ are 'equations of state'. Each by itself contains only incomplete information on the system. Typically the equations of state are experimentally accessible and thus ground our theory in experiments. If only some of them are known, the others have to be guessed based on additional information (e.g. a model). Moreover, thermodynamic relations give strong additional constraints on possible equations of state (see other chapter).