
Invitation to Contemporary Physics (2004)

General Concepts in Quantum Physics
position, its x-coordinate say, is determined to within an interval of uncertainty ∆x and if the corresponding x-component of its momentum p is determined simultaneously to within an interval of uncertainty ∆p, then the Heisenberg uncertainty principle asserts that the product of these two uncertainties will be greater than or equal to an irreducible minimum: ∆x∆p ≥ ℏ/2, where ℏ is Planck's constant divided by 2π (≈ 10⁻³⁴ Joule-second). The very act of measurement involved in determining any one of them perturbs the other sufficiently so as to satisfy the above inequality, often by a wide margin, no matter how carefully and cleverly we design the measuring apparatus. This reciprocal latitude of fixation, or shall we say this 'frustration of errors,' has observable consequences. For a particle of mass m localized within a box measuring ∆x on the side, the uncertainty of momentum would be of the order of ℏ/∆x. The associated 'zero-point' kinetic energy would be about (ℏ/∆x)²/2m. Thus, for ∆x ≈ 1 Å, the 'zero-point' energy for an electron is about 3 eV (1 electron-Volt = 1.6 × 10⁻¹⁹ Joule). This will exert a pressure on the walls of the cube of several million times atmospheric pressure. The lighter the particle, the greater is the zero-point energy. It is precisely this zero-point energy that prevents the inert gases of light atoms from solidifying even at the absolute zero of temperature. Thus, helium remains liquid down to the lowest temperatures known. True, hydrogen, the lightest of all elements, does form a solid at low enough temperatures, but the reason is that the attractive potential between two hydrogen atoms is sufficiently strong to 'contain' this zero-point motion. For inert atoms like those of helium, the attraction is relatively much too weak. Again, it is this zero-point energy that prevents the electron in a hydrogen atom from collapsing onto the proton and staying stuck there. The uncertainty principle holds for other pairs of dynamical variables too, for example, energy and life-time, or angle and angular momentum.
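The zero-point estimate above is easily checked numerically; the short Python sketch below uses standard values of ℏ and the electron mass and simply evaluates (ℏ/∆x)²/2m for ∆x = 1 Å.

```python
# Zero-point energy estimate for a particle confined to a box of side dx:
# momentum uncertainty p ~ hbar/dx, so E ~ (hbar/dx)^2 / 2m.
hbar = 1.0546e-34   # reduced Planck constant, J*s
m_e  = 9.109e-31    # electron mass, kg
eV   = 1.602e-19    # one electron-volt in Joules

dx = 1e-10          # 1 angstrom
E = (hbar / dx)**2 / (2 * m_e)
print(f"Zero-point energy ~ {E / eV:.1f} eV")   # a few eV, as estimated in the text
```

The result comes out near 4 eV; the text's "about 3 eV" is the same order-of-magnitude estimate, which is all the uncertainty argument claims.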
B.2 Wave Function and Probability: State of the System
Having renounced the classical mechanical idea of sharply defined trajectories, the state of the particle is now given by a quantum mechanical wave function ψ(r, t), which is a complex function of position and time, i.e., ψ = |ψ|e^{iθ}. The wave function has a statistical interpretation: |ψ(r, t)|² gives the probability (density) of finding the particle at the point r at time t, if such a measurement is made. The associated probability current (density) is given by (ℏ/m) · |ψ|² · (gradient of the phase θ). The wave function is to be determined in any specific case by solving a differential equation, the Schrödinger equation, which now replaces Newton's equations of motion. The point is that ψ contains all the observable information about the particle's motion. Thus, the classical observables like energy, momentum, angular momentum, etc., retain their usual meaning, but are now obtained from the wave function through an unusual set of rules prescribed by quantum mechanics.

For the simplest case of a freely moving particle, the wave function is a plane wave, ψ ∝ e^{ip·r/ℏ}. It has a well-defined momentum p and therefore, as demanded by quantum mechanics, its position is completely uncertain, i.e., |ψ|² is the same at all points. This 'matter wave,' or de Broglie wave, carries an energy p²/2m as in classical particle mechanics. It has a wavelength h/p. In point of fact, except for its probabilistic significance, this wave is like any other wave motion. Thus, for a free electron of energy 1 eV, the de Broglie wavelength is about 12 Å, much shorter than the wavelength of ordinary light. A shorter wavelength means a higher spatial resolution. This is the idea underlying the high-resolution electron microscope, where light is replaced by high-energy electrons.
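The 12 Å figure follows directly from λ = h/p with p = √(2mE); a minimal Python check, using standard constants:

```python
import math

h   = 6.626e-34   # Planck's constant, J*s
m_e = 9.109e-31   # electron mass, kg
eV  = 1.602e-19   # J

E = 1.0 * eV                  # kinetic energy of the free electron
p = math.sqrt(2 * m_e * E)    # nonrelativistic momentum from E = p^2/2m
lam = h / p                   # de Broglie wavelength, lambda = h/p
print(f"lambda = {lam * 1e10:.1f} angstrom")   # ~12 angstrom
```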
For the not-so-simple case of a hydrogen atom, where an electron is bound to the proton by Coulomb attraction, much the same way as planets are gravitationally bound to the sun, the solution of the Schrödinger equation gives stationary states that roughly correspond to the elliptical orbits of the planets around the sun. The set of these stationary states is, however, discrete, corresponding to only certain allowed values of energy, angular momentum and its component along a chosen direction. Accordingly, these states are conveniently labeled by certain integers, called quantum numbers. Thus, we write ψ_{nℓm}. Here n (= 1, 2, . . .) is the principal quantum number giving the energy E_n = −1/n² in Rydberg units (1 Rydberg = 13.6 eV); ℓ (= 0, 1, 2, . . . , n − 1) is the angular momentum quantum number and gives the angular momentum in units of ℏ; m (= −ℓ, −ℓ + 1, . . . , ℓ − 1, ℓ) is the component of angular momentum along any arbitrarily chosen axis. In addition to these negative-energy (i.e., bound) states, we also have the positive-energy 'scattering' states having a continuous range of energy from zero to infinity — these would correspond to the case of some comets that are merely deflected by the sun into open hyperbolic orbits, and not bound by it.
The same principles, of course, apply to more complex atoms, molecules, bulk matter and sub-nuclear matter. Quantum mechanics has unrestricted validity. Thus, the electromagnetic waves (light), classically described by Maxwell's equations, must also be 'quantized.' The resulting quantum is a 'packet' of energy called a photon. If λ is the wavelength (with frequency ν) of the light, then the photon carries an energy hc/λ, or hν, where c is the speed of light. It also carries a momentum h/λ. Interaction of quantized radiation with matter involves absorption (emission) of a radiation quantum hν accompanied by an electronic transition from a lower (higher) energy state 1 (2) to a higher (lower) energy state 2 (1), such that hν = E2 − E1 (E1 − E2). This is the origin of the energy 'spectrum' characteristic of atoms, molecules, etc. Similarly, the sound-wave-like oscillatory motion of atoms in a solid must also be quantized — the resulting quantum is the phonon.
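Combining hν = E2 − E1 with the hydrogen levels of the previous paragraph gives a concrete spectral line. The sketch below computes the photon wavelength for the hydrogen 2 → 1 transition (the well-known Lyman-α line) from λ = hc/(E2 − E1).

```python
h  = 6.626e-34    # Planck's constant, J*s
c  = 2.998e8      # speed of light, m/s
eV = 1.602e-19    # J

# Hydrogen: E_n = -13.6/n^2 eV, so the 2 -> 1 transition releases
# 13.6 * (1 - 1/4) = 10.2 eV.
dE = 13.6 * (1.0 - 1.0 / 4.0) * eV
lam = h * c / dE                    # photon wavelength, lambda = hc/E
print(f"{lam * 1e9:.1f} nm")        # ~121.5 nm, in the ultraviolet
```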
Certain processes, forbidden classically, can take place quantum mechanically. Thus, a particle having an energy which is less than a potential barrier will be classically reflected back by it — it cannot escape over the barrier. The all-pervasive waviness (or fuzziness) of the quantum wave function enables the particle to take

the barrier, so to speak, in its stride — it can 'tunnel' through it even at sub-barrier energies. This is what enables an electron to jump across thin insulating layers separating metallic or superconducting electrodes.
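The exponential sensitivity of tunneling to barrier thickness can be illustrated with the standard WKB-style estimate T ≈ exp(−2κd), with κ = √(2m(V − E))/ℏ; the function name and the numbers chosen below (an electron 1 eV below the barrier top) are illustrative only, not taken from the text.

```python
import math

hbar = 1.0546e-34   # J*s
m_e  = 9.109e-31    # electron mass, kg
eV   = 1.602e-19    # J

def tunnel_probability(barrier_height_eV, energy_eV, width_m):
    """Rough rectangular-barrier estimate T ~ exp(-2*kappa*d),
    with kappa = sqrt(2m(V - E))/hbar."""
    kappa = math.sqrt(2 * m_e * (barrier_height_eV - energy_eV) * eV) / hbar
    return math.exp(-2 * kappa * width_m)

# Electron 1 eV below the barrier top: a 1 nm insulating layer is nearly
# opaque, while a 0.2 nm layer passes an appreciable fraction.
print(tunnel_probability(2.0, 1.0, 1e-9))
print(tunnel_probability(2.0, 1.0, 2e-10))
```

This steep thickness dependence is why tunnel junctions use insulating layers only a few atoms thick.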
B.3 Superposition of States and Wave Interference
This is a characteristic feature of quantum mechanical waves. If ψ1 and ψ2 are two possible states, then their linear combination (superposition) a1ψ1 + a2ψ2 is also a possible state. It is clear that |ψ|² will then contain a cross-term (interference term) whose sign will depend on the relative phase of the two complex components, leading to constructive or destructive interference. Thus, these probability waves interfere and diffract just as any other wave; hence the phenomenon of electron or neutron diffraction, which is of great practical use in studying crystal structures — the latter act as diffraction gratings. It is important to note that a particle-wave interferes with itself — in a Young's double-slit experiment the electron can propagate through the two slits as alternatives, and the interference is between these two alternatives.
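The cross-term is easy to exhibit directly. The sketch below adds two unit-amplitude alternatives with a relative phase and evaluates |ψ1 + ψ2|², which swings between 4 (constructive) and 0 (destructive), while the classical sum of probabilities would always be 2.

```python
import cmath

def intensity(phase_difference):
    """|psi1 + psi2|^2 for two unit-amplitude alternatives
    differing only by a relative phase."""
    psi1 = 1.0
    psi2 = cmath.exp(1j * phase_difference)
    return abs(psi1 + psi2) ** 2

print(intensity(0.0))        # 4.0: cross-term adds (constructive)
print(intensity(cmath.pi))   # ~0.0: cross-term cancels (destructive)
```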
B.4 Indistinguishability of Identical Particles
Classically, identical particles remain distinguishable, even if only by virtue of their being spatially separated. For, in principle, we can keep track of which is which by following their trajectories continuously. This is obviously not allowed in quantum mechanics. Thus, if ψ(r1, r2) is the wave function of two identical particles, then |ψ(r1, r2)|² only tells us the probability density for finding one of them at r1 and the other at r2, without specifying which one. It follows then that under interchange of the labels 1 and 2, the wave function either remains the same (symmetrical) or changes sign (anti-symmetrical). Particles obeying the 'symmetrical statistics' are called bosons, and those obeying the 'anti-symmetrical statistics' are called fermions. For identical fermions, not more than one particle can occupy a given one-particle state. For bosons, any occupation number is permitted. It turns out that particles having half-integral intrinsic spin angular momentum (1/2, 3/2, . . .) in units of ℏ are fermions, e.g., the electron, the isotope ³He, etc. Particles with integral spin are bosons, e.g., the photon, ⁴He, etc. Here spin refers to an intrinsic angular momentum that a particle may have. This may be likened to a particle spinning about its axis, just as the earth spins about its axis.
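The Pauli exclusion just stated falls straight out of anti-symmetrization. The toy sketch below (the one-particle amplitudes are arbitrary illustrative functions, not physical wave functions) builds symmetric and anti-symmetric combinations of a two-particle product state and shows the anti-symmetric one vanishes identically when both fermions are put into the same state.

```python
def symmetric(f, r1, r2):
    """Bosonic combination: unchanged under interchange of labels."""
    return f(r1, r2) + f(r2, r1)

def antisymmetric(f, r1, r2):
    """Fermionic combination: changes sign under interchange of labels."""
    return f(r1, r2) - f(r2, r1)

def product(a, b):
    """Product state: particle 1 in one-particle state a, particle 2 in b."""
    def f(r1, r2):
        return a(r1) * b(r2)
    return f

a = lambda r: r          # toy one-particle amplitudes (illustrative only)
b = lambda r: r ** 2

# Two fermions in the SAME state: the anti-symmetric wave function vanishes.
print(antisymmetric(product(a, a), 0.3, 0.7))   # 0.0
# Two fermions in DIFFERENT states: a nonzero wave function survives.
print(antisymmetric(product(a, b), 0.3, 0.7))
```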
This indistinguishability (symmetry or anti-symmetry) under interchange gives rise to purely quantum effects which are pronounced when the amplitude of 'zero-point' motion is comparable with the inter-particle spacing. This happens for electrons in metals, or for helium atoms in liquid helium. We call these quantum fluids. Superconductivity or superfluidity of these fluids is an important consequence of

quantum statistics. The same indistinguishability also gives rise to the ‘exchange interaction’ responsible for magnetic ordering.
B.5 Quantum Mechanics and Naive Realism
Quantum mechanics not only forbids the simultaneous measurement of position and momentum (and, therefore, of trajectories), it even denies their objective existence, independent of our measurement, as physically meaningless. Thus, prior to measurement, an electron in a room remains in a state of 'potentiality' (the probabilistic wave function), which 'collapses' to a definite point when it is detected on a photographic plate, say. All this is philosophically very disturbing. Attempts have been made to introduce variables 'hidden' from our reckoning that make the observed particles behave apparently probabilistically, while the entire system of particle-plus-hidden-variables is deterministic in the spirit of classical statistical mechanics. But crucial experiments have so far ruled out all such 'hidden variable' theories. There are paradoxes. There is philosophical uneasiness. But the detailed agreement of quantum predictions with experiments has muted much of the criticism.


Thermal Physics and Statistical Mechanics
orderliness (or the lack of it) of the constituents of matter. This is called entropy, and is usually denoted by S: a substance in a state with a larger amount of entropy is less orderly than when it is in a state with less entropy. Looking at the example of ice and water, we can also understand entropy in another (more quantitative) way. Fix the temperature T, in this case at 0°C. To freeze water into ice, you must put it in a freezer to extract the latent heat from the water. As a result, the final product (ice) becomes more orderly, with less entropy, so there must be a connection between the decrease of entropy ∆S at this fixed temperature T (and a fixed volume) and the heat that is extracted from the liquid water to turn it into ice. In fact this amount of heat is precisely T∆S, and this can be used as the definition of entropy: at a fixed temperature T, the decrease of entropy (∆S) is given by the amount of heat extracted from the substance divided by the temperature T.
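The definition ∆S = Q/T can be applied at once to the freezing example. Using the standard latent heat of fusion of ice (about 334 J per gram, a textbook value, not stated in this passage):

```python
# Entropy change when 1 g of water freezes (or ice melts) at 0 C = 273.15 K.
Q = 334.0        # J, latent heat of fusion per gram of ice (standard value)
T = 273.15       # K, the fixed transition temperature
dS = Q / T       # the text's definition: Delta S = (heat exchanged) / T
print(f"{dS:.2f} J/K per gram")   # ~1.22 J/K per gram
```

Freezing removes this much entropy per gram; melting adds it back, consistent with ice being the more orderly phase.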
If we ignore the influence of heat, as we usually do in studying mechanics, an object becomes stable when it settles down at the bottom of the potential well, where its potential energy U is minimal. When heat is taken into account, we must instead require the total energy of the system, viz., the sum of the potential energy U and the heat term −TS, to be minimal for thermodynamic stability. This sum F = U − TS is called the free energy; it replaces the mechanical energy U of the heat-free case.
This example of water and ice illustrates another important aspect of thermodynamics: a substance can have different phases. H2O at normal atmospheric pressure is in the steam phase above 100°C, in the water phase between 0°C and 100°C, and in the ice phase below 0°C. The temperatures (here 100°C and 0°C) at which a phase changes are called the transition temperatures. As a rule, the phase at a lower temperature is usually more orderly than the one at a higher temperature. Another way of saying the same thing is that the phase at a lower temperature usually has less symmetry (see Chapter 1) than the one at a higher temperature. This last statement may seem contradictory, but it is not. A drop of water is spherical, and it has spherical symmetry — symmetry under rotation by any amount about any direction. A piece of ice is a crystal, so we must confine the rotations to specific amounts about specific directions in order to move one atom to the position of another; hence its symmetry is smaller.
This phenomenon of phase transition occurs not only in water, but also in most other substances and systems. For example, the change from a normal metal to a superconductor at a low temperature is a phase transition (see Chapter 3). As the universe cooled down from its Big Bang beginning towards its present temperature of 2.7 K (see Chapter 10), various phase transitions are believed to have taken place, so that the universe we see today is not in the same phase as the universe that was in the very beginning; at the early epoch the universe was much more symmetrical. Phase transitions are important not only in superconductivity (Chapter 3) and in cosmology (Chapter 10), but also in the modern theory of elementary particle physics (Chapter 9).

C.2 Statistical Mechanics
At thermal equilibrium, each constituent carries the same amount of energy on average, but this does not mean that the energy of every constituent is identical at any given time — only the average over time is the same. To obtain the energy spread about the mean, or the energy distribution, we must employ the full apparatus of statistical mechanics, which is beyond the scope of this book. Nevertheless, it is not difficult to describe and to understand the outcome of these calculations.
Energy distribution depends on whether quantum effects (see Appendix B) are important or not. According to quantum mechanics, a particle in a given volume with energy lying within a specific range can occupy only one of a finite number of quantum states. How these quantum states may be occupied depends on what type of particles they are. Particles of this universe are of two types: they are either bosons or fermions. Two bosons of the same kind (e.g., two photons) have an additional tendency to occupy the same quantum state. It is this remarkable property that leads to the feasibility of a laser (see Chapter 2) and the remarkable phenomena of superfluidity and superconductivity (see Chapter 3). On the other hand, fermions of the same kind (e.g., two electrons) are absolutely forbidden from occupying the same quantum state. It is this special property that leads to the distinction between a conductor and an insulator, and prevents the collapse of a white dwarf or a neutron star (see Chapter 8).
If the total number of particles in the macroscopic system is small compared to the number of quantum states available, then it would be very unlikely for two particles to occupy the same quantum state anyway, in which case whether these particles are bosons or fermions is immaterial. The energy distribution then follows what is known as the Boltzmann (or Maxwell–Boltzmann) distribution. At a given (absolute) temperature T, the average energy per degree of freedom for a nonrelativistic particle is (1/2)kT, where k = 1.38 × 10⁻²³ J/K (Joules per Kelvin) is known as the Boltzmann constant. What this says is that the average energy per degree of freedom at T = 1 K is 0.69 × 10⁻²³ J; at T = 300 K it is 2.07 × 10⁻²¹ J. Sometimes units are chosen so that k = 1. In that case we simply identify each degree Kelvin as an energy unit of 1.38 × 10⁻²³ J, and the energy per degree of freedom becomes simply (1/2)T. Since one Joule is equal to 6.24 × 10¹⁸ eV (electron-Volt), each degree of temperature can also be thought of as carrying an energy of 8.617 × 10⁻⁵ eV, or 86 µeV (micro-electron-Volt, 10⁻⁶ eV). Thus at a room temperature of 27°C, or T = 300 K, the energy per degree of freedom will be (1/2)T = 150 K ≈ 0.013 eV.
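The unit conversions quoted above can be verified in a few lines of Python:

```python
k  = 1.38e-23    # Boltzmann constant, J/K
eV = 1.602e-19   # one electron-volt in Joules

# (1/2) kT per degree of freedom, at 1 K and at room temperature:
for T in (1.0, 300.0):
    E = 0.5 * k * T
    print(f"T = {T:5.0f} K: {E:.3e} J = {E / eV:.3e} eV")

# The "one kelvin ~ 86 micro-eV" conversion quoted in the text:
print(k / eV * 1e6)   # ~86 micro-eV per kelvin
```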
So far we have not explained what a 'degree of freedom' is. A point particle can move in any one of the three spatial directions and is counted to have three degrees of freedom; a rigid body of finite size can do so as well as rotate about each of three independent axes, so it has six degrees of freedom. The average energy per point particle is therefore (3/2)kT, and that for a rigid body is therefore 3kT.

Figure C.1: Boltzmann distribution, depicting the number of particles per unit momentum volume (dN/d³p) as a function of energy at a given temperature T. The average energy per point particle is (3/2)kT.
We have also not explained what a 'Boltzmann distribution' is. This distribution is shown in Fig. C.1, in which the number of particles per unit momentum volume is plotted as a function of the particle energy.¹ The average energy of each nonrelativistic point particle, as mentioned before, is (3/2)kT. The Boltzmann distribution of other objects is qualitatively similar.
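The (3/2)kT average can also be checked by simulation. For a free particle the Boltzmann distribution makes each velocity component Gaussian with variance kT/m; the sketch below works in illustrative units where k = m = T = 1, samples many particles, and recovers a mean kinetic energy near 3/2.

```python
import random

random.seed(0)

# Each velocity component is Gaussian with variance kT/m.
# In units where k = m = 1 and T = 1, the mean kinetic energy
# per point particle should come out near 3/2.
T, n = 1.0, 200_000
total = 0.0
for _ in range(n):
    vx, vy, vz = (random.gauss(0.0, T ** 0.5) for _ in range(3))
    total += 0.5 * (vx * vx + vy * vy + vz * vz)
print(total / n)   # close to 1.5 = (3/2)kT
```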
We have so far considered only the Boltzmann distribution, which is valid when the number of particles N in the system of volume V is so small that it is improbable for any two of them to occupy the same quantum state. If this is not so, then quantum effects become important and we must distinguish bosons from fermions.
In order to decide whether quantum effects are important or not, we must first ask: 'how small is small?' That depends on the temperature T of the system, because the number of quantum states N0 present depends on it. To see that, recall that the energy of the particle depends on the temperature. Energy is related to momentum, and according to quantum mechanics, momentum is related to the wavelength λ of the particle,² which measures the effective size of that particle. Thus, the effective volume λ³ occupied by one such particle is temperature-dependent, and so is the number of quantum states N0 = V/λ³.
Thus, whether quantum effects are important or not depends on whether N is large or small compared to N0, and this in turn depends on the temperature T of the macroscopic body. At high temperatures, the wavelength λ is small, the number of quantum states N0 is large compared to N, and quantum effects are relatively unimportant. At low temperatures, the opposite is true.
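The criterion N versus N0 amounts to comparing nλ³ (with n = N/V) to 1. As an illustration not taken from the text, the sketch below evaluates the thermal wavelength for conduction electrons in copper at room temperature, using the standard electron density n ≈ 8.5 × 10²⁸ m⁻³; the result nλ³ ≫ 1 shows why the electron gas in a metal is quantum (degenerate) even at 300 K.

```python
import math

h   = 6.626e-34   # Planck's constant, J*s
k   = 1.38e-23    # Boltzmann constant, J/K
m_e = 9.109e-31   # electron mass, kg

def thermal_wavelength(m, T):
    """Thermal de Broglie wavelength lambda = h / sqrt(2*pi*m*k*T),
    equivalent to (2*pi*hbar^2 / m*k*T)^(1/2)."""
    return h / math.sqrt(2 * math.pi * m * k * T)

n = 8.5e28                               # conduction electrons per m^3 in copper
lam = thermal_wavelength(m_e, 300.0)
print(n * lam**3)   # much greater than 1: quantum effects dominate
```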
The temperature below which quantum e ects become important is called the degeneracy temperature, and will be denoted by T0. The energy distribution below
¹The mathematical formula is dN/d³p = C exp(−ε/kT), where dN is the number of particles in the momentum volume d³p whose energy is ε. C is a normalization constant determined by the total number of particles in the system.
²The formula for λ is: λ = (2πℏ²/mkT)^{1/2}, where m is the mass of the particle. This formula may be understood as follows. The average energy of a particle at temperature T is of order kT, so its average momentum is of order (2mkT)^{1/2}, and its average wavelength according to quantum mechanics is of order the λ given above. The exact numerical coefficient appearing in the expression for λ can be obtained only through detailed calculations.