Some Observations of Modern Physics
The Cycle of the Quanta
The reason we can observe anything at all is because ordinary
matter emits and absorbs energy in the form of radiation. It’s a commonplace
observation that if you heat a piece of metal the radiation (light) it gives off
will change with the temperature, for example, going from red-hot to white-hot
with increasing temperature. At the end of the 19th century some of my physicist
ancestors (O. Lummer, R. Pringsheim, H. Rubens, F. Kurlbaum, H. Beckman, et al.)
made close and careful observations of exactly how the color (energy
distribution) of the light changed with the temperature of a special kind of
emitting body known as a
black body (one that absorbs all light falling on it). At about the turn of the
century, Max Planck did a
careful analysis of these observations and came to the conclusion that the radiation
was emitted and absorbed by the black body in discrete quantities and not in the
continuous fashion everyone had been assuming. The scale of the discreteness
measured in this system was established by the size of a constant, now known as
Planck’s constant, h. This constant has a numerical value of 4.136 × 10⁻¹⁵
eV-seconds. [The eV is a unit of energy corresponding to the amount of energy
that an electron (the carrier of the smallest quantum of negative electrical
charge ever observed) acquires when accelerated through an electric potential
difference of 1 Volt. (A single AA battery produces a potential of 1.5 Volts.)]
A few years later (1905) Albert Einstein made a prediction based on an
extrapolation of Planck’s analysis that was to change the history of human
thought.
He asked: What if this quantization is not just in the emitting and absorbing
body but is actually in the radiation (light) itself? The Scottish physicist
James Clerk Maxwell had demonstrated, some 30 years earlier, that light is a
wave moving with a velocity, c, in the electromagnetic field produced by bodies
that were magnetized and/or carried electric charges, whether stationary or
moving. So Einstein’s question could equally well be phrased:
What if the electromagnetic field is quantized? Adopting this point of view,
light is a stream of quanta, now known as photons, and even though they have no
mass they carry an energy, E, related to the frequency of the light wave, f, by
the simple formula E = hf. There were some other careful observations floating
around (by experimentalists like Hertz and Tesla) showing that light affected
the electromagnetic fields in certain arrangements of conducting surfaces (like
spark-gaps). This effect is known as the photo-electric effect.
Einstein predicted that the photo-electric effect would result in the emission
of electrons from the surfaces and that the energy of these electrons would be
equal to the energy of the photons (minus some energy required to remove the
electron from the forces holding it to the surface, called the work function).
In quantitative terms and assuming a zero work-function, green light with a
frequency of 540 THz (a THz is a trillion Hz; a Hz is 1 wave-cycle/second),
corresponding to a wavelength of about 555 nm (a nm is a billionth of a meter),
would cause electrons with about 2.2 eV of energy to be ejected; ultraviolet
light of 310 nm would eject 4 eV electrons. These kinds of predictions were
verified in experiments done at the University of Chicago by Robert Millikan
(who also measured the charge of the electron) in 1915. This understanding of the
photo-electric effect forms the basis of today’s technology of solar energy.
It's also what makes your digital camera work.
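For readers who want to check the arithmetic, here is a minimal sketch in
Python of Einstein's relation (electron energy = hf minus the work function);
the zero work function and the particular frequencies are just the illustrative
assumptions used above.

```python
# Photon energy and ejected-electron energy via Einstein's relation E = hf - W.
# Assumes a zero work function W, as in the examples above (illustrative only).

H_EV_S = 4.136e-15   # Planck's constant in eV-seconds
C_M_S = 2.998e8      # speed of light in meters/second

def electron_energy_ev(frequency_hz: float, work_function_ev: float = 0.0) -> float:
    """Kinetic energy (eV) of a photo-ejected electron."""
    return H_EV_S * frequency_hz - work_function_ev

def frequency_from_wavelength(wavelength_m: float) -> float:
    """Convert wavelength (m) to frequency (Hz) using f = c / wavelength."""
    return C_M_S / wavelength_m

green = electron_energy_ev(540e12)                          # 540 THz green light
uv = electron_energy_ev(frequency_from_wavelength(310e-9))  # 310 nm ultraviolet

print(f"green (540 THz): {green:.1f} eV")  # ~2.2 eV
print(f"UV (310 nm):     {uv:.1f} eV")     # ~4.0 eV
```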
This cycle of observation-analysis-prediction and back to
observation led to the birth of quantum physics. It was a difficult birth and
some of the complications persist to this day. Most of the difficulty stems from the fact
that there are many excellent experiments demonstrating that light is a wave.
This gave rise to the famous wave-particle duality and the paradox that arises
when we classical beings ask: Is light a wave or a particle? The answer came in
the early 1920’s from Niels Bohr in Copenhagen
in the form of the concept of complementarity that said: it depends on how you
look at it. If you do an experiment to measure wave properties, light will
manifest itself as a wave; if you do an experiment to measure particle
properties, light will manifest itself as photons.
Photons and waves are complementary aspects of a single reality we call light.
Some people object to quantum theory
on the grounds that it does not provide a “picture” of reality independent of
our measurements. I don’t see why complementarity is any threat to realism.
Experiment after experiment has demonstrated the amazing accuracy of quantum
mechanics. It seems reasonable to me to stick to what has been observed and
accept that when we observers interact with the “real” world we can sometimes
only see complementary aspects of reality. The most likely reason for this is
that
uncertainty is built into the very core of the
material world.
The Role of Uncertainty
If you observe the light given off by vaporized substances
through a prism you find that the colors of the light are different for
different substances. If the light enters the prism through a slit, the prism
will project an image of the slit as a series of discrete lines dispersed
according to the various colors making up the light.
This projection is called a line spectrum and this type of observation is called
spectroscopy. This was a very active field of study throughout the latter part
of the 19th century. Analysis of these observations led to the
discovery of quite a bit of regularity in the spectra. So much so that Walther
Ritz developed in 1908 an empirical formula showing that the color (frequency)
of any spectral line could be expressed as the difference in two terms
characteristic of the emitting atoms. These terms involved discrete integer
numbers. J. J. Thomson had discovered the electron in 1897 and the current model
for the atom was like a watermelon with the negatively charged electrons located
like seeds inside the positively charged melon. This model of the atom offered
absolutely no way of explaining the mechanism behind the spectral lines, much
less the discreteness captured by the empirical laws of spectroscopy.
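Ritz's combination principle has a simple numerical consequence: since every
line frequency is a difference of two terms, some line frequencies must be sums
of others. A minimal sketch, assuming only the hydrogen term values
T_n = cR/n² (cR, the Rydberg frequency, is the one empirical constant used
here):

```python
# The Ritz combination principle: every line frequency is a difference of two
# "terms" T_n characteristic of the atom. For hydrogen the empirical terms are
# T_n = cR / n**2, with cR ~ 3.29e15 Hz (the Rydberg frequency).

RYDBERG_HZ = 3.29e15  # hydrogen term constant, cR, in Hz

def term(n: int) -> float:
    return RYDBERG_HZ / n**2

def line(n_upper: int, n_lower: int) -> float:
    """Frequency of a spectral line as a difference of two terms."""
    return term(n_lower) - term(n_upper)

# Combination: the 3->1 line is exactly the sum of the 3->2 and 2->1 lines.
print(line(3, 1))               # ~2.92e15 Hz
print(line(3, 2) + line(2, 1))  # the same number, as Ritz's principle requires
```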
Henri Becquerel’s discovery of natural radioactivity in 1896
led to the understanding that atoms could transform: they were not the
indivisible elements of matter; they had structure. Ernest Rutherford began a
series of observations using one of the types of radiation emitted by natural
uranium (that he called alpha rays) as a probe to look for that structure. Analysis of those
observations led to the conclusion in 1911 that the alpha rays were being
scattered by a small compact positively charged mass within the atom. Rutherford
had discovered the nucleus. Rutherford's new model of the atom
consisted of the positively charged nucleus containing almost all of the mass of
the atom surrounded by a system of electrons held in place by the attractive
force of the nucleus. But this model was in serious conflict with classical
electrodynamics: no stable configuration for electrons at rest could be found,
and if the electrons were in orbit around the nucleus they would radiate away
their energy in a continuous fashion as they spiraled into the nucleus.
The correction to the model was supplied by Niels Bohr in 1913 when he suggested
that the electron motion around the nucleus was governed by the same kind of
quantum behavior that Planck had used to explain black body radiation. He
suggested that the electrons moved in stationary orbits each with a definite
energy W. And when the electrons make a transition from a higher energy state W
1
to a lower energy state W
2 the energy difference would be radiated away in the
form of a photon with a frequency f given by hf=W
1-W
2 where h is Planck’s
constant. He also set the scale of the orbit sizes using discrete integer
multiples of Planck’s constant and then calculated the energy of the electrons
in these orbits as just that required to balance the forces of electrostatic
attraction to the centripetal motion (a classical concept.) This is known as the
Bohr Atom and its triumph was that the energy levels that Bohr predicted
(calculated) for hydrogen agreed exactly with the spectroscopic observations.
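A minimal sketch of the Bohr calculation for hydrogen, assuming only the
well-known ground-state energy of −13.6 eV; the 3→2 transition should reproduce
the observed red Balmer line:

```python
# Bohr's hydrogen atom: energy levels E_n = -13.6 eV / n**2, and photon
# frequencies from hf = W1 - W2. The 3->2 transition should give the red
# Balmer line near 656 nm.

H_EV_S = 4.136e-15    # Planck's constant, eV-seconds
C_M_S = 2.998e8       # speed of light, m/s
E_GROUND_EV = -13.6   # hydrogen ground-state energy, eV

def level(n: int) -> float:
    return E_GROUND_EV / n**2

def photon_wavelength_m(n_upper: int, n_lower: int) -> float:
    f = (level(n_upper) - level(n_lower)) / H_EV_S  # hf = W1 - W2
    return C_M_S / f

print(photon_wavelength_m(3, 2) * 1e9)  # ~656 nm, the observed H-alpha line
```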
This semi-classical model of the electrons orbiting the
nucleus in the same way as planets do around the sun is no longer the accepted
atomic model. In 1924 Louis de Broglie hypothesized in his doctoral dissertation
that electrons have wavelike properties just as photons do. This was confirmed
in the Davisson-Germer experiments in 1927. So, current atomic models treat the
electrons in orbits not as point particles but as clouds of probability, with
the most probable densities occurring at the positions of shells with discrete
energies.
In 1925 Werner Heisenberg was working on the problem of
calculating the atomic energy levels. He was using a classical mathematical
formalism (Fourier analysis) that involves pairs of “conjugate” variables like
frequency (energy) and time to analyze the atomic spectra. He realized that
adapting this formalism to the quantum nature of the atom required that these
conjugate variables be
non-commuting.
This is a strange mathematical property that says that if you multiply one
variable by another, the answer depends on the order of multiplication. He
specified that the degree of “non-commutability” of the conjugate variables
required to agree with the observations was a multiple of Planck’s constant
[(Et − tE) = ih/2π]. A year later he realized that any two
observables that do
not commute cannot be simultaneously measured. This realization introduced into
human thought the famous Heisenberg Uncertainty Principle.
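Before looking at what the principle states, it may help to see non-commutation
concretely. A minimal sketch with 2×2 matrices (chosen arbitrarily for
illustration; they are not Heisenberg's actual energy and time variables):

```python
# Matrix multiplication generally depends on order: AB != BA.
# These particular 2x2 matrices are just an illustration of non-commutation,
# not Heisenberg's actual conjugate variables.

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[0, 1],
     [1, 0]]
B = [[1, 0],
     [0, -1]]

print(matmul(A, B))  # [[0, -1], [1, 0]]
print(matmul(B, A))  # [[0, 1], [-1, 0]] -- different: A and B do not commute
```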
This principle states that if an observer is making a
measurement of, for example, the radioactive decay of an excited nucleus
undergoing a chain of decays, the
precision of the measurement of the time at which a particular decay occurs (the decay
life-time),
∆T, results in a corresponding uncertainty in the measurement of the energy released in that decay,
∆E, and vice-versa. The principle states that the product of
∆T∆E can never be
smaller than a certain fraction of the Planck constant, h (h/4π). In the system of observer-measurement-nature, where does the
uncertainty come from? The usual answer is that it comes about because of the
disturbance in nature caused by the measurement. But that cannot be the whole
story: as the next example shows, the uncertainty principle operates in nature
whether or not anyone is measuring.
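To put a number on the principle itself: a minimal sketch of the bound
ΔEΔT ≥ h/4π, with a one-picosecond lifetime assumed purely for illustration:

```python
# Heisenberg bound: dE * dT >= h / (4*pi). Given a measured lifetime dT,
# the energy released carries an irreducible spread of at least h/(4*pi*dT).
import math

H_EV_S = 4.136e-15  # Planck's constant, eV-seconds

def min_energy_spread_ev(lifetime_s: float) -> float:
    return H_EV_S / (4 * math.pi * lifetime_s)

# An excited state living one picosecond (illustrative number):
print(min_energy_spread_ev(1e-12))  # ~3.3e-4 eV of unavoidable energy width
```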
The uncertainty principle allows a particle of energy E₀ trapped behind an
energy barrier of height V (E₀ < V) to have a finite probability of tunneling
through that barrier if the energy fluctuations permitted by the uncertainty
principle within ΔT, the time required to traverse the barrier, are sufficient
to overcome the barrier. That is, the particle can tunnel through the barrier
when [E₀ + h/(4πΔT)] ≥ V. This phenomenon,
known as quantum tunneling, is involved in the radioactive (alpha) decay of nuclei and
is responsible for the defeat of electrostatic repulsion that allows nuclear
fusion to take place in the stars. Except for tidal energy, essentially all the
energy on the earth comes from radioactive decay in its interior (seismic
activity, continental drift, volcanoes, geothermal energy, etc.) or the
radiation from solar fusion falling on its surface (weather, food, fossil
fuels, etc.). So,
the uncertainty principle is not simply the result of measurement; it is
essential to the creation of the energy of the stars and of most of the energy
on the earth. Without the operation of the uncertainty principle the sun would
never have shone and we would not exist.
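Taking the heuristic criterion [E₀ + h/(4πΔT)] ≥ V above at face value, it can
be rearranged to give the longest traversal time that still permits tunneling;
a minimal sketch, with the 10 MeV barrier deficit an illustrative number rather
than a real nuclear calculation:

```python
# Rearranging the heuristic tunneling criterion E0 + h/(4*pi*dT) >= V gives
# dT <= h / (4*pi*(V - E0)): the bigger the energy deficit, the shorter the
# time window the uncertainty principle allows for crossing the barrier.
import math

H_EV_S = 4.136e-15  # Planck's constant, eV-seconds

def max_traversal_time_s(deficit_ev: float) -> float:
    return H_EV_S / (4 * math.pi * deficit_ev)

# A 10 MeV deficit, roughly the scale of nuclear barriers (illustrative):
print(max_traversal_time_s(10e6))  # ~3.3e-23 seconds
```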
Zero-point Energy
But what about matter? Does uncertainty play a role in the creation of matter?
The current state of our knowledge tells us that there are four fundamental
forces at work in the material universe: gravity, responsible for organizing
matter into the large scale structure of the universe; electro-magnetism,
responsible for light and its interaction with matter at the quantum scale
(resulting in electricity and magnetism at macroscopic scales); the weak nuclear
force, responsible for radioactive decay and the transmutation of one form of
matter into another; and, the strong nuclear force, responsible for holding the
nucleus together and thus allowing for matter to be stable. As we have seen,
electromagnetic forces are carried by radiation (light) that manifests itself
either as waves in the underlying electromagnetic field or as quantized
particles of the field (photons). Because the photons have no mass the
electromagnetic fields extend throughout all of spacetime. Most of the
theoretical work done in support of the observations made in nuclear and
particle physics since the 1930's has involved extending these ideas to the
other force fields using techniques generally known as quantum field theory. We
will return to these other fields later but for now we concentrate on the
electromagnetic force fields that fill up spacetime. The theory of these fields
is known as QED (Quantum ElectroDynamics). The uncertainty principle
says that virtual particles having a complementary wave frequency f can exist
with a virtual energy E = (1/2)hf at every point in the field. Since energy is
conserved in the field these virtual particles exist as particle-antiparticle
pairs so that the total energy remains zero at every point in the field. This
virtual energy is known as the zero-point energy and because it is summed over
all virtual particle frequencies allowable by QED it can be enormous. Can this enormous
virtual energy density make itself felt in the real world?
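Before turning to that question, the word "enormous" can be made concrete.
Summing (1/2)hf over the electromagnetic modes in a unit volume (mode density
8πf²/c³) gives an energy density u = πhF⁴/c³ that grows as the fourth power of
the cutoff frequency F; the cutoffs in this sketch are arbitrary assumptions:

```python
# Zero-point energy density of the electromagnetic field: each mode of
# frequency f contributes (1/2)*h*f, and the mode density per unit volume is
# 8*pi*f**2/c**3. Integrating up to a cutoff F gives u = pi*h*F**4/c**3,
# which grows without bound as the cutoff is raised. The cutoffs below are
# arbitrary illustrations.
import math

H_J_S = 6.626e-34   # Planck's constant, joule-seconds
C_M_S = 2.998e8     # speed of light, m/s

def zero_point_density_j_m3(cutoff_hz: float) -> float:
    return math.pi * H_J_S * cutoff_hz**4 / C_M_S**3

for cutoff in (1e15, 1e18, 1e20):  # optical, X-ray, gamma-ray scales
    print(f"cutoff {cutoff:.0e} Hz -> {zero_point_density_j_m3(cutoff):.1e} J/m^3")
```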
In 1947 Willis Lamb and Robert Retherford returned to the study of atomic energy
levels looking for fine structure in the levels using the recently developed
tools of microwave spectroscopy. They discovered a very small shift in the
energy levels of the hydrogen atom that was unobservable with optical
spectroscopy and that had no explanation in the currently accepted versions of
quantum mechanics. Hans Bethe calculated the value of this shift (now
known as the Lamb shift) on the basis of a suggestion that the shift was due to
the interaction of the electron in the atom with the zero-point energy of the
electromagnetic field. This calculation provided a physical interpretation to
the process of "re-normalization" that is required to make QED a useful theory.
QED is the success story of 20th-century physics: it has been confirmed by
experiment to accuracies in the 12th decimal place.
The virtual zero-point energy of the electromagnetic field, making itself felt
through the Lamb shift, showed theoretical physicists how to connect QED with
the real world.
Recent beautifully precise measurements (Steve Lamoreaux, U. Mohideen, et al.)
of the force between the metal surfaces of an uncharged
capacitor in a vacuum have proven that there is a force (the Casimir force)
that can be quantitatively
explained by the action of a quantum electromagnetic field produced by virtual particles
in the space between the surfaces. (There
is some controversy about this interpretation since the results can also be explained as
due to effects in the metal surfaces themselves.) Earlier measurements (R. Koch, et al.) of the
electrical noise present in superconducting junctions revealed effects due to virtual particles
(zero-point energy) in electromagnetic fields. And there is the obvious fact
that helium remains liquid even at a temperature of absolute zero: even in the
absence of all thermal motion the helium atoms remain in the liquid state
because of zero-point energy.
(This can be suppressed by high mechanical pressure: helium becomes a solid near
absolute zero temperature at a pressure of about 25 atmospheres.) So, the answer
is yes:
the virtual zero-point energy of
the electromagnetic field that arises from the uncertainty principle has
observable effects in the real world.
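For ideal perfectly conducting parallel plates the Casimir force has a simple
closed form, P = π²ħc/(240d⁴); a minimal sketch evaluating it at the
micron-scale separations used in the experiments above (the idealized formula
ignores the real-metal corrections behind the controversy just mentioned):

```python
# Ideal Casimir pressure between two perfectly conducting parallel plates:
# P = pi**2 * hbar * c / (240 * d**4). Real experiments use metal surfaces
# (and often a sphere-plate geometry), so this is only the idealized formula.
import math

HBAR_J_S = 1.0546e-34  # reduced Planck constant, joule-seconds
C_M_S = 2.998e8        # speed of light, m/s

def casimir_pressure_pa(separation_m: float) -> float:
    return math.pi**2 * HBAR_J_S * C_M_S / (240 * separation_m**4)

print(casimir_pressure_pa(1e-6))  # ~1.3e-3 Pa at 1 micron separation
```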
Considerable speculation has arisen about zero-point energy. Stephen Hawking has
speculated that the fabric of spacetime is so distorted near the edge of a black
hole that a virtual particle-antiparticle pair can be torn out of the zero-point
energy of the gravitational field, one partner emerging into the real world with
positive energy while the other falls into the black hole with negative energy.
The net result is that, while the total energy is conserved, the positive-energy
particles escape as what is called Hawking radiation, and the negative-energy
particles reduce the mass (gravity) of the black hole, eventually causing it to
evaporate altogether. Hawking radiation has not yet been observed. There has
even been speculation (A. Rueda, B. Haisch, et al.)
that a massless charged body accelerating through the electromagnetic zero-point
energy field will acquire inertial mass, thus explaining Newton's second law.
They also speculate that a small portion of the zero-point energy density of the
gravitational field may be
gravitationally active, thus explaining the dark energy that is apparently
accelerating the expansion of the universe. The massless neutrino may acquire
mass as it accelerates through the zero-point energy of the weak nuclear force
field. These ideas are way out in front of observation but they do illustrate
that
zero-point energy may be involved in some of the most fundamental aspects
of the material world.
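One way to see why Hawking radiation remains unobserved is its predicted
temperature, T = ħc³/(8πGMk_B); a minimal sketch for a black hole of one solar
mass:

```python
# Hawking temperature of a black hole: T = hbar*c**3 / (8*pi*G*M*kB).
# For a solar-mass black hole this is tens of nanokelvin -- utterly swamped
# by the 2.7 K cosmic microwave background, hence unobserved.
import math

HBAR = 1.0546e-34  # J*s
C = 2.998e8        # m/s
G = 6.674e-11      # m^3 kg^-1 s^-2
KB = 1.381e-23     # J/K
M_SUN = 1.989e30   # kg

def hawking_temperature_k(mass_kg: float) -> float:
    return HBAR * C**3 / (8 * math.pi * G * mass_kg * KB)

print(hawking_temperature_k(M_SUN))  # ~6e-8 K
```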
The Relativity of Space and Time
Maxwell’s understanding that light is an electro-magnetic
wave moving at a fixed velocity, c, gave rise to notions of a “luminiferous
aether” that supported the light waves and provided a reference frame for the
value of c. He even suggested experiments to prove the aether’s existence.
Any number of experiments have been carried out looking for the aether and it
has never been observed. For Albert Einstein the most crucial of these
experiments was the one carried out in 1887 by A. A. Michelson and E. W. Morley
working at what is now Case Western Reserve University in Cleveland, Ohio.
Michelson and Morley made very precise measurements of the speed of light in
their laboratory as the earth moved in different directions as it orbited the
sun. The precision of their measurement was good enough to see the effects of
the earth's motion of about 19 miles/second around the sun on the 186,000
miles/second speed of light. They saw no difference in the speed of light in
their laboratory as it moved at 19 miles/second in one direction and then in the
opposite direction 6 months later.
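The precision required is easy to estimate: the aether effect being sought is
second order in v/c. A minimal sketch of that estimate, using the speeds quoted
above:

```python
# The aether effect Michelson and Morley were hunting is second order in v/c:
# a fractional change of order (v/c)**2 in light travel times along and across
# the earth's motion.

V_EARTH_MI_S = 19.0  # earth's orbital speed, miles/second (from the text)
C_MI_S = 186000.0    # speed of light, miles/second

ratio = V_EARTH_MI_S / C_MI_S
print(ratio)     # ~1.0e-4
print(ratio**2)  # ~1.0e-8 -- the fractional precision their interferometer needed
```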
The observed fact, as counter-intuitive as it might seem, is that
the velocity of light does not depend on the motion of its source or of the
observer measuring it.
Einstein took this observation at face value and combined it with another idea
that the laws and principles of physics should have the same form in all
reference systems moving at constant velocities with respect to each other
(inertial frames.) These two ideas form the basis for Einstein's Special Theory
of Relativity. It essentially says that if the speed of light, c, is the same in
all inertial frames and speed is distance divided by time then the rulers used
to measure distance and the clocks used to measure time must both be changing
with the velocities of the frames in such a way as to keep c constant and the
laws of physics the same. When
Einstein related the energy of a body to its momentum using these relativistic
forms of distance and time he came upon the equivalence of inertial (rest) mass and
latent energy
expressed in the famous equation E = mc². This behavior was predicted in
Einstein's 1905 paper and was experimentally verified in the Cockcroft-Walton
experiments in 1932. Today, energy is routinely converted into mass in all of
the world's particle accelerators. Mass is routinely converted into energy in
today's nuclear reactors. It is at work as well in nature, of course, in
radioactive decay and the fusion reactions powering the stars. The equation
E = mc² has been verified time and again to amazing accuracy. The
relativistic corrections to the clocks moving in our satellites are essential to
the accuracy of our GPS systems.
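Those GPS corrections can be estimated from first principles; a minimal sketch
combining the special-relativistic slowing of the moving satellite clock with
the general-relativistic speed-up of a clock higher in the earth's gravity
(standard GPS orbit values assumed; the gravitational term anticipates the
General Theory discussed below):

```python
# Daily clock offsets for a GPS satellite relative to a ground clock.
# Special relativity: a clock moving at speed v runs slow by ~v**2/(2*c**2).
# General relativity: a clock higher in earth's gravity runs fast by
# dPhi/c**2, where Phi = -GM/r. Standard GPS orbit values assumed below.
C = 2.998e8          # m/s
GM_EARTH = 3.986e14  # m^3/s^2
R_EARTH = 6.371e6    # m, ground clock radius
R_GPS = 2.656e7      # m, GPS orbital radius
V_GPS = 3.874e3      # m/s, GPS orbital speed
DAY_S = 86400.0

sr_slow = V_GPS**2 / (2 * C**2) * DAY_S                    # ~7 microseconds slow
gr_fast = GM_EARTH * (1/R_EARTH - 1/R_GPS) / C**2 * DAY_S  # ~46 microseconds fast

print(f"special relativity: -{sr_slow*1e6:.1f} us/day")
print(f"general relativity: +{gr_fast*1e6:.1f} us/day")
print(f"net:                +{(gr_fast - sr_slow)*1e6:.1f} us/day")
```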
Special Relativity is an observed fact. It
tells us that spacetime is not an absolute - it is
defined relative to material bodies.
Einstein called this the Special Theory of Relativity since it applied only to
the special case of collections of material bodies either at rest or moving at
constant velocities with respect to each other (inertial reference frames.) He
attempted to adapt this theory to cases where the reference frames were
undergoing non-uniform (accelerated) motion, such as rotation. This is called
the General Theory of Relativity. One of the outcomes of this adaptation was
that the gravitational forces acting on material bodies could be interpreted as
the effect of curvature in the spacetime that the
bodies are moving through. The curvature in spacetime is produced by the
presence of the bodies - the more massive the body, the greater the curvature.
The bodies curve spacetime and the curved spacetime determines how the bodies
move. The curvature of spacetime predicted by the General Theory of Relativity
has been experimentally verified with impressive accuracy, for example, in
measurements of its effects on the orbit of the planet Mercury and its effect on
the bending of light by celestial bodies (gravitational lensing.)
It is an observed fact that
spacetime is curved by the presence of material
bodies.
The Universe is Expanding
Unifying Gravity and the Quantum World
The Fine-Tuning of our Universe
(To be continued.)