How quantum physics departs from classical mechanics.
Our Quantum Journey
By the turn of the twentieth century, scientists were pleased with themselves. After centuries of progress by the likes of Galileo and Isaac Newton, there was a strong belief that the most fundamental principles of nature were finally understood.
For example, the motion of all sorts of bodies, from falling apples to the planets orbiting our Sun, had been described – fairly accurately – by the laws of ‘classical physics’. Yet their confidence was short-lived, as it soon transpired that not all physical phenomena could be explained within this simplistic framework. Brand new physics and ways of thinking were the only way forward.
In this introductory tile, we will cover the most problematic phenomenon, which ultimately sparked the ‘Quantum Revolution’ in thinking as the true nature of light was laid bare. But before getting there, let us set the scene by exploring how scientists viewed the world up until everything changed.
Newton and the Color Spectrum
In the seventeenth century, there was widespread debate about the fundamental nature of light. In the 1660s, Isaac Newton obsessively studied it by conducting experiments with glass prisms and sunlight.
In the process, he scientifically established the ‘visible spectrum’ of a rainbow, demonstrating that everyday white light from the Sun is composed of a whole spread of colors, which he divided into seven. The light dispersed due to ‘refraction’, the redirection of a wave as it passes from one medium to another.
Light travels more slowly in denser materials, so the part of the wave which first hits the glass is slowed before the rest of the wave, which is still travelling at ‘normal’ speed through the air. This caused Newton’s light to bend or ‘refract’. Not that Newton understood this at the time, but since different colors of light have differing wavelengths, they travel at different speeds in glass and are refracted at ever so slightly different angles. The result? A dazzling spectrum!
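If you’d like to see the bending written down, it is usually captured today by two standard relationships from optics – introduced here as a supplementary sketch, since neither appears in the passage above. A medium’s refractive index n compares the speed of light in a vacuum, c, with its speed v inside the medium, and Snell’s law relates the angles on either side of the boundary:

n = c / v

n₁ sin θ₁ = n₂ sin θ₂

Because the refractive index of glass is slightly different for each wavelength, every color satisfies this relation at a slightly different angle – which is exactly the dispersion behind Newton’s spectrum.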
The Corpuscular Theory of Light
It’s everywhere and immeasurably important to life, so what exactly is light? Newton contemplated this and theorized in 1704 that light is a spray of tiny particles of negligible mass called ‘corpuscles’, each moving in a straight line.
His corpuscular theory of light attempted to explain light’s well-established properties of reflection and refraction. So why does light reflect, according to this theory? Using his trusty corpuscles, Newton equated the reflection of light to an elastic ball bouncing off a hard surface.
What about refraction? In Newton’s view, as corpuscles approach a refracting surface they become increasingly attracted to it and change direction, gaining speed as they enter the denser medium. Crucially, Newton’s theory was very wrong.
For starters, light actually moves slower in a denser medium. But that didn’t stop it from eclipsing the – slightly more correct – wave theory of light as the prevailing view of his time, in no small part due to his staggering reputation.
The Wave Theory of Light
In the fierce debate over the fundamental nature of light, Newton was an ardent advocate for it being particle-like. But his Dutch contemporary, Christiaan Huygens, had put forward an entirely different idea in 1678.
To Huygens, light was instead a wave-like disturbance in a mysterious, weightless, and all-encompassing substance known as the ‘aether’. It became very fashionable for scientists to believe that light propagated through and was mediated by this invisible aether.
In fact, the concept of an aether lasted until at least the late 1800s and consumed a significant amount of resources along the way as every search for its existence proved fruitless.
Huygens believed that this aether vibrated in the same direction as light waves, forming a wave itself as it carried the light onwards. He also incorrectly suggested that light was a ‘longitudinal wave’ just like sound waves. That is, a wave whose vibrations are back-and-forth along the direction of travel.
Huygens’ Principle
In 1678, Huygens proposed a model of light where each point on a wavefront becomes a new source of spherical ‘wavelets’ expanding in every direction. These secondary wavelets then combine or ‘interfere’ with each other to determine the form of the overall wavefront at any later time.
If the peak of one wavelet meets the lowest point or ‘trough’ of another, they cancel each other out in an example of ‘destructive interference’. Similarly, two peaks or troughs would sum up to form a larger wave via ‘constructive interference’.
This is ‘Huygens’ Principle’, and using it he was able to convincingly account for the laws of reflection and refraction, while also correctly predicting that light travels more slowly in a denser medium. Huygens’ model was critical in establishing a credible wave view of light rather than a particle one, but it still left a lot to be desired as a comprehensive theory…
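As a quick illustration of that interference arithmetic – using made-up amplitudes purely for the example – suppose two overlapping wavelets each have an amplitude of 1 unit:

peak meets peak: 1 + 1 = 2 (constructive interference)

peak meets trough: 1 + (−1) = 0 (destructive interference)

Depending on how the wavelets line up, the combined wavefront can therefore be anything from doubled in height to cancelled out entirely.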
Young’s Double-slit Experiment
Despite the success of Huygens’ Principle in explaining observations in a far easier-to-visualise way, Newton’s corpuscular theory had triumphed. It wasn’t until 1801 that English physicist Thomas Young conducted his famous ‘double-slit experiment’, which ultimately led to the general acceptance of light as a wave. In his experiment, he shone light through two narrow, closely spaced vertical slits and observed the ‘interference pattern’ on the screen beyond.
All waves spread out and undergo ‘diffraction’ when they encounter an obstacle or opening. However, the opening must be comparable in size to the ‘wavelength’ in question for this effect to be observed.
Since light has an extremely small wavelength, Young’s narrow slits worked like a charm in demonstrating the wave nature of light. Light coming through each slit diffracted and subsequently interfered to create a distinctive pattern of alternating bright and dark lines. Without the characteristic wave properties of diffraction and interference, the light would simply leave two lines on the screen.
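For a rough guide to where those bright lines appear – the geometry isn’t spelled out above, so the symbols here are illustrative – light from the two slits arrives in step wherever the difference in the path lengths is a whole number of wavelengths:

d sin θ = m λ, where m = 0, 1, 2, …

Here d is the separation between the slits, θ is the angle to a bright fringe, and λ is the wavelength of the light. Since λ for visible light is well under a micrometre, the slits must be extremely narrow and close together for the fringes to be spaced far enough apart to see – just as Young found.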
What is Electromagnetic Radiation?
In the 1860s, light was understood as a form of ‘electromagnetic radiation’ occupying a small window of possible wavelengths in a far broader ‘electromagnetic spectrum’. Visible light sits between UV rays, which have a shorter wavelength, and infrared rays which have a longer wavelength. Other well-known examples of this radiation are X-rays, microwaves, and radio waves.
In classical physics, electromagnetic waves are the result of coupled magnetic and electric fields oscillating perpendicularly to each other as well as the wave’s direction of travel. That is, they are ‘transverse’ waves, and not longitudinal as previously thought!
It was the esteemed Scottish physicist James Clerk Maxwell who championed this ground-breaking work, which he used to demonstrate that electric and magnetic phenomena are merely different manifestations of the same fundamental force of the universe, known as ‘electromagnetism’. He also showed, via his famous ‘Maxwell’s Equations’, that all electromagnetic waves travel through a vacuum at the ‘speed of light’ – approximately 3 × 10^8 metres per second.
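One handy consequence, offered here as a quick worked example rather than anything quoted from Maxwell himself: for any electromagnetic wave, the wavelength λ and frequency f are tied together by the wave speed,

c = f × λ

so green light with λ ≈ 5 × 10^-7 metres oscillates at roughly f ≈ (3 × 10^8) / (5 × 10^-7) ≈ 6 × 10^14 times per second.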
What is a Black Body?
All objects or ‘bodies’ both emit and absorb infrared rays, an invisible form of electromagnetic radiation which we feel as heat. The hotter a body, the more infrared radiation it emits. Yet no known object in our universe can perfectly emit or absorb radiation of every possible frequency across the electromagnetic spectrum. Some objects, such as stars, do come close to this ideal, however, and are therefore referred to as ‘black bodies’.
A black body is an idealized object which absorbs all electromagnetic radiation incident upon it – hence the name, since every color of light falling on it is absorbed. A black body doesn’t just absorb; it emits too. This ‘black-body radiation’ – which has a very distinctive pattern when plotted as an ‘emission spectrum’ – is something we can experimentally detect from distant stars. Unfortunately, this pattern didn’t fit with and couldn’t be adequately described by the laws of classical physics…
Black-body Radiation Curves
We can plot the approximate black-body radiation we observe from stars as a ‘black-body radiation curve’. These curves – which are temperature-dependent – tend to plot wavelength or frequency against a quantity called ‘spectral radiance’, which can simply be thought of as a measure of the amount or intensity of radiation emitted within each window of the electromagnetic spectrum. The hotter a black body is, the higher its peak and the further it shifts to the left towards shorter wavelengths, but the distinctive profile of the curve remains the same.
Interestingly, we can use these curves to deduce the temperature of a distant star using an equation known as ‘Wien’s Displacement Law’, which links the wavelength at the curve’s peak to the body’s temperature. What makes these curves so fiendish, however, is the fact that classical physics presented no way to explain them or derive them mathematically! It would require a completely novel set of ideas to set the record straight.
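To make Wien’s Displacement Law a little more concrete – the constant below is the standard measured value, not something given in the text – the wavelength at the curve’s peak and the body’s temperature are inversely related:

λ_peak = b / T, with b ≈ 2.9 × 10^-3 metre-kelvins

So a star whose curve peaks at about 500 nanometres (5 × 10^-7 metres), roughly like our Sun, has a surface temperature of about (2.9 × 10^-3) / (5 × 10^-7) ≈ 5,800 kelvin.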
The Ultraviolet Catastrophe
Still firmly in the classical era, the ‘Rayleigh-Jeans Law’ was conceived in the early twentieth century as an attempt to mathematically describe black-body radiation curves and match experimental findings.
Trouble is, this law agreed with experimental results at long wavelengths or low frequencies, but disagreed dramatically at short wavelengths or high frequencies. In fact, it was so hilariously wrong that it predicted that the energy output of an idealized black body would shoot up towards infinity at short wavelengths, in and beyond the ultraviolet region of the electromagnetic spectrum!
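For the curious, the law in question – quoted here in one common form purely for illustration – gives the spectral radiance as

B_λ(T) = 2 c k_B T / λ^4

where k_B is Boltzmann’s constant and T is the temperature. The trouble is plain to see: as the wavelength λ shrinks, the λ^4 in the denominator drives the predicted radiance towards infinity.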
The ludicrous implications of this law are why its failure was retrospectively dubbed the ‘ultraviolet catastrophe’. The reason for this huge error? The equation underpinning the law was derived through classical arguments and assumptions only. With this inescapable problem, the classical foundations began to crack and fall apart. If Newton and all the others were wrong, then what’s right? It was time for Max Planck to take a “quantum leap”.
Planck’s Quantum Explanation
The “father of quantum physics”, German physicist Max Planck came to the rescue in 1900 by figuring out how to correctly describe the behavior of black bodies mathematically, across all wavelengths. His outlandish concept? He assumed that electromagnetic radiation could only be absorbed or emitted in discrete packets – or ‘quanta’ – of energy.
He reasoned that the energy released by a black body originated from oscillations of its atoms, and that these oscillations – like those of a violin string – would have certain harmonic ‘modes’. Radiation is released when an atom switches between modes of oscillation, and so tends to have predictable wavelengths. At any given temperature, very small or large changes would be less common than those in the middle of the range, explaining why the distribution of emitted wavelengths forms the shape of the black-body radiation curve. Planck’s idea set the stage for the impending revolution, but it would require Albert Einstein to provide the next big step in ‘quantization’.
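To put Planck’s idea in symbols – a compact modern summary rather than his original derivation – each quantum of radiation at frequency f carries an energy

E = h × f, with h ≈ 6.63 × 10^-34 joule-seconds

and building the emission spectrum out of these discrete packets leads to Planck’s radiation law,

B_λ(T) = (2 h c^2 / λ^5) × 1 / (e^(hc / (λ k_B T)) − 1)

which matches the observed black-body curves at every wavelength, tapering off at short wavelengths exactly where the Rayleigh-Jeans Law blew up.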