I Finally Get Quantum Physics!

Matt Williams
9 min read · Sep 18, 2022

Not long ago, I achieved a personal milestone by learning how to grasp (and, more importantly, explain) Einstein's Theory of Relativity. Achieving this goal was invaluable to my work as a science communicator because Relativity is a cornerstone upon which our understanding of the Universe is based. And Einstein himself is believed to have said, "If you can't explain it to a six-year-old, you don't understand it yourself."

I took those words and applied them to the theory that made Einstein a household name and is still being validated over a century later. It took a few years and a LOT of investigating, but I finally got to the point where it all clicked in my head, and I felt like I got it. I then tested it out in a series of articles designed to see if I could relate it to others in a step-by-step fashion, and the feedback I got indicated success.

Well… as they say, ROUND TWo! You see, in addition to Relativity, there's another all-important field that is foundational to our scientific understanding. And that is none other than Quantum Physics! Einstein's Theory of General Relativity (GR), the full and complete version of the theory, which he finalized in 1916, explains how the Universe behaves on the macro scale: as a four-dimensional reality in which mass and energy curve spacetime, and that curvature is what we experience as gravity.

Quantum Physics, a field that was emerging in Einstein's time (and which he played a role in establishing), examines the Universe on the smallest of scales, in terms of subatomic particles and the fundamental forces of nature. This particular revolution in the sciences began in the 19th century with the discovery of electromagnetic phenomena and breakthroughs in atomic modelling.

Atoms to Quanta

The notion that all matter is made up of tiny, indivisible particles can be traced back to Classical Antiquity, with philosophers from the Near East, Central Asia, India, and Greece offering their own accounts of the possibility. However, these theories were largely philosophical or metaphysical in nature and not grounded in empirical physics. By the early 19th century, the study of matter at the smallest level became the subject of scientific investigation.

It was during this time that English chemist and physicist John Dalton proposed his theory of "atoms" (after the Greek word atomos for "indivisible"). This theory was an attempt to explain the Conservation of Mass and the law of definite proportions — two fundamental laws of chemistry. In short, Dalton claimed that all matter was made of tiny atoms, each a "solid, massy, hard, impenetrable, movable particle."

By the end of the 19th century, scientists discovered that Dalton's atoms were not the fundamental particles of nature and were, in fact, composed of even smaller particles. This included the electron, identified in 1897 by J.J. Thomson (building on decades of cathode-ray experiments), which was shown to be the negatively charged constituent of atoms. Since atoms are electrically neutral, this implied that there was a positive particle yet to be discovered.

Subsequent experiments in the early 20th century revealed that atoms are mostly made up of empty space (the "Gold Foil Experiment"), giving rise to new models that showed the atom to be composed of a nucleus and orbiting electrons. This led to the discovery of the positive particle (the proton), the realization that an element's atomic number equals the charge of its nucleus, and (later) the discovery of the neutral particle (the neutron).

Other experiments showed that atoms emit particles and radiation as a result of decay, and that metals emit electrons when exposed to light (the "photoelectric effect"). This became part of a growing debate about the behavior of light (see the Double Slit Experiment below). In 1900, famed German physicist Max Planck proposed a revolutionary idea to explain blackbody radiation, claiming that energy is not released in continuous waves but in discrete packets known as quanta.

In 1905, Einstein built on this by proposing that light itself is made up of localized, discrete wave-packets, which he called light quanta (later named photons), and used this idea to explain the photoelectric effect. That same year, he proposed his Special Theory of Relativity, which (among other things) demonstrated that mass and energy are different expressions of the same phenomenon (aka. mass-energy equivalence). The relationship between a photon's energy E and its frequency f is given by E = hf, where h is the Planck constant; its momentum is p = h/λ, where λ is its wavelength, even though the photon is massless.
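Planck's relation can be put in numbers. This is a minimal illustrative sketch (not from the original article): the constants are the SI-defined values, and the 500 nm wavelength is just an example of visible green light.

```python
import math

H = 6.62607015e-34  # Planck constant, J*s (exact by SI definition)
C = 299_792_458     # speed of light in vacuum, m/s (exact by SI definition)

def photon_energy(wavelength):
    """E = h*f = h*c/lambda: a photon's energy depends only on its frequency."""
    return H * C / wavelength

def photon_momentum(wavelength):
    """p = h/lambda: a photon carries momentum despite having no rest mass."""
    return H / wavelength

# A 500 nm (green) photon:
lam = 500e-9
print(photon_energy(lam))    # ~3.97e-19 J (about 2.48 eV)
print(photon_momentum(lam))  # ~1.33e-27 kg*m/s
```

Note how small these numbers are: this is why the granularity of light is invisible in everyday life and only shows up in experiments like the photoelectric effect.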

As the 20th century progressed, scientists continued to advance the study of subatomic particles by probing the interiors of atoms, leading to profound (and sometimes frightening) discoveries!

Particle or Wave?

By the 1920s, the "photon" (from the Greek phôs, for "light") was identified as the unit of light — a massless, chargeless particle that behaved as both a particle and a wave. But even before the photon was discovered, scientists had been locked in a debate about the nature of light. Much like its measured velocity (which confounded the notion of Galilean Relativity), there were unresolved questions about its behavior.

For centuries, scientists conducted experiments where they would shine a light toward a wall and place a barrier with an aperture in between. They would then observe the pattern projected on the far wall, which always appeared as an interference pattern. For example, if the aperture was circular, the pattern would look like concentric circles that got fainter farther away from the center. If the aperture was a vertical slit, the interference pattern would look like vertical lines that also got fainter farther from the center.

Then you have the Double Slit Experiment, first performed by British scientist Thomas Young in 1802. This version involved shining a light through two vertical slits, and the results were consistent with wave behavior. On the far wall, Young noted two strong lines with fainter lines on either side of them, which overlapped with each other in between. This experiment truly demonstrated how light behaves as a wave.
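The pattern Young observed can be sketched with the textbook two-slit interference formula. A minimal sketch, assuming ideal point slits; the slit separation and wavelength below are purely illustrative:

```python
import math

def double_slit_intensity(theta, d, wavelength):
    """Ideal two-slit interference: relative intensity I/I0 = cos^2(pi*d*sin(theta)/lambda)
    at viewing angle theta, for slit separation d and light of the given wavelength."""
    phase = math.pi * d * math.sin(theta) / wavelength
    return math.cos(phase) ** 2

# Illustrative values: 532 nm green light, slits 50 micrometers apart
lam = 532e-9
d = 50e-6

# Bright fringes occur where d*sin(theta) = m*lambda for integer m;
# dark fringes fall halfway between, at d*sin(theta) = (m + 1/2)*lambda.
theta_bright = math.asin(1.0 * lam / d)  # first-order maximum
theta_dark = math.asin(0.5 * lam / d)    # first minimum

print(double_slit_intensity(0.0, d, lam))          # central maximum: 1.0
print(double_slit_intensity(theta_bright, d, lam)) # bright fringe: ~1.0
print(double_slit_intensity(theta_dark, d, lam))   # dark fringe: ~0.0
```

The alternating bright and dark bands this formula produces are exactly the "strong lines with fainter lines on either side" that Young recorded, and they only make sense if light is a wave that can interfere with itself.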

However, when scientists attempted the same experiment with electrons, things got weird! These experiments began in the 1920s and involved firing a beam of electrons at the slits and recording the pattern on a detector screen. When no attempt was made to track individual electrons, the results were the same as when the experiment was conducted with a light source (an interference pattern). But when scientists placed a detector at the slits to observe which slit each electron passed through, the results were entirely different.

This time around, the resulting pattern was two vertical bands. In other words, the electrons were behaving like particles, not waves! When the slit detector was removed, the interference pattern returned. But when the scientists examined the screen up close, they noted that the pattern appeared to be composed of individual particle-like impacts. This introduced a staple of quantum physics, which is that subatomic particles can behave as both a particle and a wave (aka. wave-particle duality).

The difference comes down to observation, where the presence of an observer causes the waveform to collapse to a single particle. Erwin Schrodinger summed this up nicely with his thought experiment — Schrodinger's Cat — which he proposed in 1935. According to Schrodinger, a cat is placed in a sealed box with a vial of poison that will be broken if a single radioactive atom decays. The only way to know if the cat is alive is to open the box. Until that happens, the cat exists in a superposition of states where it is both alive and dead. By opening the box, the observer collapses this wave function and creates a single outcome.

Uncertainty & Spooky Action!

Then you have Heisenberg's Uncertainty Principle, which was put forth by Werner Heisenberg in 1927. Based on ongoing experiments with subatomic particles, Heisenberg asserted that there was a fundamental limit to the accuracy with which certain pairs of physical properties (such as position and momentum) could be known simultaneously. In essence, the more precisely one knows the momentum of a particle, the less precisely one can know its position (and vice versa).

The same holds true for energy and time, in that one cannot precisely measure the energy of a system over an arbitrarily short interval of time (and vice versa). Such pairs (position and momentum, energy and time) are known as conjugate variables. This contradicted classical Newtonian physics, which held that every property in a system could be measured with certainty given enough time and the right means. While this all seemed counter-intuitive, such ideas were fast becoming the foundation of quantum physics!
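The position-momentum bound can be put in numbers too. A minimal sketch, assuming the standard form of the principle, Δx·Δp ≥ ħ/2; the electron-in-an-atom scale below is just an illustrative example:

```python
HBAR = 1.054571817e-34  # reduced Planck constant (h/2*pi), J*s

def min_momentum_uncertainty(delta_x):
    """Heisenberg bound: delta_x * delta_p >= hbar/2,
    so the smallest possible momentum spread is hbar / (2 * delta_x)."""
    return HBAR / (2 * delta_x)

# An electron confined to roughly an atom's width (~1e-10 m)
dx = 1e-10
dp = min_momentum_uncertainty(dx)

M_E = 9.1093837e-31  # electron mass, kg
v_spread = dp / M_E  # corresponding (non-relativistic) velocity spread

print(f"delta_p >= {dp:.3e} kg*m/s")        # ~5.3e-25 kg*m/s
print(f"velocity spread ~ {v_spread:.3e} m/s")  # hundreds of km/s
```

Pinning an electron down to atomic size forces a velocity spread of hundreds of kilometers per second, which is one intuitive reason electrons in atoms cannot simply sit still on the nucleus.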

Even more confounding was "quantum entanglement," an observed phenomenon where the physical properties of particles (e.g., position, momentum, spin, and polarization) remain correlated even when the particles are separated by vast distances. In these cases, the quantum state of each particle in an entangled group cannot be described independently of the others, and measuring the state of one causes the wave function of the entire group to collapse.
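The perfect correlation described above can be mimicked with a classical toy model. To be clear, this sketch (my own illustration, not anything from the experiments) reproduces only the anti-correlation of a spin-singlet pair; it cannot reproduce the stronger quantum correlations that violate Bell inequalities, which is precisely what rules out classical models like this one.

```python
import random

def measure_entangled_pair():
    """Toy model of a spin-singlet pair: each measurement outcome is random,
    but the two particles always give opposite results."""
    a = random.choice(["up", "down"])
    b = "down" if a == "up" else "up"  # the partner is fixed the instant a is known
    return a, b

# Measuring one particle immediately tells you the other's state,
# no matter how far apart the pair has been separated.
results = [measure_entangled_pair() for _ in range(1000)]
print(all(a != b for a, b in results))  # True: perfectly anti-correlated
```

In this classical version, the outcomes were secretly fixed all along (a "hidden variable"); the surprise of quantum mechanics is that experiments show no such pre-existing values can account for what is actually measured.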


This appeared to violate both the principle of locality (the idea that objects are influenced only by their immediate surroundings) and the rule that information cannot be transmitted faster than the speed of light (a cornerstone of Special Relativity). These issues were first described by Albert Einstein, Boris Podolsky, and Nathan Rosen in a joint paper published in 1935, where they argued that:

“A sufficient condition for the reality of a physical quantity is the possibility of predicting it with certainty, without disturbing the system. In quantum mechanics in the case of two physical quantities described by non-commuting operators, the knowledge of one precludes the knowledge of the other.”

This left only two possibilities: (1) the description of reality given by the wave function in quantum mechanics is not complete, or (2) these two quantities cannot have simultaneous reality. However, the fact that scientists could make accurate predictions about one quantum system based on measurements of a system it had previously interacted with meant that if (1) was false, then (2) was also false. This came to be known as the Einstein-Podolsky-Rosen (EPR) Paradox.

The violation of locality was especially disquieting for Einstein, who described the idea of entanglement as “spooky action at a distance.” Suffice it to say, he and other scientists (like Schrodinger) didn’t like the prospects of the emerging field of science (quantum mechanics) they helped create. In 1935, Schrodinger provided the name for this phenomenon and summarized it beautifully:

“I would not call [entanglement] one but rather the characteristic trait of quantum mechanics, the one that enforces its entire departure from classical lines of thought.”

How Does it All Work?

The field of quantum mechanics gave rise to several proposed resolutions that attempted to reconcile how multiple possibilities could exist simultaneously with the idea of a knowable Universe. One proposal is the Many-Worlds Interpretation (popularly framed as a multiverse), which states that the wave function of a particle is essentially a range of possibilities reflecting the branching nature of the Universe. While the presence of an observer causes the wave function to collapse and produce a single outcome in our branch, every other outcome takes place in another Universe.


While a great deal of uncertainty and “spooky action” remains at the heart of quantum physics, decades of subatomic research have led to a clearer picture of how it all fits together. In short, scientists have confirmed that particle physics comes down to three fundamental forces (and their corresponding particles). They are:

  1. Electromagnetism: Responsible for interactions between electrically charged particles and electric/magnetic fields (mediated by photons)
  2. Weak Nuclear: Responsible for various forms of particle decay, such as beta decay (mediated by the W and Z bosons)
  3. Strong Nuclear: Responsible for nuclear binding, the force that holds atomic nuclei together (mediated by gluons)

These three forces make up the Standard Model of Particle Physics, which was experimentally confirmed in full in 2012 with the discovery of the Higgs Boson. Alas, there's one more fundamental force that quantum mechanics cannot account for. Gravity! This force can only be explained in terms of Einstein's Theory of General Relativity. As a result, physicists are accustomed to characterizing the Universe in terms of two systems, which describe how things work on the smallest of scales (Quantum Mechanics) and the largest (Relativity).

Whereas one deals with the subatomic particles that underlie all reality and the laws of nature, the other deals with planets, stars, galaxies, superclusters, and the cosmos as a whole. Finding how these two systems (and four fundamental forces) interact is the next major step for physicists, who are now turning their attention to exploring beyond the Standard Model in the hopes of finding a Theory of Everything (ToE).

Several ideas have been proposed, which include String Theory, Loop Quantum Gravity (LQG), and other theories that collectively fall under the heading of "quantum gravity" (QG). These attempt to explain gravity in terms of a yet-undiscovered particle (the "graviton"), as a consequence of vibrating strings, extra dimensions, or the nature of spacetime itself.

Once again, this is how I have come to understand this all-important field of science, and I could be wrong on several (or all points). More to the point, even if I have described Quantum Mechanics faithfully, that matters very little if I can’t explain it to people who have no background in the field in a way that is relatable and accessible. Only you can be the judge of that, so let me know in the comments. How did I do?

Matt Williams

Space/astronomy journalist for Universe Today, SF author, and all around family man!