There was exciting news from the Lawrence Berkeley National Laboratory last week, as researchers announced that they had performed the world's smallest double-slit experiment and determined that quantum (subatomic) particles will start behaving in accordance with classical (macroscale) physics at the size scale of a single hydrogen molecule. Quantum physicists are no doubt excitedly discussing these marvelous results with a passion most people reserve for Super Bowl Sunday. But the average reader's eyes probably just glaze over with incomprehension, leaving him/her to wonder what all the fuss is about. Truthfully? It's tough to grasp the significance of this latest quantum wrinkle without a bit of background about Thomas Young's original 1802 experiment (now the poster child of the quantum concept of particle/wave duality), as well as the historical scientific debate that raged around the nature of light. Hence today's Monster Post.
Particle or wave? That was the question. It proved to be an especially contentious issue; the debate raged for millennia, in fact. Pythagoras, in 6th century BC Greece, was staunchly "pro-particle," while Aristotle (who lived a couple hundred years later) was ridiculed by contemporaries for daring to suggest that light travels as a wave. The confusion was understandable, because empirical observations of the behavior of light contradicted each other. On the one hand, light traveled in a straight line and would bounce off a reflective surface. That's how particles behave. But it also could diffuse outward, and different beams of light could cross paths and mix together. That's undeniably wave-like behavior. In short, light had a split-personality disorder.
By the 17th century, many scientists had generally accepted the wave nature of light, but there were still holdouts in the research community -- among them no less a luminary than Sir Isaac Newton, who argued vehemently that light was composed of streams of particles that he dubbed "corpuscles." In 1672, colleagues persuaded Newton to publish his conclusions about the corpuscular nature of light in the Royal Society's Philosophical Transactions. He seemed to assume that his ideas would be greeted with unanimous cheers, and was rather put out when Robert Hooke and the Dutch physicist Christiaan Huygens were reluctant to jump on the Isaac Bandwagon. The result was an acrimonious, four-year debate. Huygens differed with Newton on such key points as how the speed of light changes as light goes from a less dense medium like air to a denser material like glass: Newton said it should increase; Huygens said it should decrease. The issue remained largely untested because at the time there was no good way to measure the changes in speed.
Ultimately, Newton's stature as one of the greatest physicists of all time ensured that his notion of streams of corpuscles won out over the wave theory of light -- until that cheeky over-achieving upstart, Thomas Young, appeared on the scene almost a century later. Young was the oldest of 10 children born to a Quaker family in Somerset, England, and proved to be alarmingly precocious. He could read by the age of 2, learned Latin by age 6, and by the time he was 14, he'd added Greek, French, Italian, Hebrew, Chaldean, Syriac, Samaritan, Arabic, Persian, Turkish, and Amharic to his linguistic repertoire. His facility with languages served him well later in life, when he became fascinated with Egyptian hieroglyphics and played a key role in cracking the code of the Rosetta Stone by deciphering several Egyptian cartouches.
Young studied medicine in London and Edinburgh, earned his doctorate in Göttingen, and finished his studies at Cambridge before setting up shop as a physician in London. By age 28, he'd been appointed a professor of natural philosophy at the Royal Institution, delivering lectures about his experiments in everything from optics, acoustics, climate, and the nature of heat, to electricity, hydrodynamics, astronomy, gravitation, and measurement techniques. The term "polymath" hardly seems to do him justice; his fellow students at Cambridge used to call him "Phenomenon Young." No wonder his epitaph at Westminster Abbey salutes him as "...a man alike eminent in almost every department of human learning."
Ah, but could this brilliant young phenomenon take on The Goliath of Physics and win? Young was actually a huge fan of Newton and based his early work on color and vision on the insights Newton gleaned from his experimentum crucis. But that didn't mean he accepted the Great Man's conclusions without question. His pivotal experiment didn't start out as the poster child for the quantum concept of wave/particle duality; like every other scientist of his day, Young found the idea that light might be both simply inconceivable. So he designed an experiment he believed would settle the matter once and for all.
Naturally, a darkened room was involved, along with a light source (probably a candle, or sunlight, this being the early 19th century). Young shone the light onto a barrier in which he'd cut two narrow, parallel slits, about a fraction of an inch apart. On the other side was a white screen. He reasoned that if light were made of particles, as Newton claimed, the screen would show two bright parallel lines where the light particles had passed through one slit or the other. But if light were a wave, it would pass through both slits, separating into secondary waves that would then recombine on the other side -- i.e., interfere with each other.
It's a bit like water waves, which have crests and troughs. As the secondary light waves recombine on the other side, wherever two crests or troughs line up exactly, they produce a bright spot of light. Wherever a crest and a trough line up exactly, they cancel each other out, leaving a dark spot on the screen. The resulting "interference pattern" is thus a series of alternating dark and light bands. And that's exactly what Young observed, even making his own sketch of the interference pattern. Light, per his experiment, was undeniably a wave.
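For the numerically inclined, the crest-and-trough bookkeeping above is easy to sketch in a few lines of Python. This is a toy model with illustrative numbers -- the wavelength, slit separation, and screen distance below are assumptions for the sketch, not Young's actual apparatus:

```python
import numpy as np

# Toy model of Young's double slit: two secondary waves superposed on a
# screen. All numbers are illustrative, not Young's actual apparatus.
lam = 600e-9   # wavelength, m (orange-ish light)
d = 0.5e-3     # slit separation, m
L = 1.0        # distance from slits to screen, m

x = np.linspace(-5e-3, 5e-3, 2001)       # positions along the screen, m
r1 = np.sqrt(L**2 + (x - d / 2) ** 2)    # path length from slit 1
r2 = np.sqrt(L**2 + (x + d / 2) ** 2)    # path length from slit 2
k = 2 * np.pi / lam

# Superpose the two waves; intensity is the squared magnitude.
intensity = np.abs(np.exp(1j * k * r1) + np.exp(1j * k * r2)) ** 2

# Crests aligned -> bright band (intensity near 4); crest meets trough ->
# dark band (intensity near 0): the alternating bands Young sketched.
fringe_spacing = lam * L / d
print(f"predicted bright-band spacing: {fringe_spacing * 1e3:.1f} mm")
print(f"intensity range: {intensity.min():.3f} to {intensity.max():.3f}")
```

The predicted spacing between bright bands, lam * L / d, comes out to about a millimeter here -- which is why Young needed narrow, closely spaced slits to see the bands at all.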
Young was understandably pretty chuffed at the success of this experiment, which offered the strongest evidence to date in favor of the wave theory of light. He applied his findings to explain the shifting colors found in thin films, such as soap bubbles, and even tied the seven colors of Newton's rainbow to wavelength, calculating what each color's approximate wavelength would have to be to produce that particular color of light.
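Young's wavelength estimates boil down to one piece of geometry: for slits a distance d apart and a screen a distance L away, adjacent bright bands sit about lam * L / d apart, so measuring that spacing gives the wavelength as lam = d * dy / L. A back-of-envelope sketch -- the "measured" spacing below is an invented example, not Young's own data:

```python
# Recovering a wavelength from a measured fringe spacing, the way Young
# could tie each color to a wavelength. Illustrative numbers only.
d = 0.5e-3        # slit separation, m
L = 1.0           # slit-to-screen distance, m
dy = 1.3e-3       # measured spacing between adjacent bright bands, m

lam = d * dy / L  # inferred wavelength
print(f"inferred wavelength: {lam * 1e9:.0f} nm")  # ~650 nm: red light
```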
Alas, his euphoria was short-lived: the pro-Newton crowd lost no time in bashing Young's experimental findings. One simply didn't question the Great One, even 80 years after Newton's death. Online encyclopedist David Darling memorably described it as "the scientific equivalent of hari-kiri." Young was too, well, young to know better. Newton's place in the pantheon ensured that the scientific community largely ignored Young's pivotal experiment for a good 10 years, bolstered by a simply savage review of his work in the Edinburgh Review (published anonymously in 1803, and later revealed to have been authored by one Lord Henry Brougham, a big-time Isaac acolyte).
Fortunately for the wave-friendly fans of light, French physicist Augustin Fresnel conducted a series of more comprehensive demonstrations of Young's basic experimental setup, succeeding (where Young had failed) in convincing the world's scientists that light really was a series of waves, rather than streams of tiny particles. And in the mid-19th century, another Frenchman, Léon Foucault, proved that Huygens had been correct -- and Newton mistaken -- in his assertion that light travels more slowly in water than in air. Given the acrimony Huygens experienced from Newton for sticking to his guns on this issue, one would understand if the Dutch scientist indulged in a little "Nyah, nyah, nyah" type of gloating from beyond the grave. (It probably helped that the French were a bit less worshipful of Newton than the Brits. Jen-Luc Piquant urges us to remember that even the greatest scientists are often wrong. Huygens, in fact, was partially responsible for advancing the notion that light waves travel via an invisible substance called the luminiferous ether, later disproved by the famed Michelson-Morley experiment in 1887.)
There were a bunch of other breakthroughs going on at the same time, of course, and taken together, everything added up to strong support for the "light is a wave" school of thought. Case closed. Or so physicists thought as the 19th century drew to a close. But light had a few more surprises in store for them with the birth of quantum mechanics. It's too long a story to go into here, but Max Planck, Albert Einstein, and Arthur Compton were among the luminaries whose work led to the realization that light was both particle and wave: specifically, light is made of photons that collectively behave as a wave.
Sounds simple enough. Except quantum mechanics is never that simple. The revolution didn't end there. Quantum theory predicted that even an individual photon could behave like a wave, and essentially interfere with itself. For a long time, there was no way to test this prediction. But eventually technology and scientific instrumentation advanced to the point where they could emit and detect single photons. The modern version of the experiment looks like this. First, we need a researcher -- let's say, Paris Hilton, just to stretch your powers of imagination a little. Paris sets up a simple light source in front of a barrier with two small slits cut into it, with a light-sensitive screen on the other side to record the pattern of incoming light. Paris turns on the light source and -- once she recovers from being hypnotized by the shiny beams -- sends a series of photons, one photon at a time, toward the two slits in the barrier.
We're talking about single particles here, so the photons should only be able to go through one slit or the other, and just strike the screen like so many tiny ping-pong balls. Instead, Paris is stunned to find that the light forms that telltale interference pattern -- alternating bands of dark and light -- on the screen on the other side. What the heck? This means that those single photons are behaving like waves; each photon somehow travels through both slits and interferes with itself on the other side.
Now Paris wants to know more. This is a woman who reads Sun Tzu, after all; her natural curiosity drives her to repeat the experiment with an extra twist: she places particle detectors by each of the slits, so that she can verify that the photons do, in fact, each go through both slits at the same time. Except this time, she doesn't get the interference pattern; she gets the ping-pong ball effect, which means that the photon is now behaving like a single particle, passing through one slit or the other. Are the photons just messing with her? Unable to cope with the quantum conundrum, Paris Hilton's head explodes. Millions rejoice. Tabloids mourn. And those mischievous photons give an evil cackle of delight at having claimed another victim.
The good news is, the photons aren't deliberately messing with our heads. There is an explanation for the two results, but it's an explanation that defies common sense. Instead of merely tweaking her first experiment, thanks to the addition of the particle detectors, Paris unwittingly performed a completely different experiment the second time around. In the first version, she's making a wave measurement; in the second, she's making a particle measurement. The kind of measurement she chooses to make determines the outcome of the experiment. Basically, if Paris just lets the photons travel from the light source to the screen undisturbed, they behave like waves and she sees the interference pattern. But if she observes them en route, she knows which path the photons took; this knowledge forces them to behave like particles, passing through one slit or the other. Paris can construct her experiment to produce an interference pattern, or to determine which way the single photons went. But she can't do both at the same time. Heisenberg's Uncertainty Principle rears its ugly head.
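The difference between Paris's two experiments can be boiled down to a single line of algebra. Writing psi1 and psi2 for the amplitudes to pass through each slit: with no which-path information the screen records |psi1 + psi2|^2, whose cross term produces the fringes; with detectors at the slits it records |psi1|^2 + |psi2|^2, and the fringes vanish. A toy numerical sketch (illustrative numbers, not from any real apparatus):

```python
import numpy as np

# Two-slit amplitudes in the far-field approximation; all numbers are
# illustrative. psi1/psi2 = amplitude for "went through slit 1/2".
lam, d, L = 600e-9, 0.5e-3, 1.0
x = np.linspace(-5e-3, 5e-3, 2001)   # screen positions, m
phase = np.pi * d * x / (lam * L)    # half the relative phase between slits
psi1 = np.exp(1j * phase) / np.sqrt(2)
psi2 = np.exp(-1j * phase) / np.sqrt(2)

# Wave measurement: amplitudes add first, THEN we square -> fringes.
undisturbed = np.abs(psi1 + psi2) ** 2
# Particle measurement: which-path known, probabilities add -> no fringes.
watched = np.abs(psi1) ** 2 + np.abs(psi2) ** 2

print(f"fringe contrast, undisturbed: {undisturbed.max() - undisturbed.min():.2f}")
print(f"fringe contrast, watched:     {watched.max() - watched.min():.2f}")
```

Heisenberg's trade-off shows up here as the impossibility of having both at once: any setup that makes |psi1|^2 and |psi2|^2 separately knowable kills the cross term that makes the fringes.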
Hence the opening line of the UC-Berkeley press release: "The big world of classical physics mostly seems sensible: waves are waves and particles are particles, and the moon rises whether anyone watches or not. The tiny quantum world is different: particles are waves (and vice versa), and quantum systems remain in state of multiple possibilities until they are measured -- which amounts to an intrusion by an observer [Paris Hilton!] from the big world -- and forced to choose: the exact position or momentum of an electron, say."
There are a lot of really big ideas contained in those two sentences, more than we can even attempt to discuss intelligently in a single blog post. Vast tomes have been written about this, countless papers are published each year in academic journals -- including the one describing the latest version of the double-slit experiment that the Berkeley Lab group performed. (Actually, LBL collaborated with scientists at the University of Frankfurt in Germany, Kansas State University and Auburn University.) We were most impressed with the sheer ingenuity of how they constructed their experimental set-up. They used the two proton nuclei of a hydrogen molecule as the two "slits," separated by a mere ten-billionth of a meter (about one angstrom).
The tricky part is to separate the component parts of the hydrogen molecules in the first place. How the heck did they manage that? It helps if you have access to a couple of x-ray beam lines at LBL's Advanced Light Source. All you need to do (are you taking notes, Paris?) is send a stream of hydrogen gas through the apparatus into an "interaction region" (the equivalent of an enclosed chamber, would be my guess), where some of the hydrogen molecules run afoul of that nasty x-ray beam, which has sufficient energy to knock off each hydrogen molecule's two negatively charged electrons. Without that negative charge to balance things out, the two positively charged protons -- the molecule's two nuclei -- blow apart from the powerful mutual repulsion. The LBL researchers then used an electric field to separate the particles according to charge, sending the protons to one detector and the electrons to a detector in the opposite direction. Genius!
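For a feel of how violently those two protons "blow apart," here's a rough Coulomb-repulsion estimate. It assumes the protons start at the neutral molecule's bond length, about 0.74 angstroms -- a textbook value and a back-of-envelope sketch, not a number from the LBL paper:

```python
# Energy released when H2's two bare protons fly apart under mutual
# Coulomb repulsion, starting from the neutral bond length. Textbook
# constants; a back-of-envelope estimate, not the paper's analysis.
k_e = 8.988e9     # Coulomb constant, N*m^2/C^2
q = 1.602e-19     # proton charge, C
r0 = 0.74e-10     # H2 bond length, m

E_joules = k_e * q**2 / r0   # potential energy at the starting separation
E_eV = E_joules / q          # same energy in electron-volts, ~19.5 eV
print(f"energy released: {E_eV:.1f} eV total, ~{E_eV / 2:.1f} eV per proton")
```

Twenty electron-volts or so may not sound like much, but dumped into two particles as light as protons, it sends them flying apart fast enough to register cleanly at the detectors.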
LBL researcher Ali Belkacem calls this "a kinematically complete experiment," because it accounts for every single particle, enabling the researchers to figure out all kinds of things, like "the momentum of the particles, the initial orientation and distance between the protons, and the momentum of the electrons." It's not just photons that exhibit wave/particle duality: electrons do, too. So even a single electron is capable of interfering with itself. Just like the classical version of the experiment, the scientists could study the electrons as particles, or as waves. For instance, they found that once the electrons were knocked off the hydrogen molecule, one was fast, and one was slow, giving them an assortment of both fast and slow electrons.
Mostly they were interested in the interference pattern, particularly at what point it disappeared. They essentially turned the slower electrons into teensy particle detectors by boosting their energy levels just a tad. For reasons that remain unclear to me -- I invite any quantum physicists to offer their explanation in the comments -- this turns the slow electrons into "observers." They are "big" enough to interact with the classical domain. So the interference pattern disappears and the electrons behave almost like a classical system. I say "almost," because apparently they still retain some signs of entanglement (what Einstein called "spooky action at a distance"). [UPDATE: Chad at Uncertain Principles goes into more of the technical details behind this new experiment, while the mysterious Statistical Deviations blogger offers a possible explanation of how boosting a slow electron's energy slightly makes it "big" enough to act as an "observer."]
So there you have it: the world's smallest double-slit experiment. And now we must go rest our poor aching head, perhaps by watching a couple of sitcoms or reading about Paris' latest tabloid exploits (aiding drunken elephants in Africa? I think not). She'd get into far less trouble if she'd just stick with her quantum physics research. Then again, when's the last time Larry King bothered to interview a quantum physicist?
J. O.:
Thanks for the post! It immediately put me in mind of Feynman's book "QED..." which never fails to raise the hair on the back of my neck when I read it. (Usually a couple of times a year, just because it's so much fun!) I enjoy the blog daily (as you make new entries); please keep up the great work.
Posted by: M. L. Green | November 13, 2007 at 10:32 PM
Great post, thanks for writing it! I don't really find anything geeky about learning about the universe, though. It's fascinating.
Posted by: Allen | November 14, 2007 at 01:05 AM
I enjoy your blog, and particularly this post, but I'm compelled to ask: how can two things be "about a fraction of an inch apart?" Can they be exactly a fraction of an inch apart? :)
Posted by: Tom | November 14, 2007 at 12:28 PM
re: why faster electrons become "classical" particles...
-The faster the electron (i.e., the more momentum it has) the shorter its (de Broglie) wavelength.
-The shorter its wavelength, the smaller the scale of the interference pattern.
-The smaller the scale of its interference pattern, the more particle-like (i.e., "classically") it behaves.
This explains why they had to speed up the slow electrons rather than just use the fast ones: the fast ones had so much kinetic energy that they were already acting classically.
The separation of the two nuclei in a hydrogen molecule will be on the order of 1 Angstrom; as the wavelength of the electron gets down to this value (by speeding up), the interference pattern is stretched to the point that the minima disappear. Which is to say, the electron appears to travel straight through the "slit" without diffraction, as a particle would. For reference, an electron traveling at 1% of the speed of light has a wavelength of about 2.4 A (which may sound fast, but the electron beam in an old-fashioned CRT television moves considerably faster still).
(N.B. I am not a quantum physicist, but as I have a doctorate in astronomy I sometimes pretend.)
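Bemopolis's de Broglie arithmetic is easy to check directly with standard constants. This is a sketch using the non-relativistic relation lambda = h / (m * v); the speeds are illustrative, and note that the 1%-of-c wavelength comes out to roughly 2.4 angstroms:

```python
# de Broglie wavelength of an electron, lambda = h / (m * v), using the
# non-relativistic momentum (fine at a few percent of light speed).
h = 6.626e-34     # Planck's constant, J*s
m_e = 9.109e-31   # electron mass, kg
c = 2.998e8       # speed of light, m/s

def de_broglie_angstroms(v):
    """Electron wavelength in angstroms at speed v (m/s)."""
    return h / (m_e * v) * 1e10

lam_1pct = de_broglie_angstroms(0.01 * c)
# Speed at which the wavelength shrinks to the ~1 A nuclear separation:
v_1A = h / (m_e * 1e-10)
print(f"wavelength at 1% of c: {lam_1pct:.2f} A")
print(f"speed for a 1 A wavelength: {v_1A / c:.1%} of c")
```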
Posted by: Bemopolis | November 14, 2007 at 02:22 PM
Stupid question? I thought hydrogen, by definition, only had one proton.
Posted by: Lauren | November 14, 2007 at 08:58 PM
That's not a stupid question at all, Lauren. We're actually talking about a simple hydrogen MOLECULE in this instance, not an atom. A hydrogen ATOM, by definition, has only one proton and one electron (although the isotopes can have varying numbers of neutrons).
And thanks to Bemopolis for summing up the answer to my own query so succinctly and clearly.
M.L.: QED is my fave Feynman book, too...
Posted by: JenLucPiquant | November 14, 2007 at 09:15 PM
Now, QM is pretty bizarrely different from the expectations we import from the world of medium-sized objects, but if we didn't insist on explaining it in such an inappropriate and paradoxical fashion, it would be a lot less mind-boggling. Some of the problems with the standard explanation (you just did a good job presenting it, so I don't mean to criticize you) are the following.
1) Electrons/photons/etc. are neither particles nor waves nor both (and the wave/particle duality is just a philosophically boring statement of approximate behavior).
Electrons/photons are not both waves and particles; they are wave functions. Wave functions are just their own thing that behaves as QM says they do. The fact that these wave function objects tend to have different limiting behavior in different conditions doesn't show any deep duality in their nature.
2) Photons/electrons/etc. are not in two places at once, and it's misleading to talk about them going through both slits at once.
When a water wave goes around both sides of a rock or enters a harbor through two slits, we don't describe this as the water wave being in two places at once. We understand that really the water wave is an extended object that can (non-paradoxically) occupy an entire region of space at one time. Saying that the water wave goes through the left entrance and that the water wave goes through the right entrance too is technically valid but misleading, since it suggests that the wave goes entirely through each entrance.
Now it's an interesting fact about wave functions that they act very, very differently (well, if you accept a many-minds interpretation, our observations of them are very different) if you place a detector on the slits, but so what? This doesn't justify suggesting this is in any way analogous to a normal medium-sized object spookily being in two places (which I didn't notice you doing, but is common).
3) The position/momentum of elementary particles are not uncertain. These properties just don't apply to wavefunctions.
Asking for the position of a photon, except when it is localized by collapsing into a position basis, is just as wrong as asking what's the 'real' position of a water wave. A wavefunction is usually an extended object that doesn't have a precise position; similarly with momentum. In fact, the EPR paradox shows that these properties simply don't apply except in the rare case when the wavefunction is totally collapsed into that basis.
Anyway sorry for the rant. I liked your post but I just have a pet peeve about the whole standard analogy for QM.
Posted by: Peter Gerdes | November 15, 2007 at 06:10 AM
As any electrical engineer would assure you, it's all about Fourier transforms. Taking the piss out of Paris Hilton is another matter entirely. She is a goddess and I worship her - my distinction in Part III of the Cambridge Mathematical Tripos notwithstanding.
Posted by: jongleur | November 15, 2007 at 09:44 AM
Everyone has their pet peeves about common explanations of QM, including my own Spousal Unit. Feelings can run very high on the issue. And every peever is convinced that his/her approach is so vastly superior that any non-scientist encountering it will be immediately enlightened into a depth of understanding hitherto unachieved. They are usually mistaken. :) The reason the popular explanation is so popular is because the average non-scientist is not going to grasp the abstract concept of a wave function. So we give them a simpler, more concrete way of visualizing it; this gives them a rudimentary grasp of what's going on, preparing them for the next stage. Eventually, they WILL be able to grasp the finer nuances you outlined above (although I can think of lots of folks who'd nitpick with some of YOUR statements). But for a blog called Cocktail Party Physics, we'll stick with our take, thanks very much.
That said, I did have a section in an earlier draft of the post drawing out the analogy of water waves as you described, so I'm glad you brought it up. I cut it because the post was getting too damn long and I'm trying to write just a tad more succinctly. But one of the reasons for writing the blog in the first place is to fine-tune things like my approach to explaining wave/particle duality, and I think I can work a couple of your points into the next version without sacrificing that lay-level clarity and simplicity I'm always shooting for. So thanks!
As for poor Jongleur, this week's episode of HOUSE dramatized his predicament very nicely: The mighty, sarcastic House hires a beautiful woman as a team member because her looks blind him to the fact that she's just not as sharp about the medicine as the rest of his team. She makes inane diagnostic suggestions and he thinks they make sense because, hey, she happened to be wearing a see-through blouse that day. (Non sequitur: do you have any idea how tough it is to find a photo of Paris where she's NOT scantily clad and in some overtly sexual pose?) Like House, Jongleur is ascribing qualities of perfection to the image in his mind called "Paris Hilton," which most likely bears little relation to the actual woman. S'okay, it's kinda cute. And frankly, women do this, too, with their objects of infatuation. Infatuation makes everyone temporarily stupid. I say "poor Jongleur," because he can wave about his intellectual prowess in front of Ms. Hilton all he wants -- she's still not going to go out with him. Not unless he's also the heir to a small fortune, or looks like Adonis.
Posted by: Jennifer Ouellette | November 15, 2007 at 11:32 AM
I had seen a video demonstrating this experiment on StumbleUpon and have been fascinated ever since.
It is nice to find someone who writes as well as you about what most people might consider boring. Thanks for taking the time to share your knowledge with us.
Posted by: FitnessGeek | November 15, 2007 at 06:36 PM
Those who've read QED by Feynman have enjoyed his description of how complicated a thing is actually taking place in the instance of "simple" reflection of an image in a mirror. Imagine what he'd do with Whatsername's see-through blouse?
Posted by: M. L. Green | November 15, 2007 at 08:07 PM
Nice post, Jennifer! Your explanation is perfectly suitable for this layperson. Of course, further elaborations are appreciated too. : )
OT, about your reference to the last House episode in your comment -- I missed the majority of the show (wah!). Did House decide to fire her, or did he give her another chance?
Posted by: Athena | November 16, 2007 at 11:06 AM
"Then again, when's the last time Larry King bothered to interview a quantum physicist?"
Hmmm, maybe not Larry King, but Charlie Rose interviewed Lisa Randall, a particle phenomenologist....
Posted by: Gordon | November 16, 2007 at 08:51 PM
logicnazi wrote:
3) The position/momentum of elementary particles are not uncertain. These properties just don't apply to wavefunctions.
OK then. How, please, do you reconcile what you just said with Heisenberg's Uncertainty Principle? You can't have it both ways, you know. Wavefunctions may be the greatest thing since buttered toast, but.......
please explain how they solve the problem of position/momentum rather than saying they simply don't apply.
Evidence, please.
Thanks
M. L.
Posted by: M. L. Green | November 17, 2007 at 01:48 AM
Hi Jennifer,
It's nothing about physics or anything like that. I just came here to say that I used this Paris Hilton picture on my blog. You can see my twisted version of this amazing picture here: http://www.ocabulosodestino.net/extreme-makeover/
The text is in Portuguese, but I guess you will easily understand!
Kisses,
Posted by: Israel Barros | November 19, 2007 at 08:01 PM
Ah yes, the photon. A little bit of space, a pinch of time and some energy. How does it manage to cross the universe remaining always in the present? Why can't we sense the present in the present of an event?
Posted by: mhgintn | November 24, 2007 at 02:14 AM
I am just an unschooled retired truck driver, so I ask your indulgence if my comments are out of line.
Mr. Einstein said that all things are relative, then tried to explain the behavior of all things with rigid rules. What if the laws of action are relative to the environment? It seems to me that QM is trying to explain the observed quantum behavior using the rules established for our relative size and speed in the universe.
If you could extrapolate the rules, taking into consideration size, speed, temperature, and so forth, I believe you could eliminate the many paradoxes and conundrums that abound in the science of both large and small, fast and slow. After all there are no true paradoxes, just flawed logic.
If I'm in the wrong ballpark, please have mercy. I just happened to Stumble in here.
Posted by: driver | November 24, 2007 at 05:44 PM
According to Ben Wiens, there is no debate over wave or particle. Photons are wavicles, explained in his "encyclopedia": http://www.benwiens.com/encyclopedia.html
Posted by: eingram | March 06, 2008 at 04:28 AM
Someone (everyone?) missed the question above about how can two somethings be a fraction of any measuring unit apart. I.e., can they be 1/7" apart? I think this is just an abstract question, but it would be interesting to hear some learned folks explain it. Thanks.
Posted by: eingram | March 06, 2008 at 08:30 PM
QUANTUM MELODY
Below subatomic, the particles
slip through Heisenberg’s uncertainty nets.
They cannot even be called articles;
they’re just mathematical epithets.
Though we may say they have up or down spins
(we may even find them charming or strange),
like angels that dance on the heads of pins,
it takes metaphysics to find their range.
They have no shape we can define, except
as bleary fields of energy. Until
we measure them, there’s no place where they’re kept;
their locus is totally vibratile.
They pluck at space like an instrument string,
at this scale. Quark! The hadron angels sing!
Posted by: James Ph. Kotsybar | March 15, 2008 at 04:07 PM