Stanford physicists are apparently marching to the beat of a nanoscale drummer these days, according to a paper in the February 8 issue of Science. Yeah, it's a bad pun, but Jen-Luc Piquant couldn't resist. And I couldn't resist writing about research that combines acoustics, scanning tunneling microscopes (STMs), and resonance at the quantum/nanoscale -- particularly in light of a special session at the upcoming APS March Meeting in New Orleans celebrating 25 years of the STM, now a workhorse technology in all kinds of scientific fields, even beyond physics.
But first: quantum drums! The Stanford experiment arose out of an interesting acoustical question: do drums of different shapes always produce unique sound spectra (in terms of the properties of the acoustical wave)? It would be great if they did, because then it might be possible to develop an acoustical version of spectroscopy -- another workhorse physics-based technology that, say, analyzes the various elements that make up a distant star by studying the spectrum of light associated with that star. (It can also be useful for more terrestrial experiments, such as determining chemical composition of substances. They're always performing spectroscopic analysis in the lab on C.S.I.)
In 1992, alas, mathematicians Carolyn Gordon, David Webb, and Scott Wolpert proffered definitive proof that two differently shaped drums could produce the exact same sound -- answering Mark Kac's famous 1966 question, "Can one hear the shape of a drum?", in the negative -- thereby dashing hopes of what I will call, for lack of a better term, "acoustical spectroscopy." Because the sound spectrum doesn't have a truly unique signature -- rare, perhaps, but not truly unique -- it's impossible to work backwards from the spectrum to derive the physical shape of the drum that made that sound. Spectroscopy works because there's only one answer to the question, "What is this stuff?"
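The drum theorem itself is heavy mathematics, but the same-spectrum-different-shape phenomenon has a simpler cousin in graph theory that you can check numerically. Here's a minimal sketch (my own illustration, not from the paper): the star graph K_{1,4} and a 4-cycle plus an isolated vertex are visibly different graphs whose adjacency matrices nonetheless share the exact same eigenvalue spectrum.

```python
import numpy as np

# Two different graphs with identical adjacency spectra ("cospectral"):
# the star K_{1,4}, and the disjoint union of a 4-cycle C_4 with an
# isolated vertex. A graph-theory analogue of isospectral drums.

# Star K_{1,4}: vertex 0 connected to vertices 1 through 4
star = np.zeros((5, 5))
star[0, 1:] = 1
star[1:, 0] = 1

# C_4 on vertices 0-3, plus isolated vertex 4
cycle = np.zeros((5, 5))
for i in range(4):
    cycle[i, (i + 1) % 4] = 1
    cycle[(i + 1) % 4, i] = 1

# Both spectra come out as {-2, 0, 0, 0, 2}: you can't "hear"
# which graph you have from its eigenvalues alone.
print(np.round(np.sort(np.linalg.eigvalsh(star)), 6))
print(np.round(np.sort(np.linalg.eigvalsh(cycle)), 6))
```

Two structures, one spectrum -- the same ambiguity that killed acoustical spectroscopy, in five-by-five miniature.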
Some people might deem this a failure, and relegate the topic to the dustbin. But this is science, people, where even null results can yield useful insights. Such was the case for Stanford physicist Hari Manoharan, who saw not a failure, but an opportunity, in part because, as he said in the official press release, "This revolutionized our conception of the fundamental connections between shape and sound." And it could even be relevant to spectroscopy, "because it introduced an ambiguity." As systems get smaller and smaller, and move into the nanoscale realm, quantum effects hold sway, and that tiny degree of ambiguity -- unimportant in the classical world -- suddenly could have a significant effect on, say, nano-electronic systems of the future.
So Manoharan and his Stanford colleagues brought the problem down into the quantum realm, building tiny nanoscale "quantum drums" out of carbon monoxide molecules on a copper surface. They constructed "walls" only one molecule high, arranged into nine-sided enclosures capable of "resonating" like drums. (The resonance comes down to particle/wave duality: the electrons trapped inside behave as waves and form standing-wave patterns, much like the vibrating modes of a drumhead.) About 100 carbon monoxide molecules make up the walls of each drum, and inside are around 30 electrons.
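The standing-wave picture can be made concrete with a toy model. A hedged sketch (my own, not the Stanford setup): the "drum mode" energies of a confined electron are the eigenvalues of the Laplacian over the enclosure. Below I compute them for an idealized square box by finite differences, rather than the team's nine-sided corral.

```python
import numpy as np

# Toy model: lowest standing-wave energies of an electron in an
# idealized square "quantum drum" (hard walls, units where hbar = 2m = 1).
# NOT the actual Stanford corral geometry -- just the same physics.

n = 30                  # interior grid points per side
L = 1.0                 # box side length
h = L / (n + 1)

# 1D Dirichlet Laplacian: walls force the wave function to zero
lap1d = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2

# 2D Laplacian as a Kronecker sum; its eigenvalues are the mode energies
eye = np.eye(n)
lap2d = np.kron(lap1d, eye) + np.kron(eye, lap1d)

energies = np.sort(np.linalg.eigvalsh(lap2d))[:4]
# Continuum values are pi^2 * (nx^2 + ny^2): 2, 5, 5, 8 in units of pi^2.
# Note the exact degeneracy of modes (1,2) and (2,1) -- symmetry at work.
print(np.round(energies / np.pi**2, 2))
```

The nine-sided corrals play the same game, just with a shape chosen so that a *differently* shaped partner drum shares its whole spectrum.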
And just like macroscale drums, these nanoscale versions of different shapes nevertheless could resonate in the same way. This is called isospectrality. You can see nifty pictures, video, and listen to cool sound samples here, although of course, the sounds have been shifted down into the audible range for humans. In reality, the resonances occur at frequencies far too high for humans (or even dogs) to hear.
By now, you might be figuring that this just means science failed twice at creating acoustical spectroscopy, both at the classical and quantum levels. And okay, that may be the case. For spectroscopy. But it turns out there is some practical value for being able to build two differently shaped nanostructures that nonetheless have identical properties, particularly as computer chip circuits continue to shrink into the nanoscale. Chip designers would have more than one way to get the same result, giving them extra flexibility, or, as Manoharan phrased it, "Now your design palette is twice as big."
That could turn out to be significant to the design of future quantum computers (assuming quantum computers ever become a reality). Based on their findings, Manoharan's team has also figured out a way to determine the quantum phase of the wave functions of the electrons inside the quantum drums, without directly observing them. The process is called quantum transplantation, and it involves taking measurements from two quantum drums and then mathematically combining that information, thereby enabling scientists to "cheat" the usual limitations of quantum mechanics "and obtain normally obscured quantum-mechanical phase information," according to Manoharan.
Manoharan's work builds on decades of scientific advancement in a wide variety of fields, but one of the most critical is the development of STMs. That's what enables nanoscientists to move around individual molecules on a substrate, for example -- you need a level of precision and control and imaging resolution that just can't be achieved using ordinary microscopy. That's because there are some fundamental physical limits to looking at tiny objects with light, as demonstrated in 1873 by the physicist Ernst Abbe. Basically, you can't see details smaller than the wavelength of the light you're using.
Creating a visual image of an object under a conventional light microscope requires light waves to pass through that object, where they are diffracted into what I guess you could call interference patterns. We get our information about the object from those patterns. Ergo, the more diffracted light waves actually reach the instrument's objective lens, the better the resolution. ("Resolution" describes the smallest distance between two details of an image that can just be distinguished by the viewer. For a conventional microscope using visible light, the best resolution is on the order of 2000 angstroms -- 200 nanometers -- and Abbe's calculations limited the useful magnification factor to 1000 or so.) Diffraction is that thing that happens when light passes through the spaces in an object -- think of how water waves pass through spaces between reeds along the shore of a lake. If a gap is much smaller than the wavelength of the light, the diffracted waves carry essentially no recoverable detail about it, and that's bad for resolution.
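Abbe's limit is just arithmetic once you know the formula, d = λ / (2·NA), where NA is the objective's numerical aperture. A quick sketch with illustrative numbers (my choices, not the article's):

```python
# Abbe's diffraction limit: smallest resolvable detail d = lambda / (2 * NA).
# Illustrative numbers only:
wavelength_nm = 550.0   # green light, middle of the visible spectrum
NA = 1.4                # a good oil-immersion objective (assumed value)

d_nm = wavelength_nm / (2 * NA)
print(f"resolution limit: {d_nm:.0f} nm ({d_nm * 10:.0f} angstroms)")
# -> resolution limit: 196 nm (1964 angstroms)
```

No amount of clever lens grinding gets you past that wall -- you have to change the wavelength itself.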
What was needed was a form of energy with a shorter wavelength than light -- say, an electron. But an electron is a particle! You might say. True, but in 1924, Louis de Broglie hypothesized -- and experiments soon confirmed -- that not just light, but also electrons -- indeed, all particles -- exhibit the same particle/wave duality. So an electron can behave like either a particle or a wave, which means an electron beam should, in principle, be useful as a means of "seeing" smaller objects than light waves can resolve. Enter German physicist Ernst Ruska, the son of a professor of the history of science, who grew up fascinated by his father's instruments, particularly a large microscope. The young Ernst went on to study electrical engineering, and while still a student at Berlin Technical University in the late 1920s, applied de Broglie's equations to the notion of an electron microscope, concluding that it should be able to see smaller objects than light. He also figured out that a magnetic coil could serve as a "lens" for electrons, and that irradiating an object with an electron beam could produce a useful image. In 1931 he built the very first electron microscope prototype with Max Knoll, and by 1933 had a version that outperformed the light microscope; he subsequently helped commercialize the technology, which pretty much revolutionized science. That's why Ruska was honored in 1986 with the Nobel Prize in Physics, at the ripe old age of 80.
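To see just how decisively electrons win, plug numbers into de Broglie's relation λ = h/p. A back-of-the-envelope sketch (non-relativistic, so only honest at modest accelerating voltages):

```python
import math

# De Broglie wavelength of an electron accelerated through V volts,
# non-relativistic: lambda = h / sqrt(2 * m_e * e * V). SI constants:
h = 6.626e-34     # Planck's constant, J*s
m_e = 9.109e-31   # electron rest mass, kg
e = 1.602e-19     # elementary charge, C

def electron_wavelength_nm(volts):
    momentum = math.sqrt(2 * m_e * e * volts)  # from kinetic energy e*V
    return (h / momentum) * 1e9                # meters -> nanometers

# Even a modest 100-volt electron has a wavelength of roughly 0.12 nm --
# thousands of times shorter than visible light (~550 nm).
print(f"{electron_wavelength_nm(100):.4f} nm")
```

Crank the voltage higher and the wavelength shrinks further (though at tens of kilovolts you'd want the relativistic correction), which is why electron microscopes can resolve individual atoms.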
Ruska shared the prize with Gerd Binnig and Heinrich Rohrer, co-inventors of the first scanning tunneling microscope while both were employed by IBM's research group in Zurich, Switzerland, in 1981. It's not entirely accurate, scientifically, to call this technique a form of microscopy, since we're not really looking at the sample via light at all anymore. We're feeling the surface with a mechanical probe, much like a blind person reads Braille by touch. An STM is far more sensitive than a fingertip.
It's a pretty basic set-up: you have a probe with a very sharp tip, positioned by piezoelectric actuators that can move it in steps smaller than an atom. As the tip moves across a surface -- without actually touching it, mind you -- there are going to be interactions between the tip and the surface (in the STM's case, a tiny "tunneling" current flowing across the gap), and a feedback loop adjusts the tip's height to hold that signal constant. Record the tip's height at every point, and you've got an image. It is the sharpness of the tip and how well it can traverse the structure of the sample's surface that determines the resulting image resolution.
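The feedback idea can be sketched in a few lines of code. This is a toy model with made-up numbers (a real instrument uses calibrated PI control and piezo hardware, not this): the tunneling current falls off exponentially with the tip-sample gap, and a simple proportional controller keeps the current pinned at a setpoint while the recorded tip heights trace out the topography.

```python
import math

# Toy constant-current STM feedback loop. Hypothetical parameters:
KAPPA = 10.0    # inverse decay length, per angstrom (typical order)
I0 = 1.0        # current at zero gap, arbitrary units
TARGET_GAP = 5.0                                   # angstroms
SETPOINT = I0 * math.exp(-2 * KAPPA * TARGET_GAP)  # current setpoint
GAIN = 0.01     # proportional feedback gain (assumed)

def tunneling_current(gap):
    # Exponential sensitivity to the gap is what makes STM so precise
    return I0 * math.exp(-2 * KAPPA * gap)

def scan(surface, steps_per_point=200):
    """Record tip heights while feedback holds the current at SETPOINT."""
    tip_z = surface[0] + TARGET_GAP
    trace = []
    for z_surf in surface:
        for _ in range(steps_per_point):
            # log of the current ratio is a linear error signal in the gap
            error = math.log(tunneling_current(tip_z - z_surf) / SETPOINT)
            tip_z += GAIN * error   # current too high -> lift the tip
        trace.append(tip_z)
    return trace

# Scan over a small bump; the trace follows the surface, offset by the gap
bump = [0.0, 0.5, 1.0, 0.5, 0.0]
image = [z - TARGET_GAP for z in scan(bump)]
err = max(abs(a - b) for a, b in zip(image, bump))
print(f"max tracking error: {err:.1e} angstrom")
```

The exponential current-gap dependence is the whole trick: a change of one angstrom in the gap changes the current by roughly an order of magnitude, so even crude feedback tracks the surface with sub-angstrom fidelity.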
It should probably be noted right about now that the STM is just one kind of scanning probe imaging technique; there's an entire family of such techniques, including atomic force microscopy (AFM), which measures the interaction force between the tip and the surface via the deflection of a flexible cantilever. (The STM, in contrast, measures a weak electrical current flowing between the tip and the sample as they are held a very short distance apart.) There are pros and cons for every type of technique, and for the various "modes" in which they operate, which is why the "family" is still growing. But they're all based on the same basic principle.
Ironically, even though it's meant to commemorate the history of the STM, the March Meeting session will focus on some of the latest innovations in research using STMs. For instance, Sergei Sheiko of the University of North Carolina at Chapel Hill will talk about his work using scanning probe microscopy to image flexible polymer molecules whose sizes are beyond the limits of standard optical resolution. He's been able to get very high-resolution images of the molecular structure of those polymers, and has used AFM to study them as they move and react on surfaces. Sheiko's work could lead to better control over surface-activated changes in coatings, lubrication, catalysis, and biochemical assays, by revealing how those changes impact molecular structure and properties. And that's just the tip of the iceberg, or the cantilever, in this instance. It's a solid bet that further innovations in STM and related technologies will yield even more insight and unprecedented atomic-level control over the next 25 years.