We are now well into Day 13 of the search for missing millionaire-cum-adventurer Steve Fossett, whose plane disappeared off the radar -- literally -- somewhere over the vast expanse of the Nevada wilderness. A planned three-hour private excursion turned into an immense manhunt to rescue the missing man. Dozens of aircraft have scoured a whopping 20,000 square miles, with a small army of online volunteers poring over satellite imagery, courtesy of Google and Amazon, and yet Fossett and his single-engine plane remain missing. Searchers have found wreckage from seven prior unrecorded crash sites, however -- a testament to just how difficult it can be to locate small planes in that region, a task made even more difficult by (a) an apparent malfunction in the plane's transponder/emergency locator, and (b) Fossett's failure to file a flight plan. (In fairness, it was supposed to be a very short, routine flight, and Fossett is/was a highly experienced pilot who has survived plenty of harrowing crash landings in the past.)
I don't know if the airborne searchers are employing just conventional aerial and satellite imaging, or whether they're also making use of more innovative techniques, notably Light Detection and Ranging (LIDAR), but I hope they'll at least consider doing so, just to improve the odds that much more. LIDAR is an optical remote sensing technology that exploits the same basic principle as radar and sonar -- sending out pulses that bounce off objects and analyzing the returning signals to determine an object's distance from the source -- except it uses light wave pulses instead of radio waves. (Yes, I know, radio waves are technically just another form of electromagnetic radiation; in this instance, "light" refers to the visible and near-infrared frequencies in the spectrum.) It's not so much a replacement technology as a complementary one -- just one more tool in our growing arsenal for remote sensing and mapping, and for finding things like Fossett's downed plane.
A LIDAR instrument transmits pulses of light to a target, and the parts of the spectrum that are not absorbed by the target are reflected back (known as backscatter) to the system, where they are detected, stored, and analyzed. It's the changes in the properties of the light when it scatters back that enable scientists to measure specific properties of the target. Bats use ultrasonic pulses to hunt for their prey, emitting a series of pulses that become more frequent the closer the bat gets to its target, climaxing in a kind of "feeding buzz" as it locks in for the kill. Similarly, the more frequent the light pulses emitted in a LIDAR system, the more information is gathered, and the more accurately a target area can be mapped. For airborne topographical mapping, as many as 33,000 laser pulses can be transmitted every second.
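To make the "pulses bouncing back" idea concrete, here's a minimal sketch in Python of the arithmetic every pulsed LIDAR relies on -- nothing specific to any real instrument, just the time-of-flight relationship: the range to the target is half the round-trip travel time multiplied by the speed of light.

    # Minimal sketch of the time-of-flight arithmetic behind a pulsed LIDAR:
    # the range to a target is half the pulse's round-trip travel time
    # multiplied by the speed of light.

    C = 299_792_458.0  # speed of light, meters per second

    def range_from_return_time(round_trip_seconds):
        """Distance to the target, given the pulse's round-trip time."""
        return C * round_trip_seconds / 2.0

    # A return arriving 10 microseconds after the pulse left corresponds to
    # a target roughly 1.5 km away.
    print(range_from_return_time(10e-6))  # ~1499 meters

    # At 33,000 pulses per second (the airborne mapping figure quoted above),
    # a single second of flight yields 33,000 separate range measurements.
    print(33_000 * 60)  # nearly 2 million measurements per minute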
LIDAR has been around for quite a long time, having been invented shortly after the first working lasers appeared in 1960. But the technology was a bit ahead of its time, and languished for several decades until a whole bunch of other enabling technologies emerged. Early lasers were, frankly, too expensive, too heavy, too big, and too power-hungry to be practical for airborne applications. When solid-state diode-pumped lasers emerged, that changed: they were cheap, rugged, and compact, with comparatively low power requirements. Computing technology also needed to advance to the point where it was fast enough, and cheap enough, to perform the kind of advanced data analysis required by a LIDAR system.
Most notably, early LIDAR systems could make accurate measurements in the centimeter range, but only for lasers fixed on the ground. This strictly limited its useful deployment, since once anyone placed a laser on a moving platform, all bets were off. Then came the Global Positioning System (GPS), and suddenly it became perfectly feasible to figure out exactly where a moving object might be in relation to a ground-based coordinate system. And LIDAR was finally dusted off and brought into the marketplace.
For remote sensing applications, the LIDAR system is mounted onto an aircraft equipped with a GPS receiver (de rigueur these days in just about any vehicle) to track its exact location and altitude. It also needs a high-accuracy inertial measurement unit (IMU) to track the pitch and roll of the airplane so that movement can be accounted for in the final analysis. Basic physics tells us that objects spinning at a very high rate tend to maintain the orientation of their spin axis, so an IMU incorporates several spinning gyroscopes. By measuring the tilt of each spinning mass within its gimbal (or cage), and coupling that with an accelerometer to keep track of changes in velocity, the system can tell how far, how fast, and in what direction the aircraft has moved relative to a given starting point. Combined, the data collected from the various instruments can usually give an elevation that is accurate to within 6 inches.
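Here's a deliberately simplified sketch of how those pieces fit together -- a toy, vertical-only version, whereas real systems perform a full three-dimensional coordinate transformation. All the numbers are invented for illustration.

    import math

    # Toy, vertical-only version of turning one laser return into a ground
    # elevation: GPS supplies the aircraft's altitude, the IMU supplies roll
    # and pitch, and the scanner supplies the beam's off-nadir angle. Real
    # systems do a full 3-D coordinate transformation; this keeps only the
    # vertical component.

    def ground_elevation(aircraft_altitude_m, laser_range_m,
                         scan_angle_deg, roll_deg, pitch_deg):
        tilt = math.radians(scan_angle_deg + roll_deg)  # beam tilt across-track
        pitch = math.radians(pitch_deg)                 # beam tilt along-track
        vertical_drop = laser_range_m * math.cos(tilt) * math.cos(pitch)
        return aircraft_altitude_m - vertical_drop

    # Aircraft at 1,500 m, beam 10 degrees off nadir, a degree of roll, half
    # a degree of pitch, slant range of 1,020 m:
    print(ground_elevation(1500.0, 1020.0, 10.0, 1.0, 0.5))  # ~499 m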
Generally speaking, we can usually only image a feature or object roughly the same size as the wavelength of the EM radiation being used, or larger. Radio waves used in typical radar systems are great at detecting things like metallic objects -- which is why they're so useful for military and aviation applications -- but rocks or raindrops might not produce much in the way of detectable reflections at all, making them well nigh invisible to radar. But because the wavelengths used are much shorter than radio waves, LIDAR systems are much better at detecting very small objects, like particles in the atmosphere. In fact, they are already used to study atmospheric conditions, notably the densities of various particles, not to mention all kinds of emerging applications in geology, seismology, and even archeology. Also, lasers use a very narrow beam, so LIDAR allows for mapping of physical features with much higher resolution than conventional radar. It can have a "footprint" of less than 1 meter, making it possible to map the floor underneath a forest canopy, or the urban canyons between tall buildings.
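That sub-meter footprint follows directly from the geometry of a narrow beam. A quick back-of-the-envelope calculation, using a beam divergence that's a typical order of magnitude for airborne mapping lasers rather than the spec of any particular instrument:

    import math

    # Why the footprint can be so small: the spot a laser paints on the
    # ground is set by its beam divergence and the flying height. The 0.5
    # milliradian divergence here is a typical order of magnitude, not the
    # spec of any particular instrument.

    def footprint_diameter(range_m, divergence_mrad):
        half_angle = (divergence_mrad / 1000.0) / 2.0
        return 2.0 * range_m * math.tan(half_angle)

    # From 1,000 m up, a 0.5 mrad beam lights a spot about half a meter wide.
    print(round(footprint_diameter(1000.0, 0.5), 2))  # 0.5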
There's more than one kind of LIDAR system, each suited to a specific kind of application. If you just want to measure the distance from your instrument to a solid target, range-finder LIDAR should suit your purposes just fine. If it's a moving target, and you want to figure out how fast it's moving, you'll probably want to use Doppler LIDAR, which -- as its name clearly implies -- relies on the Doppler shift effect to determine an object's velocity. If you're a meteorologist interested in measuring the specific concentrations of chemicals and such in the atmosphere -- ozone, water vapor, and pollutants -- or if you want to map a shallow river bed underwater, you're better off using differential absorption LIDAR. Underwater imaging in particular can be difficult using the infrared and near-infrared wavelengths preferred for terrestrial mapping, since water absorbs those wavelengths; only the blue-green end of the visible spectrum can penetrate water, for the most part.
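The trick behind differential absorption LIDAR is to fire pulses at two nearby wavelengths -- one the target gas absorbs strongly, one it barely absorbs -- and compare how quickly the two return signals fade with range. A rough sketch of that arithmetic, with made-up return powers and absorption cross-sections purely for illustration:

    import math

    # Rough sketch of the DIAL retrieval: compare how much faster the "on"
    # wavelength (strongly absorbed by the gas) decays between two ranges
    # than the "off" wavelength does. The return powers and cross-sections
    # below are invented, not taken from any real instrument.

    def dial_concentration(p_on_near, p_on_far, p_off_near, p_off_far,
                           sigma_on, sigma_off, bin_length_m):
        """Molecules per cubic meter in the range bin between near and far."""
        delta_sigma = sigma_on - sigma_off  # differential cross-section, m^2
        ratio = (p_off_far * p_on_near) / (p_on_far * p_off_near)
        return math.log(ratio) / (2.0 * delta_sigma * bin_length_m)

    n = dial_concentration(p_on_near=1.00, p_on_far=0.60,
                           p_off_near=1.00, p_off_far=0.80,
                           sigma_on=5e-24, sigma_off=1e-24,
                           bin_length_m=150.0)
    print(f"{n:.2e} molecules per cubic meter")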
Small wonder so many applications have emerged in the past decade for LIDAR systems. In geology and seismology, they're used to detect faults -- most famously, to locate the fault in Seattle, Washington -- and to measure "uplift" at Mount St. Helens in Washington State. (I assume this refers to the plumes of ash that the mountain burps out from time to time, which is an indication of whether its internal distress is reaching a critical eruption point.) Airborne LIDAR is used to monitor glacial melting and other coastal changes, while in forestry, LIDAR is used to study canopy heights and measure biomass, not to mention making the surveying process that much faster. On a more mundane scale, there are now handheld LIDAR systems for traffic enforcement. Whereas radar relies on Doppler shifts to directly measure speed, LIDAR calculates speed from a rapid series of distance measurements, making it easier to isolate particular vehicles from a heavy stream of traffic. Remember that next time you're tempted to weave in and out of lanes at high speeds.
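For the curious, here's a toy version of what a LIDAR speed gun does under the hood: fire a rapid burst of pulses, range the car on each one, and divide the change in range by the elapsed time. The pulse interval and ranges below are invented for illustration, not taken from any actual enforcement device.

    # Toy version of a LIDAR speed gun: range the car on each pulse in a
    # rapid burst, then divide the change in range by the elapsed time.

    def speed_from_ranges(ranges_m, pulse_interval_s):
        """Average closing speed in m/s over a burst of range measurements."""
        elapsed = pulse_interval_s * (len(ranges_m) - 1)
        return (ranges_m[0] - ranges_m[-1]) / elapsed

    ranges = [300.00, 299.90, 299.80, 299.70, 299.60]  # car approaching the gun
    speed = speed_from_ranges(ranges, pulse_interval_s=0.003)
    print(f"{speed:.1f} m/s, about {speed * 3.6:.0f} km/h")  # 33.3 m/s, ~120 km/h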
LIDAR has been used for search and rescue before, most notably in the wake of the terrorist attacks of September 11, 2001. For several days after the World Trade Center fell, a small plane made several passes over Ground Zero in Manhattan (and also over the damaged Pentagon in Washington, DC) taking LIDAR readings of the debris -- courtesy of a company called EarthData, which used the collected data to produce topographical images of the sites. This in turn helped rescue workers navigate the often-treacherous terrain by identifying unstable areas likely to shift or collapse. The maps also enabled building and utility workers to locate foundation-support structures, elevator shafts, basement storage areas, and so forth. As workers moved deeper into the WTC's basement wreckage, LIDAR mapping showed where the integrity of the underground walls might have been compromised, putting those areas at greater risk of flooding. The maps even made it possible to measure the volume of the debris and estimate how much reach the cranes would need to remove it efficiently.
In the realm of atmospheric studies, LIDAR systems could bring some relief to frequent fliers who have spent way too many wasted hours in airports with chronic delays (*cough* O'Hare *cough*) because their flights were grounded. Scientists at RL Associates in Pennsylvania presented a paper at the recent meeting of the Optical Society of America on their new near-infrared LIDAR system, now in its prototype phase and slated for testing in about 18 months. According to Mary Ludwig, one of the leading scientists on the project, the system will provide better images in foggy, rainy, or extremely hazy conditions, thereby making it a whole lot easier for pilots to take off and land in those conditions, as well as giving more advance warning of potentially hazardous atmospheric conditions, such as icing. This in turn could help reduce weather-related flight delays.
There are other experimental LIDAR systems for similar purposes, using visible green light to detect different kinds of particles in the atmosphere, but most commercial planes aren't equipped with such systems. Ludwig's team has developed a system that uses a polarized laser light beam, i.e., one whose electric field points in a specific direction. The system beams this polarized infrared light outwards, and then records the amount of polarization that returns to the sensors. The data is then processed to form an image of the ground. If need be, it can also be translated into verbal commands. This is especially useful for atmospheric applications because ice crystals, for example, are highly depolarizing, while mere water droplets are not. Hard, manmade objects will depolarize the returning signal less than natural objects, so it's possible to get enhanced images of targets in an obscured or camouflaged environment -- say, a small, crashed aircraft hidden in dense underbrush. Ergo, Ludwig et al.'s system is better at detecting different types of atmospheric particles, such as ice, supercooled liquid, or water vapor, plus it can also tell the difference between water vapor and other substances (metal, or the human body, for example).
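A minimal sketch of how that depolarization measurement might be used to tell particle types apart -- the two-channel idea comes from the description above, but the threshold value here is an arbitrary placeholder, not a figure from the RL Associates system:

    # Minimal sketch of a depolarization measurement: transmit polarized
    # light, record the return in a channel matching the transmitted
    # polarization and in a channel perpendicular to it, and take the ratio.
    # Ice crystals scramble polarization far more than liquid droplets do.
    # The 0.25 threshold is an arbitrary placeholder.

    def depolarization_ratio(cross_polarized, co_polarized):
        return cross_polarized / co_polarized

    def classify_return(cross, co, threshold=0.25):
        ratio = depolarization_ratio(cross, co)
        return "likely ice" if ratio > threshold else "likely liquid water"

    print(classify_return(cross=0.40, co=1.00))  # likely ice
    print(classify_return(cross=0.05, co=1.00))  # likely liquid water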
The system also incorporates something called a "range-gated detector" that is only turned on for very short periods of time, rather than continuously -- namely, whenever the return signal is expected. The detector is turned off when the laser pulse is first emitted from the system, so it doesn't pick up all the near-field backscatter from things that might be in the pulse's path to the target -- usually the major source of noise for such a system. This means there's a lot less noise in the resulting data so you get better images, particularly in foggy or hazy conditions. All of which spells welcome news for frequent airline travelers.
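The timing behind a range-gated detector fits on the back of an envelope: keep the detector dark until returns from the range band you care about can physically arrive, then open it only for that window. A quick sketch, with made-up ranges:

    # Back-of-the-envelope timing for a range-gated detector: keep the
    # detector dark until returns from the range band of interest can
    # physically arrive, then open it only for that window, so near-field
    # backscatter from fog along the way never gets recorded.

    C = 299_792_458.0  # speed of light, meters per second

    def gate_window(min_range_m, max_range_m):
        """(turn-on delay, gate duration) in seconds after the pulse fires."""
        turn_on = 2.0 * min_range_m / C
        duration = 2.0 * (max_range_m - min_range_m) / C
        return turn_on, duration

    delay, width = gate_window(min_range_m=900.0, max_range_m=1100.0)
    print(f"open {delay * 1e6:.1f} microseconds after the pulse, for {width * 1e9:.0f} ns")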
Perhaps the niftiest new application of LIDAR is in archeology. In England, Cambridge University is collaborating with the UK Environment Agency in the use of LIDAR imaging to produce terrain maps for large swathes of the countryside. It started out as a way of assessing flood risks, but then an organization called English Heritage contracted with the EA to conduct a LIDAR survey of Stonehenge -- one of the most studied landscapes in all of Europe, and a certified World Heritage Site. See the nifty LIDAR image of Stonehenge below, courtesy of English Heritage? How cool is that? It turns out that LIDAR is terrific for recording terrestrial features that have been leveled by many years of ploughing: the WHS survey revealed several previously unrecorded banks in and around the Stonehenge site.
LIDAR mapping isn't just about mapping positions and elevations anymore (although that's definitely the primary focus); newer systems seek to integrate other aspects of feature recognition to make map production ever more automated. A few years ago, scientists from Sweden and Italy teamed up to use LIDAR to image the various types of stone used in the construction of Lund Cathedral. Located in Sweden, the cathedral is an impressive 12th-century edifice that ranks as the largest Romanesque building in northern Europe. (Whether you're impressed probably depends on your fondness for the Romanesque period.) Not only could they "see" the differences between the stones used, but they could also tell, from a distance, which of the walls had moss and lichen growing on them.
As intensity recording is incorporated into LIDAR systems, scientists should be able to improve even further on this type of analysis. An intensity-recording system not only measures the distance between the LIDAR and the target, but can also characterize the features of a landscape based on the strength of the returning signals. That's because every reflective surface absorbs some wavelengths and reflects others. A concrete block, for instance, reflects most of the light that hits it and absorbs very little, so the returning signal is very strong. Leafy vegetation, however, absorbs quite a bit more of the light, and hence returns a weaker signal. These data can also be turned into a visual image.
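Here's a toy illustration of what that kind of analysis might look like once each return carries a reflectance value as well as a position. The intensity bands are arbitrary, chosen only to echo the concrete-versus-vegetation contrast just described:

    # Toy illustration of intensity-based classification: once each return
    # carries a reflectance value as well as a position, points can be
    # bucketed by echo strength. The bands below are arbitrary.

    def classify_by_intensity(intensity):
        if intensity > 0.7:
            return "hard, bright surface (concrete, bare rock)"
        if intensity > 0.3:
            return "intermediate surface (soil, dry grass)"
        return "absorptive surface (leafy vegetation, water)"

    for value in (0.85, 0.45, 0.15):
        print(value, "->", classify_by_intensity(value))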
So light and lasers are increasingly important tools in archaeological mapping. And the innovations just keep coming. An article a couple of months ago in the San Francisco Chronicle profiled a retired civil engineer named Ben Kacyra, who invented "a camera-like device that uses lasers to scan three-dimensional objects -- such as archaeological ruins -- to create digital blueprints accurate to within a few millimeters." (You can see many such images taken with the instrument here.) At the heart of his device is a laser that emits light with sufficient power to bounce off a distant object and return to a sensor capable of timing the intervals between signal and response. In this way, "the laser maps the surface of objects by taking millions of measurements at different angles." Sounds like a LIDAR system to me.
Kacyra's system -- which he sold to Swiss company Leica Geosystems in 2001 -- has been used to study pre-Incan Peruvian ruins, the buried Roman city of Pompeii, and the cliff dwellings of the ancient Anasazi people in southwest Colorado. And once these digital blueprints are created and stored, it becomes so much easier to recreate portions of those sites and edifices in a virtual framework -- perhaps, one day, in Second Life. Kacyra has already created a small reproduction of an ancient frieze he scanned with his mapping tool. As the Chronicle article put it, "Think of the archaeological equivalent of a reprint of a famous painting, a chance to hold a piece of history." Indeed. Therein lies the future of LIDAR.
I hadn't thought to look for Fossett in my backyard yet. All the others are there.
Hmmmm....
Posted by: Joan of Argghh! | September 16, 2007 at 06:53 PM
Woops---Really enjoyed your blog on LIDAR, however, Mt St Helens that blew up some years ago is in Gifford Pinchot National Forest in Washington, not Oregon. But then California people would not know this.
Posted by: PLO | September 18, 2007 at 10:35 PM
I think no discussion of modern and future LIDAR systems would be really complete without a mention of the LIDAR satellite, CALIPSO. See http://www-calipso.larc.nasa.gov for the satellite, and the data it has taken. If you're in the right place at the right time (see its posted satellite track), you can see its beautiful bright green flash (from space)!
Posted by: Ellipsis | September 18, 2007 at 10:50 PM
Mt St. Helens uplift is the effect of hot magma moving into the subterranean plumbing. Two effects combine: The hot magma is less dense than the surrounding rocks (that's why it's moving UP) and produces a buoyancy effect; and as the magma sits in the conduits it came up through, it heats and expands the colder rock around it. Surface effects, such as plumes of ash, are another effect of magma movement but unrelated to uplift. I would guess they occur when gas-rich magma moving along a small crack reaches close enough to the surface. The lowered pressure allows the gas to come out of solution and push the magma out.
Posted by: verisimilidude | September 26, 2007 at 01:57 PM