Audiences flocked to the futuristic thriller Minority Report when it debuted in 2002, and while I generally find Tom Cruise almost unwatchable as an actor (it's a subjective thing; lots of people love him), I thought this was some of his better work -- the other highlight for me being his performance in the ensemble cast of Magnolia. In the case of Minority Report, I was impressed not just with the film noir mystery, but also with the visually stunning -- and completely plausible -- futuristic world depicted onscreen. So naturally I was on hand at the Hammer Museum this past Thursday night (April 22) to hear a talk called "Beautiful Tools" by artist/scientist John Underkoffler of Oblong Industries -- part of a series of lectures sponsored by 5D on the future of immersive design.
The 5D conference "explores the profound impact of the convergence of art and science across all narrative media: film, game, animation and architecture," and the organization sponsors an ongoing lecture series to explore the emerging relationship "between artists and scientists, designers and engineers, and the pervasive effect of this new collaboration not only on design and science process but as a fundamental change in the relationship between artifact and audience." Underkoffler (who is an advisory board member of the Science & Entertainment Exchange) consulted on Minority Report, and drew on some of his own groundbreaking research at MIT while doing so. (He's also consulted on The Hulk, Aeon Flux, Stranger Than Fiction, and Iron Man.) And it so happens that in addition to being brilliant at his work, he is also an incredibly engaging speaker, with a strong sense of narrative -- as indeed, one might expect.
He opened with a still of the very first Apple computer, introduced in the late 1970s, at a time when the "computer" was all about creativity, or "making stuff." You had to program in everything; there were no handy operating systems or cute little icons, no mouse, no "drag and drop" feature, and so forth. Fast forward to 2010, and we have ingenious handheld devices like the iPod and the iPad -- which are impressive pieces of industrial design but are essentially about media consumption. Underkoffler thinks it was the Web that changed things: "stuff is no longer just on your computer, it's distributed, on a server, within cloud computing."
He prefers to view the machine as an extension of the human: "a gift, an act of generosity, and an aesthetics of agency, if you will." His vision was to build "a distinct ecosystem in which devices work as digital exoskeletons: amplifiers for human creative intent... to aid in making, in building, in design." All of which sounds really intriguing, but how does one go about achieving that? If you're Underkoffler, and you're at MIT, you start with a simple light bulb. It's a useful medium with no particular "message": turn on a light, and you can work in the dark, for instance. Underkoffler decided to take this one-way illumination and transform it into a two-way conduit of information. (I could not help but think of the changes the Web has wrought, where information is no longer handed down top-down but flows as a two-way dialogue between writer and audience.) Per this Website:
The I/O Bulb is based on a traditional light bulb but is able to not only project light, but also collect live video of the objects and surfaces it projects light onto. The Luminous Room may then be used as an optics simulator for a variety of purposes; Underkoffler sees the I/O Bulb as having applications for urban planning and architectural modeling, where planners and designers would be able to observe light patterns and reflections resulting from various arrangements of structures and buildings.
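That two-way loop -- one fixture that both emits light and records what the light falls on -- can be sketched in a few lines. This is purely a toy illustration of the concept; every class and method name here is invented, and the real I/O Bulb works on live video of physical surfaces, not a dictionary:

```python
# Toy sketch of the I/O Bulb idea (all names invented for illustration):
# a single device that both projects output and captures input.

class IOBulb:
    """One fixture that casts light (output) and observes the lit scene (input)."""

    def __init__(self):
        self._scene = {}                  # stand-in for the illuminated tabletop

    def project(self, pattern):
        """Cast a light pattern onto the modeled surfaces."""
        self._scene.update(pattern)

    def observe(self):
        """Return 'video' of the illuminated scene -- here, just a copy of it."""
        return dict(self._scene)


# An urban-planning style round trip: light a model, read back the result.
bulb = IOBulb()
bulb.project({"tower": "casts long shadow", "plaza": "fully lit"})
feedback = bulb.observe()
print(feedback["tower"])   # casts long shadow
```

The point of the exercise is the closed loop: because projection and observation happen in the same device, a designer can rearrange model buildings and immediately see (and measure) the resulting light and shadow.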
While working on his PhD, Underkoffler also designed a gestural interface system (now known as G-Speak), which interprets a user's motions so that he or she can navigate and interact with data -- moving through datasets with no need for a mouse or any other physical pointing device. Somehow Hollywood art director Alex McDowell (also an Exchange board member) heard about it, and asked Underkoffler to consult on Minority Report to help director Steven Spielberg create a believable world 50 years into the future: ecobuilding, targeted advertising, Maglev transportation and so forth. Not everything made it into the final film -- I am personally sad that there were no mediabots, essentially robotic paparazzi serving as extensions of Fox News, MSNBC and CNN. Citing the "primacy" of display technology, Underkoffler devised clothing-based displays (although a fashion runway scene was cut from the final print), a webbed finger display, and what he termed a "data taco" that fits over one's wrist.
But the highlight was how the filmmakers decided to use Underkoffler's work on the gestural interface system to build a forensic analysis display. They needed a cool new technology to help Cruise's character sift through all the images collected from the "pre-cogs" and match them to information on file within their massive database, all showcased on a gigantic curved display, using their hands to "conduct" the information -- no voice technology, no keyboards, and no mice. Drawing on his prior work, Underkoffler literally invented a sign language for the film, synthesizing bona fide sign language for the deaf, SWAT signals, and musical systems, among other sources, into a new language. And I'll bet you thought Cruise et al. were just randomly waving their hands around on the screen. Far from it! They had to study the "thesaurus" of gestures and practice them using training videos -- because it had to look "real," not like something they'd just been reading about. When they put it all together -- Lights! Camera! Action! (and a soothing classical soundtrack) -- it looked like this:
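At its core, a gesture "thesaurus" like the one the actors studied is a vocabulary: each recognized hand pose and motion pair maps to exactly one interface command. The following is a minimal, hypothetical sketch of that idea -- none of these pose names or commands come from G-Speak or the film; they're invented here to show the shape of the mapping:

```python
# Hypothetical gesture vocabulary (all entries invented for illustration):
# each (hand pose, motion) pair maps to one unambiguous command, the way
# the film's "thesaurus" of gestures assigned meaning to specific signs.

GESTURE_VOCABULARY = {
    ("open_palm", "sweep_left"):  "scrub_backward",
    ("open_palm", "sweep_right"): "scrub_forward",
    ("pinch",     "pull"):        "zoom_in",
    ("pinch",     "push"):        "zoom_out",
    ("point",     "hold"):        "select_frame",
}

def interpret(pose: str, motion: str) -> str:
    """Look up a (pose, motion) pair in the vocabulary.

    Unrecognized combinations are ignored rather than guessed at --
    a fixed vocabulary is what keeps the 'language' unambiguous.
    """
    return GESTURE_VOCABULARY.get((pose, motion), "no_op")

print(interpret("pinch", "pull"))   # zoom_in
print(interpret("fist",  "shake"))  # no_op
```

The design choice worth noticing is the closed vocabulary: like a spoken sign language, the system only responds to signs it has been taught, which is why the actors had to drill the gestures from training videos rather than improvise.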
Breathtaking, isn't it? Almost like poetry in motion. With G-Speak, "Physicality comes back into computing space," said Underkoffler. AND NOW IT'S REAL! "We built a real program with real language and trained real people to use it -- even though it was fictional." Oblong has since built a prototype G-Speak system: a new machine that can be used for design, rather than just for tedious tasks like file management, with an operating system that is not predicated on the use of a mouse. Even better, more than one user can collaborate in the virtual space at a time: it's no longer just "one user, one screen." Check out this video of Underkoffler and colleagues working their magic with the real-world G-Speak:
My reaction can be summed up in a single word: WANT! (Or, as Jen-Luc Piquant would say, Je veux!) Seriously, I drove home after the lecture scheming about how to abuse my position as director of the Exchange to convince Underkoffler to install a G-Speak system in our home. Because think of all the cool esoteric physics the Spousal Unit could accomplish with that thing! Also? Video games! Film mashups! And much, much more! G-Speak could definitely be an iPad killer, people -- just as soon as it's sufficiently commercialized that regular folks (or at least middle class folks) can afford one. "The tools' form ensures that acts of construction are often indistinguishable from acts of exhibition. Inherently, working means performing, whether anyone is watching or not," per Underkoffler. "And underlying it all is a digital architecture that acknowledges space -- the real-world geometry that structures the rest of existence -- for the first time."
I'm fond of saying that ideally, the Hollywood/science interaction should benefit both the entertainment and research communities, and Underkoffler's work is a prime example of that. His research informed Minority Report, which in turn inspired him to develop his rudimentary system into a viable real-world prototype. And I have no doubt whatever he's working on now will end up informing another film at some point in the future.