We are reminded that we have been remiss of late in passing on nifty book recommendations. So let me heartily recommend the just-released Lab Coats in Hollywood: Science, Scientists, and Cinema by David A. Kirby, who was a practicing evolutionary geneticist before he switched to the Dark Side, leaving bench science to become Lecturer in Science Communication Studies at the University of Manchester in the UK. I met David through my association with the NAS Science & Entertainment Exchange -- mostly because so many of his publications address the relationship between cinema, genetics, and biotechnology.
Full disclosure: I wrote a blurb for Kirby's book, but yanno, I'm a genuine fan. I don't blurb books I don't like. And honestly, there really are no other books right now quite like David's. My blurb: “There have been many books written on the intersection of science and Hollywood. But David Kirby’s excellent tome is the first to examine seriously the role of the science consultant in the movie-making process and assess its potential impact. Lab Coats in Hollywood is essential reading for anyone who shares Kirby’s passion for bringing science into the service of storytelling for the silver screen.”
Of course, you don't have to just take my word for it. Take it from one of my fellow blurbers, film and TV writer/producer Zack Stentz (Thor, X-Men: First Class, Fringe, Terminator: The Sarah Connor Chronicles):
“In the gap between science fact and science fiction stands the motion picture and television science consultant. In this brisk, lively account, David Kirby provides us with a history of these often unheralded scientific ambassadors to Hollywood and the critical role they play in shaping how film and television makers depict science--depictions which in turn shape how science is understood by the public at large.”
If you're a scientist interested in consulting for film and TV, you should read David's book. And in honor of its release, I offer my own humble tips, gleaned from two years with the Exchange. First and foremost:
(1) Manage Your Expectations. Shhh! Keep this under your hat, but Hollywood isn't nearly as glamorous as you think. I know, you think it's all just one long episode of Entourage (the colorfully foul-mouthed Ari Gold character is, indeed, based on a real-life agent, Ari Emanuel). But power lunches and club-hopping are what people do in between projects, and even then, it's mostly agents and studio execs with expense accounts -- or A-List stars -- who can afford that. Once a film or TV show is in production, everyone is working much too hard to have time for an actual life. Catering services are huge in Tinsel Town because often nobody leaves the set (or production office, or editing room) for 12- to 16-hour stints. So don't expect to be whisked off to Spago or Mr. Chow's for a chic lunch meeting with Big Name Producer/Director. The reality is that you're more likely to have a short afternoon meeting in a makeshift production office with some soda, coffee or cookies to nosh on.
That said, one scientist who came to a studio consult jokingly demanded champagne when asked if he'd care for refreshment -- only to be mollified when the earnest young production assistant magically produced a bottle sent to the producers as a gift. I told him if he truly wanted to be shockingly outrageous, he should have demanded a few lines of cocaine. Although even that might not have been shocking. Apparently, it used to be quite common in the 1970s to show up to a pitch meeting and find bowls of coke on the (glass-topped, natch!) coffee table. I heard this from a longtime executive producer, who sighed wistfully in remembrance: "These days it's all just bottled water."
(2) Listen, Don't Lecture. This is probably the single most common mistake scientists make when consulting with Hollywood for the very first time: they walk into a meeting and proceed to expound on their area of expertise, with little regard for whether it's relevant to the developing story. This is understandable: scientists are accustomed to certain kinds of communication -- class lectures, technical talks for colleagues -- and an ever-larger fraction are also reasonably adept at speaking to the press about new research results. But Hollywood is looking for more of a dialogue, a brainstorming session among equals -- not a lecture. Remember, they're smart, skilled professionals in their own right; they just have a different expertise than you.
(3) “No” is Not Enough. I've said it before, and I'll say it again. It's not enough to just tell a writer, director or producer that their nifty plot twist is bad science. That's just pointless nerdgassing; it might be cathartic for you, but the goal should be convincing Hollywood that paying attention to the scientific details results in a more successful film or TV series. Instead of "No, you can't do that," make sure you put a positive spin on your input: "Well, that's stretching the science a bit too much, but have you considered this?"
(3A) A corollary: they'll be more likely to listen to your input if The Science Serves the Story. Hollywood is not in the business of creating PR campaigns for science out of the goodness of its heart. It will always be about the narrative. Make sure you honor that.
(4) Honor the Non-Disclosure Agreement (NDA). Discretion is very much the better part of valor when it comes to advising on Hollywood projects. Ideas are bona fide currency in this town, and projects in development -- and even in production -- are treated as closely guarded state secrets. Think I'm kidding? I organized a local team of five scientists with varying expertise to consult on TRON: Legacy. They expected to be emailed the draft script. Instead, production assistants brought each scientist an individual copy of the script stamped with their name on every page -- so if pages leaked, it could be traced back to the miscreant -- and waited in their office while they read it, then took the manuscript back to the production office "vault."
Even if nobody asked you to sign an NDA, it's still a good idea to say as little as possible, even about seemingly trivial details; loose talk could get you blacklisted from future consultations. So, even though it's tempting to regale your friends down at the pub with tales of your mind-blowing meeting with Big Name Director at a Major Studio, resist that temptation -- until the film comes out or the episode airs. Then you can reap the reward of all that reflected glory. It can also be a great educational opportunity, as Jim Kakalios (author of The Physics of Superheroes) discovered when he made this Webby-nominated YouTube video on the science of Watchmen (for which he consulted):
(5) Your Input Is Never Wasted. Don't feel discouraged if few, if any, of your ideas make it onto the screen in the end. Maybe you consulted on a project in early development that never made it into production, or you were brought in too late to have much of an impact. (I cringed inwardly during one consultation when, towards the end of the meeting, the director commented, "Wow, this would have been really helpful, like, four weeks ago....") Maybe the writers just didn't take your suggestions, because the story ended up going in a new direction, or the studio demanded changes (or gave "notes"). A lot can happen to a film or series in development between the draft script and final cut. That doesn't mean your input wasn't valuable, or that you wasted your time. If nothing else, you've established a good foundation for future interaction. They may call on you again for another project, and next time, your input will make it to the final product.
(6) Expect Small Perks in Lieu of Payment. Look, it's a very rare occurrence when a science/technical consultant gets paid, and even then, I'd advise you not to quit your day job. I'm asked about this constantly: why don't consultants get paid more often? And I explain that most of the time, when creators need input the most is during the early development stage -- also the stage where a science consultant can have the most impact in shaping the story. But at that point, there's usually no budget, either. Trust me: everyone is working on spec. (Hollywood is a town of freelancers, basically.) Make a strong enough pitch -- for which you need good science input -- and you might get picked up by a network or studio. But it's only when a project gets "greenlit" that it goes into actual production -- and until then, there's really not any money to be made. Be the person who helped them in the early, unpaid stage, and you're far more likely to be approached about paid consulting when the budget finally materializes. Or not. Like I said, don't quit your day job.
Not everyone likes to hear this. I've had more than one scientist stuffily inform me that s/he received so much per hour as a technical consultant for industry, and lawyers received similar rates for their consulting services, so why shouldn't scientists who consult for film and TV be paid accordingly? One such person was so insistent on this point that, exasperated, I finally said, "Look -- you keep telling me how you think things ought to be. I'm telling you the way things actually are." It all comes down to what the market will bear, and currently, the market will bear... pretty much nothing. This will only change when it becomes clear to the folks who hold the purse strings in Hollywood that a technical consultant is absolutely essential to the success of a given project. I think the number of people who get that is growing. But a blockbuster film with bad science is still a blockbuster film. So it might be a while.
That doesn't mean the creators don't care about getting the details right -- they do! -- or that they aren't generous. They are! They'll find some way to express their appreciation. We have a growing collection of DVDs, baseball caps, sweatshirts, even a pen in the shape of a bone (from the writing staff of Bones, of course). The Time Lord is justly proud of his Stark Motor Racing sweatshirt, courtesy of Marvel Studios in thanks for his consultation work on Thor. I've been invited to watch shoots for Bones, Castle, and The Big Bang Theory and toured the set for Tony Stark's lab in the Iron Man films. Helping Hollywood is fun! Having fun, getting to be creative, and (one hopes) feeling you've made a difference in some small way is actually pretty darned rewarding. If those are terms you think you can handle, congratulations -- you could make an excellent science consultant.
"If history were taught in the form of stories, it would never be forgotten," Rudyard Kipling once observed. The same could be said for science. Biologist Sean B. Carroll (not to be confused with the Spousal Unit) cited the power of storytelling during a daylong Summit on Science, Entertainment and Education last Friday, organized by the Science & Entertainment Exchange with funding provided by the Moore Foundation. This is something we started talking about while I was still director of the Exchange, and I was thrilled to see the day finally happen -- a room filled with leaders from all three sectors, brainstorming ideas on how best to combine their efforts to transform US science, technology, engineering and math (STEM) education, by adding one more letter: an "A", for the Arts, giving us STEAM.
By now the depressing statistics are all too familiar: the US ranks 25th worldwide in math, 21st in science (behind countries like Estonia and Slovenia), 27th in percentage of college graduates in science and technology, and a pathetic 48th in the quality of K-12 math and science education. The only area where American students excelled? Self-confidence! US students are #1 in thinking they rock at math and science, which would be fine if this confidence were based in reality. It isn't. "The rest of the world is rising and the US is falling asleep at the wheel," Charles Vest, president of the National Academy of Engineering, told the assembled crowd. Improving the nation's standing doesn't just require political will, he emphasized, but also inspiration -- and that's where Hollywood can help, by partnering with scientists and educators to "help us reconnect what we do with what we dream."
I always say that science in film and TV can inspire, but it's not intended to actually teach rigorous science. That said, it's a terrific way of leveraging the appeal of Hollywood -- and the entertainment industry's consummate skill at branding and marketing for mass audiences -- to engage and motivate students. Some of us have even written books using, say, the adventures of tiny blonde vampire slayers to illustrate physics concepts. Imagine all the untapped potential for things like DVD bonus features, interactive online gambits like LOST University, and -- someday -- original online teaching tools tied to film and TV series, and in line with broad curriculum requirements, available as a resource for teachers nation-wide. That's the vision, anyway. From the Summit website:
We know that this is a complex, systemic issue and that there are no magic bullets for solving the problems. But one way to encourage interest in science is by capitalizing on the pre-existing interest in entertainment. Film, television, and other forms of popular media have the very real potential to engage students in learning about many aspects of STEM and to generally increase their interests in these disciplines as possible career options. This has been demonstrated by the increase in the number of students studying forensic science after exposure to the popular television series CSI: Crime Scene Investigation. This is genuinely informal education: Learning something about the practice of science and the characteristics of scientists themselves through the lens of entertainment television programming. Despite some of the inaccuracies in this portrayal, it leaves us wondering how formal education can take greater advantage of the ability for film, television, and video games to engage students using entertainment programming as jumping off points for deeper learning or of wholly new content developed by working closely with content creators in the new media world.
So last week's summit was a jumping off point for exploring ways these three communities might do that. If you drew a Venn diagram for science, entertainment and education, storytelling is where they would all overlap. As Carroll pointed out, today's cognitive psychology tells us that "human thought is structured around stories"; a strong narrative framework presents a structured, coherent argument for whatever information is being presented, and makes it far more likely that people will retain that information. We can even follow Kipling's lead, drawing on the thousands of untold stories in the history of science to inspire the entertainment industry -- which in turn can provide fodder for the education community to use to inspire and motivate students in the classroom.
For instance, Carroll talked about Roy Chapman Andrews, who headed an expedition to Mongolia back in 1922, accompanied by a famous Hollywood cinematographer named James B. Shackelford. This was no guided safari: Andrews was armed at all times (both rifle and pistol) to defend the expedition from local bandits. But the hardships were worth it in the end: he and his team found tons of dinosaur fossils during their trip, including the first dinosaur eggs ever discovered (in nests!). That, and Andrews' colorful swashbuckling tales, landed him on the cover of TIME magazine, and may very well have inspired the 1940s B movie serials that led to the creation of the fictional swashbuckling archaeologist Indiana Jones. (Carroll mentioned just one little-known fact: Andrews apparently hated snakes. Coincidence???)
Of course, in addition to scientific storytelling, we also need to convince students that science is relevant -- we need to answer the age-old "When am I ever going to use this?" question -- and that it can be creative, by making it less monotonous and boring. So says Tony DeRose of Pixar, who started out as a computer scientist, then taught for several years before ending up in the entertainment industry doing award-winning animation on such hits as The Incredibles, Finding Nemo and the Toy Story franchise -- "So this summit is really all about me!" (Jen-Luc Piquant loved the fact that DeRose was sporting a colorful Ratatouille print shirt.) "People don't realize how much math is in Pixar movies," he said, citing coordinate geometry, trigonometry, matrix algebra, even math incorporating the fourth dimension. In fact, he insists it's possible to use examples of animation technology and math to augment the standard curriculum through college sophomore year.
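To give a flavor of the kind of math DeRose means, here's a toy sketch of my own (emphatically not Pixar's production code): his research specialty is subdivision surfaces, and the classic one-dimensional warm-up is Chaikin's corner-cutting algorithm, which turns a jagged polygon into a smooth curve using nothing fancier than weighted averages of coordinates.

```python
import numpy as np

def chaikin(points: np.ndarray, iterations: int = 4) -> np.ndarray:
    """Chaikin's corner-cutting subdivision for a closed 2D polygon.

    Each pass replaces every edge (P, Q) with two new points, 1/4 and 3/4
    of the way along it; repeated passes converge to a smooth curve
    (a quadratic B-spline).
    """
    for _ in range(iterations):
        nxt = np.roll(points, -1, axis=0)            # next vertex, wrapping around
        quarter = 0.75 * points + 0.25 * nxt         # point 1/4 along each edge
        three_quarter = 0.25 * points + 0.75 * nxt   # point 3/4 along each edge
        points = np.empty((2 * len(quarter), 2))
        points[0::2] = quarter                       # interleave the new points
        points[1::2] = three_quarter
    return points

# A hard-cornered square rounds off into a smooth closed curve.
square = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
print(len(chaikin(square)), "points on the smoothed curve")  # 4 * 2**4 = 64
```

Run enough passes and the square converges to a smooth B-spline curve; the two-dimensional generalization of this averaging trick is what rounds the skin of a Pixar character over its polygonal skeleton.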
There's also been plenty of new mathematics developed by and for the entertainment industry, which is why DeRose also emphasizes the importance of bringing creativity back into the classroom: "We need to train students for tomorrow -- for technologies that don't yet exist." Those students will be the ones inventing and using those future technologies. Encouraging creative innovation is the objective behind the Young Makers program, culminating in the annual Maker Faire in the Bay Area, which drew 600 exhibitors and over 80,000 attendees in 2010. (For those who aren't fortunate enough to have a Pixar computer scientist as a father, the program now has a mentorship component built in.)
DeRose engages in these sorts of activities all the time with his two sons: one year, they built a giant potato-shooting "Gatling gun"; another year, they built an eight-foot-tall pneumatic fire-breathing dragon named Saphira, a project that required the boys to learn about welding, metalwork and how to read schematic diagrams, not to mention developing a safety plan and building in an emergency kill switch. And that's another pet peeve for DeRose: our educational system actually discourages risk-taking in students: "We teach kids how to avoid risk, when we should be teaching them how to manage risk." He has teenaged boys, and "They sometimes want to do dangerous things." Rather than tell them no, he teaches them to take those risks into account when designing, say, a giant fire-breathing dragon.
DeRose's remarks on risk actually came up during a brief breakout session to allow attendees to chat with the morning's speakers. And that's when this happened:
Yes, that's the Spousal Unit, Sean M. Carroll, finally meeting his doppelganger, Sean B. Carroll, and making an odd sort of in-joke history in the process. Somehow the space-time continuum survived.
"Most people never discover what they're good at," according to Sir Ken Robinson, and those that do usually have to "recover" from their formal education first. For Robinson, true learning is about combining imagination, creativity and innovation -- and our current emphasis on STEM as being somehow separate from the humanties/creative arts isn't a good approach either. There are more synergies to be explored at the science/art (and entertainment) interface than walls between them. He told an entertaining story of serving on an interdisciplinary educational commission with members from both science and the arts. One was a British comedian named Lenny Henry ("a British Chris Rock") who heard that Nobel laureate Harold Kroto was also a member and panicked, convinced he had no business being in such august company. What the comedian didn't know was that Kroto, upon hearing that the comedian would be on the commission, had the same reaction: he was intimidated by the others' fame and talent in the arts.
His point? "People are intimidated by others' disciplines, but we share the same anxieties, and the same approach to creativity," Robinson argued.The status quo in education is "leaching the life blood of the respective disciplines. We need a culture of education," and to do that, we must "make common course with other fields." My favorite Robinson quote came after he commented that the power of human imagination is ultimately what separates us from other animals: "A dog might get depressed, but it doesn't listen to Coldplay and get drunk." I also liked his emphasis on how creativity is about taking action and doing something -- or, as artist Chuck Close observed in a recent interview, "Inspiration is for amateurs." Nor is it about total freedom: discipline and constraint are absolutely essential to coming up with truly original ideas.
Okay, so all day we'd been hearing about education from the perspective of the policy folks, the scientists, and the entertainers (not to mention knights of the realm with plummy British accents) -- what about the voices of those in the trenches? Among the afternoon's featured speakers was award-winning high school science teacher Janet English, best known for taking a team of her students aboard NASA's vomit comet to conduct science experiments in zero gravity. (While English found her experience exhilarating, the day's emcee had the opposite reaction: "If you have motion sickness [in microgravity], you find out what it's like to throw up in front of Buzz Aldrin. You will not get me back in that metal canister of hell!")
Anyway, English brought along 15 of her students, who spoke of their abiding love for the Mythbusters; for teachers who don't just throw facts at them, but encourage them to explore and do experiments (say, by building their own trebuchet!); and for storytelling.
English's experiences in zero gravity provided ample examples of her approach to teaching. It's one thing to memorize the laws of gravity and the Newtonian concept of inertia. (Snore.) It's quite another to experience them firsthand in a microgravity environment: suddenly, just the tiniest upward force will send you shooting toward the ceiling, and you won't stop or slow down until you hit that ceiling. English uses this to get her students thinking about what we take for granted back on Earth, and how even common activities might change in microgravity.
For instance, can you hula-hoop in microgravity? English posed this question to her students, and did the experiment during her stint on the vomit comet. (The answer is no, although you do get a very nice temporary facelift.) Can a spider spin a web in microgravity? That experiment has also been done, and the answer is a qualified yes -- the spider adapts to the new environment within a few days. (Before then, no, the spider is hopelessly at sea.) Okay, so what about the famous light saber duel between Darth Vader and Luke Skywalker? Yes, nerdgassers, we know that REAL light sabers wouldn't work like that, but English provided a re-enactment using two toy light sabers, wielded by one of her students playing Vader and TRON: Legacy producer Jeff Silver as a bearded Skywalker. ("Jeff, I am your father.") The toy sabers make a satisfying smack! on earth, but in microgravity, the minute the sabers touched, "Skywalker" and "Vader" would spin in opposite directions -- in accordance with Newton's "equal and opposite reaction."
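For the record, the underlying physics is a one-liner. Here's a minimal back-of-the-envelope sketch (the roughly equal masses are my assumption for illustration, not a detail from English's demo); the same bookkeeping applied to angular momentum is what sets the duelists spinning:

```latex
% Newton's third law at the moment of saber contact:
\vec{F}_{V \to L} = -\,\vec{F}_{L \to V}
% The pair starts (nearly) at rest, so total momentum stays zero:
m_L \vec{v}_L + m_V \vec{v}_V = \vec{0}
\quad\Longrightarrow\quad
\vec{v}_V = -\frac{m_L}{m_V}\,\vec{v}_L
% For m_L \approx m_V, each duelist recoils at the same speed in exactly
% the opposite direction the instant the sabers touch.
```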
The process of learning incorporates both play and storytelling, according to Will Wright, a game developer best known (thus far) for Spore. He insists that "Play is the natural way we interact with the world," and videogames exploit that kind of learning process. Gamers "observe a result, form a hypothesis, test, and discard or accept that model, thereby intuiting the underlying rules of the system." Not only are players driving the experience, but they are allowed to fail. To illustrate the importance of the freedom to fail, Wright -- or maybe it was DeRose during the morning breakout session -- told of a study in a pottery class, where half the students were graded according to the quality of their finished pots, and the other half were graded according to quantity: the number of pots they made. In the end, the latter group had more failures... but they also had more high-quality pots than the group that was discouraged from experimenting and taking risks.
I learned this critical lesson through the process of earning a black belt in jujitsu: I started out clumsy and unskilled, failed repeatedly -- which in martial arts usually involves getting knocked on your ass over and over again -- but kept trying until I got something right. Real learning has more to do with discipline, hard work and perseverance than with innate ability. Gamers also learn through failure -- they keep trying until they master one level before advancing to the next -- but it's failure that takes place in a virtual space, a kind of "safe haven" that encourages them to experiment, to take risks, to try new things that might not work -- but hey, now they know what doesn't work and can move on to different strategies. In short, it's an "apprenticeship model" that teaches them to think like scientists. (That said, you can restart a game: "You can't restart the universe." But oh, wouldn't that be awesome!!!)
Liz Vogel, director of education for Walt Disney Studios, also emphasized learning through play. She started out as a scientist, then taught for a bit, then found herself at Disney -- yes, just like DeRose. Her scientific colleagues' reaction wasn't always positive: "Oh, you're going to sell out," she heard more than once. (Le sigh.) But she also encountered outmoded attitudes at Disney, where folks had the notion that "fun" and "learning" were at opposite ends of the spectrum.
Vogel ultimately convinced her Disney colleagues to establish programs with four critical elements: (1) it must be child-focused, (2) it must be interdisciplinary, (3) it must be collaborative and involve mentorship in some form, and (4) it must be interactive, fun and engaging for the students. Oh, and it's imperative to bring in educators from the beginning, rather than at the very end to rubber-stamp one's efforts. Of course, all this costs money. Ultimately, said Vogel, "You need a spectacular plan that shows some return on investment." This needn't be profits, per se; promoting the "brand" in the eyes of the public is a currency all its own.
That said, pure "edutainment" -- which all too often demonstrates a "failure to engage with the wonder of ideas," according to Columbia University string theorist Brian Greene -- isn't the answer either. "It's not just about explosions and confetti and over-saturated colors."
But you do want to create something that students will be talking about. Producer Jerry ("Don't call me Shirley!") Zucker -- mastermind behind the Airplane! movies, Ghost, and Kentucky Fried Movie, as well as a co-founder of the Exchange -- asked whether educators should start being more concerned with "selling" STEM education to their students, because as it stands, "The kids aren't buying it." Furthermore, "Hollywood feels no sense of obligation; it is not in the business of altruism." But Zucker thinks the two goals need not be mutually exclusive: to his mind, it's eminently possible for Hollywood to work with scientists and educators to create effective teaching tools using storytelling, and turn a profit in the process. "I think there's an enormous business opportunity there."
It wasn't all talks by invited speakers and breakout sessions. There was a special performance by a cappella singing sensation The BackBeats (doing a killer medley of Lady Gaga numbers, among other bits), and impassioned "spoken word" performances by a pair of actor/poets, Steve Connell and Sekou Andrews. Sekou performed a powerful solo piece riffing on the concept of Venn diagrams as applied to education that brought the house to its feet. I found a video of an earlier performance of the same piece online:
Connell performed earlier that morning, in a spoken word piece that riffed on his childhood love of superheroes and the day his mother told him that superheroes weren't real -- at least not the ones in the comic books. Instead, he learned that "Superman is a single mom waitressing at Denny's" to make sure she can feed her kids, and countless other everyday folks who press on against the odds. Best line: "It's not heroic to take a bullet when you know you can't be killed."
And you know, teachers are superheroes too, struggling against the current to reach kids who don't yet know what it's like to be fully engaged with learning. And since each student is different, an approach that works for one might not work for another: "Good teaching doesn't scale," lamented Zucker. And as high school teacher Tyler Johnstone pointed out, there's a lot of money invested in keeping things the way they are. But he believes it's still possible to create "pockets of innovation" to supplement formal education/curriculum.
So, after all that shared wisdom and insight and talking about the problem and discussing potential solutions, it's fair to ask: what next? What are we going to do about it? In that regard, last Friday's summit was just the beginning, a means of getting the conversations started, of planting seeds that, it is hoped, will give rise to successful collaborations between science, entertainment and education. (There's an online forum to ensure the participants stay in touch.) The first step: The Moore Foundation is offering a $225,000 grant to "establish collaborative partnerships among scientists, entertainment industry professionals, and educators to develop educational products or services that effectively leverage the resources of the entertainment community (including film, television, and video games) to improve educational outcomes in science classrooms." The deadline for proposals is May 16th, and for those who are interested, forget just compiling a bunch of existing clips loosely organized around broad topics in science:
We are seeking ideas that go beyond taking existing entertainment media (e.g., clips from a film or television series; segments from a video game) and developing curriculum-based lesson plans around them. Rather, we are looking for true partnerships among the three stakeholder groups to develop ideas that are fully integrated among the science, entertainment, and education communities. Winning ideas will benefit the entertainment community by helping to raise awareness of a film, television show, or video game and will benefit the science and education communities by offering fresh approaches to engaging students in STEM-based topics.
Check out the link for more specifics about how to apply. I expect there will be naysayers and the usual skepticism from the "meh" contingent, but since money is always a limiting factor in such discussions, let's at least give props to the Moore Foundation for putting their money where their mouth is. And as Sir Ken Robinson said, creativity is more about doing, and that, in turn, leads to innovation -- even though there is also a high risk of failure. Remember the story of the pottery students; you have to ruin a few pots before you can achieve greatness. So why not take a risk with the goal of possibly making a meaningful difference? The stakes for our schools, and for future generations, could not be higher.
This semester, I'm teaching a class I haven't taught before: Writing Research Papers (I know: why didn't my students learn that in high school? That's another post.) I'm loving it because I've always enjoyed research and it makes me go back and think about the fundamental methods and questions of the activity. What do you look for in a fishing trip? How do you find reliable sources? How can you tell if a source is or isn't reliable? How do you formulate your questions? What makes a good hypothesis or thesis? How do you interpret data? How do you recognize your own biases? And one of the most important questions (at least I think so): how do you frame both your questions and your answer?
These questions apply to all kinds of research, whether what you're looking at is literary, historical, social, psychological, or hard science data and sources. Research, even when all you're doing is a review of the topic, ends up with some kind of focused point: here's what the trends are, here's what we know, here's what seems to be an answer to these questions, here are the caveats, here's what we don't know. Part of your answer always depends on what question(s) you ask. It's crucial to keep an open mind, no matter what you're researching. Very often, on the way to looking for one answer, you find one you didn't expect, or the answer to a question you hadn't thought to ask. This is why I've always loved poking around in libraries and the time I've spent in school and college labs, even running experiments that already had answers. It's the process as much as the product that's so enjoyable and instructive.
I was reminded of the importance of recognizing personal and cultural biases twice this week. One of the papers my students are working on is a comparison of the lives and work of Malcolm X and Dr. Martin Luther King, Jr. My students are predominantly African-American and, like me, hold both figures in high esteem. I warned them that when they went fishing, they were likely to find events in both men's lives that painted them in a less-than-rosy glow. No matter what great things we might do, we're all human, and subject to human weaknesses. For some, this was a new thought, and for many it was troubling. It always is to find your heroes have clay feet. But it's an important step to take in intellectual growth. Can you still admire the accomplishments of someone, even after you discover they're not as saintly as you thought? Which matters more?
The other reminder was this article on WebMD, posted by a Facebook friend, that explains the findings of a study of the correlation between working mothers and childhood BMI. I'm not even going to go into what a crap standard BMI is as a measure of health. That's a topic for another post too. What jolted me about this article was the way the question had been framed: "Maternal Employment, Work Schedules, and Children's Body Mass Index" [emphasis mine]. Really, I thought we were over this "blame mothers for everything that goes wrong with kids" mindset, but apparently this bias persists even in research women do. In the past, mothers have shouldered the blame for their children's schizophrenia, anorexia, and alcoholism, even adult sexual dysfunction. It's still a current motif (PDF) in autism treatment and studies.
The major problem with this idea, especially in late 20th and early 21st century developed countries, is that it ignores the social changes in child rearing that have occurred as women entered the workforce. Women are no longer the sole caregivers for children or primarily responsible for such chores as cooking meals or grocery shopping, both of which figure in what children eat. Why, then, were fathers' activities and timetables not included in this study? Why are we not asking how having both parents working -- and the work schedules that regularly hinder both parents from spending more time with their children -- affects what children eat? Because, as a culture, we are still deeply ambivalent about women in the workforce and still see them as children's primary caregivers. Although 64-78% of mothers are in the workforce full time, the continuing lack of salary parity, the small number of women CEOs, and the dearth of trustworthy child care provisions and paid maternity leave are evidence that our society as a whole does not really value women's presence in the workforce (PDF).
There are, of course, other reasons for these problems, but a bias against women (and their ability to raise a family and work outside the home, one that does not exist for men) is certainly the main contender, and that bias, like racism, is just as often unconscious and shared by the people who are its objects, including female researchers. In this case, there are studies that look at the involvement of both parents in their children's eating habits (in Germany) and many more on women's influence, but none that I could find that look at only the effect of paternal employment on children's eating habits and obesity. Google, in fact, kept asking me if I meant "parental" instead of "paternal." It's not that the question isn't important, but that the other side of the question, or the collective question, needs to be asked too. In fact, another study, from the University of Maryland, discovered "that paternal employment plays a significant role as well." In the WebMD study, we have what's called a selection bias: the researchers have chosen to study only part of the group that may be a causal factor.
But the way the study is reported—its framing—is also part of the problem; it creates a perception that women are solely responsible for children's eating habits or nutrition, which clearly isn't the case. The WebMD article says,
Bottom line: The longer a mom's employment — whether she's toiling at a regular 9-to-5 job or works irregular hours — the more likely her child is to gain more weight than is healthy.
"This is not a reason for moms to feel guilty," Morrissey tells WebMD. ''It’s not maternal employment per se that's the issue. It's an underlying environmental factor that leads to this association."
What that factor (or factors) is has yet to be uncovered, she says.
The way this is phrased, it's almost impossible for moms not to take home the message that it's their fault if their kid is fat. Until that underlying environmental factor is pinpointed, we have only the correlation of maternal employment and increasing obesity, about which we can do very little. Leaving fathers out of this equation reflects and exacerbates a bias already present in the culture.
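To make that statistical point concrete, here's a minimal simulation sketch -- all numbers invented for illustration; this is not the study's data or model -- of what happens when both parents' work hours genuinely matter but only the mother's are measured. (Statisticians would file this under omitted-variable bias, a close cousin of the selection problem described above.)

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Hypothetical world: BOTH parents' weekly work hours nudge a child's BMI
# equally (true coefficient 0.1 each), and the two tend to rise together,
# as in dual-earner households.
mom_hours = rng.normal(30, 10, n)
dad_hours = 0.5 * mom_hours + rng.normal(25, 8, n)  # correlated with mom's hours
bmi = 18 + 0.1 * mom_hours + 0.1 * dad_hours + rng.normal(0, 2, n)

# Design A: model both parents (ordinary least squares).
X_both = np.column_stack([mom_hours, dad_hours, np.ones(n)])
coef_both, *_ = np.linalg.lstsq(X_both, bmi, rcond=None)

# Design B: ask only about the mother -- the framing of the study above.
X_mom = np.column_stack([mom_hours, np.ones(n)])
coef_mom, *_ = np.linalg.lstsq(X_mom, bmi, rcond=None)

print(f"Mom effect, both parents modeled: {coef_both[0]:.3f}")  # ~0.10 (the truth)
print(f"Mom effect, mother-only model:    {coef_mom[0]:.3f}")   # ~0.15 (inflated)
```

The mother-only fit dutifully reports an inflated "mom effect" -- not because mothers matter more, but because in this toy world mom's hours are standing in for dad's too. Frame the question around one parent, and the arithmetic obligingly blames that parent.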
Biases exist in every research study, in every discipline. The kinds of questions we ask and how we ask them will reveal those biases if we're aware and thorough enough to examine them honestly. The best we can do is try to control for them and acknowledge them. And people communicating those results to the general public need to be careful about not perpetuating their own biases too. It's not only our heroes who have clay feet; we do too.
So, what do you do when you're getting over the flu, and aren't completely incapacitated in a drugged-out stupor, but also not exactly 100% and therefore unable to be very productive? Such has been my fate the last couple of days. My solution? Marathon viewings of Season 1 of the short-lived TV series Veronica Mars. You remember, the one starring Kristen Bell as the sassy, sarcastic pint-sized blonde love child of Philip Marlowe and Nancy Drew. She was definitely in the same class as Buffy, so it's not surprising that Joss Whedon was a fan. Even his considerable coolness clout couldn't save the show from cancellation after only three seasons. (You can watch some great Season 2 highlights here; embedding has been disabled.)
At least we'll always have the DVDs to remind us of what we once had. I'd forgotten just how whip-smart this series was. Veronica is tough, vulnerable without being whiny (a recurring problem for Buffy, alas), smart, and self-assured, with a PI dad who encourages her natural inquisitiveness, while worrying about her safety. ("I think I'm going to get you one of those big plastic hamster balls, so you can roll around poking your nose into things without getting hurt.") Season 1 had one of the best fight scenes I've ever seen on TV: Veronica's dad, Keith Mars, facing off against the unmasked murderer of her best friend in a very realistic fistfight -- between opponents who really have no idea what they're doing. They swing wildly, tussle blindly, get tired quickly, and there are no flashy Jet Li-style martial arts acrobatics -- yet the stakes couldn't be higher.
As great as the supporting cast is, everything rests on the character of Veronica, who is the perfect role model for any young girl who cares about thinking critically. Keith Mars jokingly tells a colleague that the first words out of Veronica's mouth were, "Their case is fuzzy and circumstantial. I just think the evidence is really weak." (Thanks to commenter Casey for the correction.) That's her superpower: she understands the nature of evidence, the need to build a case. And that makes her quite the little scientific investigator, even if she's more concerned with solving murders, uncovering blackmail schemes, and exposing cheating spouses -- in between high school classes, of course -- than unlocking the secret workings of the universe.
Most important, when the evidence tells her something she doesn't want to hear -- she believes the evidence, no matter how painful the truth may be. Nowhere is this more clear than in a Season 1 episode where Veronica's favorite history teacher is accused of sleeping with and impregnating another student. [WARNING: MANY SPOILERS TO FOLLOW!!!] Veronica sets out to prove his innocence, pitting her against her own father, who has been hired by the victim's parents to prove the teacher's guilt. Keith warns his daughter -- with his vast life experience -- that she's setting herself up for disillusionment; heroes always turn out to have feet of clay. But he doesn't stop her from investigating. He knows she needs to figure some stuff out for herself.
The teacher is so disarmingly cute and charming, so eminently likeable, and speaks so eloquently about his love for teaching and how it's all he's ever wanted to do. His accuser is beautiful and bitchy, with a history of telling lies and spreading gossip, including lies about Veronica. And sure, it turns out that Veronica's instincts were partly right: the girl was lying... on behalf of another girl too ashamed to come forward publicly. But the teacher was guilty as hell. So a good teacher in public can still do bad things in private, and the high school bitch can show some altruism and go to the mat for an innocent victim. And Veronica? She believes the evidence. She admits she was wrong.
A few months ago, I ranted -- as one does sometimes -- about the fact that science has so many fair-weather friends: people who love science and think it's cool, but only when it involves cool gadgetry, quirky stories, or life-saving remedies. The real test is, how do you react when science tells you something you don't want to hear, that challenges your assumptions and prevailing worldview? We need to be more like Veronica Mars. Curious, inquisitive, fearless about ferreting out the truth, even when the truth hurts like hell, rather than staying in safe, cocooned intellectual hamster balls. As Veronica says, "Nobody likes a blonde in a hamster ball." Sometimes the truth hurts. You deal with it.
Three years before he died, paleontologist Stephen Jay Gould, one of my favorite popularizers of science, published a book called Rocks of Ages: Science and Religion in the Fullness of Life, basically consigning the two subjects to different realms of thought and influence. Gould maintained that the two areas of inquiry asked and answered different questions that need not impinge on each other. The reaction was, well, reactionary and not all of it truly thoughtful. I thought at the time that many of the reviewers were hell-bent on developing their own Theory of Everything that reconciled all areas of human thought. Many scientists, skeptics, and atheists (sometimes embodied in one person) roundly denounced the work (and Gould, in the kind of low-blow ad hominem attacks they decry in others, I might add) for daring to try to "legitimize" religion or spirituality in the same breath as science.
This post is not about that.
I mention it because, as was inevitable, the question of whether a scientist can have any kind of religious faith or even just entertain the idea separately from his or her own professional work has now reached the courtroom stage in the case of Dr. C. Martin Gaskell, a University of Nebraska astronomer who was turned down for a post by the University of Kentucky because an internet search of his name revealed he was an evangelical Christian who wasn't shy about writing about astronomy and the Bible. Gaskell has taken pains to point out that he is not a creationist and he does not have any problems with the theory of evolution. A department staff member, Sally A. Shafer,
found links to his notes for a lecture that explores, among other topics, how the Bible could relate to contemporary astronomy. “Clearly this man is complex and likely fascinating to talk with,” Ms. Shafer wrote, “but potentially evangelical.” . . . Francis J. Manion, Dr. Gaskell’s lawyer, said: “I couldn’t have made up a better quote. ‘We like this guy, but he is potentially Jewish’? ‘Potentially Muslim’?”
Put in those terms, this becomes not just an issue of scientific accuracy and honesty, but of censorship and, yes, plain ole bigotry.
The job Gaskell applied for was running the new UKentucky student observatory, which also involves lecturing publicly about science. Keep in mind that UKentucky is not far from the Creation Museum in the heart of the Bible Belt, which may have contributed to their jitters about hiring someone they perceive as working for the other side. But it may be that it's that perception that's the problem. One of the basic rules of discrimination and bigotry is that it lumps large numbers of people together in a single group without regard for individual differences. The terrorists who took down the Twin Towers in 2001 were Muslim; hence all Muslims are terrorists. Stated so baldly, bigotry is laughably simplistic to anyone with the ability to analyze and think for themselves—a trait I would hope scientists and education leaders would possess in abundance.
Now, bear with me for a moment and imagine you are a practicing Christian—not an unquestioning, blindly faithful zombie Christian, but a thoughtful, questioning, testing-your-faith kinda Christian—living near the University of Kentucky where Dr. Gaskell has just been hired, and you see an advertisement for this talk by Dr. Gaskell: "Modern Astronomy, the Bible and Creation." Back in the day, when I was a Jehovah's Witness, I had a really healthy curiosity about Life, the Universe, and Everything and a talk like this would have more than piqued my interest. Quite likely, I'd have trundled off to hear it, possibly dragging one or two others of my equally curious JW friends with me. Here's what I would have heard (PDF) according to his own summary:
I give my responses to some of the questions I am most frequently asked on the subject of the Bible and modern astronomy. I start out by emphasizing that many scientists and philosophers have strong religious beliefs and I give some quotes from famous scientists and philosophers. I list, and briefly discuss, some of the main theological interpretational viewpoints of the creation stories in Genesis. It is explained that there are more than just two extreme views on the origin of the universe and that the majority of scientists who are Christians adhere neither to the view that the Bible is irrelevant to the earth's origin (which exponents of atheistic evolution claim) nor the view that God made the earth essentially as it now is in six 24-hour periods about 6000 years ago (the “young earth creationist” position.) [emphasis mine] The origin of Bishop Ussher's date of creation is explained and the question of “days” in Genesis 1 is discussed. Examples of where modern astronomy is supporting the details of Genesis 1 are described. A list of suggested readings for those who wish to read more about Christianity, the Bible, and some of the scientific issues is appended.
Gaskell goes on to say that, "The main controversy has been between people at the two extremes (young earth creationists and humanistic evolutionists). 'Creationists' attack the science of 'evolutionists.' I believe that this sort of attack is very bad both scientifically and theologically. The 'scientific' explanations offered by 'creationists' are mostly very poor science." "Mostly very poor science," huh? Hmmm. And that would have piqued my interest too. Why is it poor science, I would have wondered? Further investigation would have followed—and did, in a similar situation, leading me to where I am now: skeptic in fact if not by affiliation, and Buddhist fellow-traveller.
Honestly, none of Gaskell's talk sounds Creationist to me. What Gaskell is actually doing is finding common ground with his audience, in this case the Bible, to talk about science, without distorting either. This is something Jennifer does with just about every post she writes, but her common ground is pop culture. And as a former fundy science nerd, I can testify that science history this reasonably presented would have been greeted with interest by any but the most fundamentalist of Christians, who are probably already a write-off. But that's not what happened in Kentucky. There was nothing reasonable about the response in Kentucky. Oh, no. There was, instead, a "rush to judgment."
In recent years there's been more than a hint of the hysterical witch hunt in the voices of some skeptics and scientists crusading (and yes, I use that word intentionally) against creationism and Intelligent Design. Phil Plait, the favorite Bad Astronomer of Cocktail Party Physics, addressed this at a recent TAM meeting in his inimitable way, in a talk called "Don't be a Dick":
Rather than seeing someone like Gaskell as a possible bridge to the reasonable, questioning, curious Christian community (and there is one; I've been part of it), UKentucky freaked out about a possible PR nightmare in hiring someone perceived as a narrow-minded pseudo-scientist. One thing I don't think you can accuse Dr. Gaskell of is being a pseudo-scientist. If you skim his publication lists (he's now at the University of Texas), you'll see he's co-authoring with legitimate scientists in his field, and publishing in all the usual places that "real" astronomers publish in. Not the Discovery Institute, but the American Astronomical Society's journal, and other well-known scientific journals.
Now, call me crazy, but I always thought the purpose of a university was to offer education. It's hard to educate people if you don't speak at least some of their language. Most Christians—most religious people of any stripe—feel that scientists not only don't speak their language, but are only interested in belittling them, not in having a reasonable conversation with them. So even if you have questions, as a religious or spiritually inclined person, who are you supposed to ask, when the scientists will just mock you? As Phil says in his talk, we should be "relying on the merits of the arguments, which is what critical thinking is all about, what evidence-based reasoning is about." Not vitriol. Not bigotry. Not prejudice.
The truth about the history of scientific thought that many modern scientists would like to shove under the rug is that it sprang out of the only educated community in the Middle Ages and Renaissance: church clerics. Before the Age of Enlightenment was the Age of Enlightenment, it was the Age of Faith, and you can take the boy outta the church, but you can't take the church outta the boy. References to God and creation are everywhere in the history of scientific inquiry, even if only used metaphorically. Why not use them, as Gaskell does, as a lever to open the doors of blind faith just a crack, to slip in some scientific fact? It accomplishes more than just telling people they are fools and morons. Skepticism isn't teaching people what to think; it's teaching people how to think. You don't accomplish that by telling them that everything they know is wrong.
I'm glad Gaskell is bringing this issue to court, because it's something the scientific community needs to confront about itself. By tarring all spiritual seekers with the same brush of ignorance, extremists in the secular world in general and the skeptical community in particular reveal their own fear of the Other, the same kind of cheap, petty, ignorant fear that white supremacists, jihadists, and homophobes display. Not nice company to be lumped into, is it? Fear isn't rational, though. And that alone should wake you up, if you're one of those frothing-at-the-mouth skeptic/atheists. Use the rational mind that God gave you, for Pete's sake.
A couple of weeks ago, an editor asked me to name my favorite science book from 2010 for a year-end round-up her magazine was putting together. My incredulous response: "You mean you want me to pick just one?" Because let's face it, 2010 has been a banner year for popular science books. Never mind the unstoppable juggernaut that is The Immortal Life of Henrietta Lacks (and kudos to Rebecca Skloot for bringing science back to the bestseller lists); 2010 also saw Maryn McKenna's Superbug; Deborah Blum's The Poisoner's Handbook; Misha Angrist's Here is a Human Being; Annie Paul's Origins; Jonathan Weiner's Long for This World; and Mary Roach's Packing for Mars. And that's just scratching the surface, based on a quick perusal of my groaning bookshelves. Did I mention the Spousal Unit's From Eternity to Here and my own humble offering, The Calculus Diaries? Consider them mentioned. Heck, Carl Zimmer even ventured into the world of e-publishing with a collection of his Discover columns on neuroscience, aptly titled Brain Cuttings.
The steady stream of science books hasn't stopped, either, so I thought I'd highlight just a few of the new offerings (mostly math and physics related) that came out this fall -- just in case you're looking for the perfect gift for the science enthusiast in the family. (Full disclosure: not only did I receive ARCs of most of the books below, I'm personally acquainted with several of the authors. The exception is Connie Willis; I'm a bona fide fangirl in that case.)
Written in Stone: Evolution, the Fossil Record, and Our Place in Nature, by Brian Switek. I get warm and fuzzy just thinking about this book, since I watched Brian struggle with it in the earliest stages of development. So yeah, there's some personal bias at play. But I'm delighted to say that it came together beautifully -- it's a commendable work by a promising young science writer with a bright future ahead of him. I write primarily about physics and math, so the subject matter of Brian's book -- evidence for evolution in the fossil record -- was largely new to me, making me the ideal reader for this insightful introduction to the topic.
He starts off with a bang, opening with the ruckus raised in May 2009 over the unveiling of the ancient and highly photogenic fossil affectionately known as "Ida" -- wrongly dubbed a "missing link" in the frenzied press coverage that introduced Ida to the world. That was as much the fault of those who discovered her as of the press -- the whole affair was carefully orchestrated for maximum exposure and, frankly, personal profit -- and Brian gives an excellent summation of the events leading to the media circus. (Fortunately for science, the general public probably remembers very little by now, save, "Hey, wasn't Ida that really cool fossil?" And gosh darn it, Ida is still pretty cute.)
But the real significance of Ida -- and the reason Brian chose to open Written in Stone with that story -- has to do with the "missing link" claims, and the public's misperceptions about evolution. The iconic image of evolution is the March of Progress, showing the progression from early primate to modern man -- a notion that Brian rightly points out has its roots in the Renaissance notion of the Great Chain of Being. And while Creationists love to spout off about how ridiculous it is to assume we came from apes, what evolution actually claims is that mankind and apes share a common ancestry. There is a difference between those two statements.
Evolution is far more complicated, and this forms the central thesis of the book. Our journey through the fossil record, and encounters with such fascinating historical figures as Nicholas Steno, the charlatan Albert Koch, and Athanasius Kircher (one of my all-time favorite historical figures), serve to illustrate one basic point: evolution is more of a branching process, often taking many different paths (even if the end result is similar), with one species evolving and another staying largely unchanged -- a constantly shifting dance. It's kind of messy, with progress occurring in fits and starts -- the furthest thing from the idealized March of Progress. I'll let Brian have the last word:
"For to ask 'What makes us human?' assumes that there was some glorious moment, hidden in the past, in which we transcended some boundary and left the ape part of ourselves behind. We forget that those are labels we have created to help organize and understand nature.... There was never an 'ascent of man,' no matter how desperately we might wish for there to be, just as there has not been a 'descent of man' into degeneracy from a noble ancestor. We are merely a shivering twig that is the last vestige of a richer family tree."
Proofiness: The Dark Art of Mathematical Deception, by Charles Seife. As a budding young science writer, I used to hang out with Charles in the press room at American Physical Society meetings, and his classic book, Zero: The Biography of a Dangerous Idea (still in print!), made me realize that the world of numbers could be as fascinating as physics. With Proofiness -- and with that title, why has Charles not yet been on Colbert? Why? -- he tackles the myriad ways our cultural innumeracy blinds us to the many deceptions perpetrated by a misuse of numbers, particularly statistics and probability. There's a lot about election polling and census results, these being hot topics of the day, but even if you're not particularly interested in those, Charles has such an engaging style and wry wit that his prose is bound to draw you in. Also? The cover design is really cool. In this case, you really can judge the quality of the book by its cover.
The Amazing Story of Quantum Mechanics, by Jim Kakalios. The author of The Physics of Superheroes is back with another installment, this time exploring how quantum mechanics changed the world and ushered in a future very different from the one envisioned by the classic comics of the 1950s. We were promised jet packs and flying cars, dammit! And I'm still bitter about the lack of progress on human teleportation. I was struck by a comment Jim made this past summer when we were both on a science panel at CONVergence/Skepchicon in Minneapolis. Someone asked what he thought would be the technological breakthroughs of the next 50 years, and he replied that anything requiring huge breakthroughs in energy would probably not transpire -- but anything related to the explosion in information? Now that would be something capable of transforming the future.
That's kind of the underlying premise of The Amazing Story of Quantum Mechanics: we didn't get jet packs or flying cars, or unlimited supplies of free energy, but we got tons of amazing things we weren't expecting at all. We got atomic bombs, nuclear magnetic resonance (and MRI), lasers, death rays, MP3 and DVD players, spintronics, and the World Wide Web. The book is a fantastic primer on the intricacies of the quantum world, with Kakalios using entertaining examples from -- yes -- classic comic books to illustrate his points. Along the way, we are treated to a broad overview of some of the coolest things quantum mechanics has given us, and a sneak peek at what might be in store.
Massive: The Missing Particle That Sparked the Greatest Hunt in Science, by Ian Sample. Good news for fans of objectivity! I don't know Ian Sample personally! So when I tell you that Massive turns the dry-sounding hunt for the Higgs boson into the equivalent of a scientific detective story that you can't put down, you know it's not coming from a biased perspective. Also? There's only one mention of the dreaded "god particle" -- a nickname coined by Leon Lederman as the title of his co-authored popular book, universally loathed in physics circles, and badly misunderstood by a general public inclined to take it as a claim about spirituality. Of course, the particle has nothing to do with religion, or the existence (or lack thereof) of a god.
In an intriguing side anecdote -- one of many -- Sample writes that Lederman originally wanted to call his book The Goddamned Particle because the thing proved so difficult to find, but the title was shortened to The God Particle. For Lederman, the name is apt because the Higgs (writes Sample) "is critical to our understanding of matter, yet deeply elusive." (More literal-minded sorts miss the subtlety.) That's the kind of vivid detail and backroom chatter that makes Massive such a compelling read: it's about science as it is being done, before all the answers are in -- the Higgs continues to elude us. But for anyone curious about the story of the Higgs so far, you're not likely to find a better book than Sample's on the subject.
How I Killed Pluto and Why It Had It Coming, by Mike Brown. You might know Mike by his Twitter handle, @PlutoKiller (it's an entertaining feed; you should follow him). Clearly, he takes a certain amount of pleasure in his role in demoting this smallest of planets -- make that former planet -- even though it means he gets a steady stream of hate mail and a surprising number of obscene phone calls. People have an unusually strong passion for Pluto. But Brown didn't actually set out to cause such a ruckus; he was just going about his business, hunting planets, and what he found was Eris, briefly touted as a "10th planet" before astronomers decided it didn't really meet the criteria -- and if Eris didn't qualify, neither did poor Pluto, or any of the large number of similar objects that have come to light in recent years.
Like Sample's Massive, Brown's book gives us that rare glimpse behind the curtain, a peek at how science is actually done. The guy can spin a yarn, that's for sure, and he's got some great material, and a great sense of humor (and perspective!). Even those who champion Pluto's eventual return to planetary status -- yes, the debate rages on -- will find it pretty difficult to continue hating Brown after reading this book; he's just too damned likeable. As James Kennedy wrote in his Wall Street Journal review, Brown's book presents "the scientist neither as madman nor mystic, but mensch."
Blackout and All Clear, by Connie Willis. Finally, what holiday book list would be complete without a spot of science fiction, specifically of the time travel/chaos theory variety? This is a sprawling, two-book epic, mostly set in London during World War II, when residents endured rationing and nightly air raids at the height of the Blitz, yet still managed to carry on some semblance of a normal life -- unsung heroes, every one, and Willis brings them vividly to life. I've been a fan of Willis' work since I first read Doomsday Book many years ago. It was the first novel set in her futuristic world of time-traveling historians, following the invention of something called "The Net."
Any lover of history has fantasized about what it would be like to actually visit past eras, and in this world, historians can do just that. But there are rules; most notably, the historians can't affect the course of events -- or, as Lost's doomed physicist, Daniel Faraday, phrased it, "Whatever happened, happened." The spacetime continuum has a number of ways of protecting itself from such an occurrence, including something called "slippage": the Net won't send a historian to a time and place where s/he could affect the outcome, and will basically override the programming, sending the historian to the nearest time and place where s/he can have no impact. Oh, and you also can't take objects from the past through the Net into the future -- unless they were destroyed in the past, a twist in Willis' fictional rules that showed up in the second novel set in this world, To Say Nothing of the Dog.
Blackout and All Clear give us another twist on Willis' rules of time travel, and it's a doozy: the slippage factor is getting progressively worse, and seems to be centered on the critical events in World War II London between 1940 and 1944. Temporal physicists are beginning to worry that perhaps their assumptions about time travel have been wrong, and that it is possible to affect the course of historical events -- something that would be disastrous for a period like this one, when the outcome of the war balanced on a knife's edge at several junctures over those four years. Could one of their historians inadvertently have altered the outcome of World War II? When four historians find themselves trapped in the past, everyone's worst fears appear to be realized. And that's as much as I can say without spoiling the fun. Like all Willis' novels, there is humor, pathos, and gut-wrenching suspense, and at some point she will break your heart. There are a lot of disparate threads in these two books (actually one book split into two), but Willis is a master weaver and pulls it all together in the end.
Of course, even if we can't change the past, who can say what might happen if historical figures showed up in the future...
Occasional co-blogger Allyson is being far too modest about some very big news: she's the proud author of a new science-y children's book from Conservatory Press, The Amazing Adventures of Sam the Bat. It relates the inspiring tale of a young free-tail bat who gets separated from his home colony and must find his way home -- a journey that takes him from South American rain forests, to a five-star hotel in London, and even to Notre Dame in Paris. (This is actually Allyson's second book, but her first stab at fiction. Her debut was a collection of essays ruminating on the online culture of fandom with her trademark caustic wit: Will the Vampire People Please Leave the Lobby?)
I've watched Allyson sweat over this book for more than a year: researching bat science and trying to convey those details accurately, while still telling a gripping story that will fire kids' imaginations. It's not an easy feat. Sure, I'm biased, but I think she's pulled it off with flying colors. And Those Who Blurbed agree with me:
"Sam the Bat is a delightful book that manages at once to teach children about a fascinating and greatly misunderstood species, while holding them under the spell of a touching -- and often very funny -- story with an appealing hero. I'm sorry I didn't get to read it to my own children." -- Peter S. Beagle, author of The Last Unicorn and Mirror Kingdoms
"Sam's story is a great introduction to the lives of bats around the world, and is a thrilling read. It's a great challenge to imagine life through the eyes of a bat (I've tried), and Allyson Beatrice does so beautifully. Through Sam, Beatrice explores the kindness of strangers, the importance of friends, and the value of family. I would recommend this book to any young person curious about the world and the animals living in it." -- Daniel K. Riskin, assistant professor of biology at the City University of New York, and from Animal Planet's Monsters Inside Me and Discovery Channel's Curiosity
So, in honor of this momentous occasion, I'd like to offer a few batty links for your reading pleasure. Allyson's not the only Batgirl at the cocktail party, in fact -- I've blogged about the acoustics of echolocation, and how that basic science can feed into helping develop prosthetic devices for restoring some semblance of sight to the blind. (I also follow @God_Damn_Batman on Twitter, just for laughs.) Know who else loves bats? Ed Yong of Not Exactly Rocket Science. Here's a sampling:
Ninja Bat Whispers To Sneak Up on Moths. "Holger Goerlitz from the University of Bristol has found that the barbastelle bat is a stealth killer that specialises in eating moths with ears. Its echolocation calls are 10 to 100 times quieter than those of other moth-hunting bats and these whispers allow it to sneak up on its prey."
Bats, Compasses, Tongues, and Memories. "If you were a biologist looking for astounding innovations in nature, you could do much worse than to study bats. They are like showcases of nature’s ingenuity, possessing a massive variety of incredible adaptations that allow them to exploit the skies of the night."
Then there's today's hot bat-related news in the blogosphere: How Bats Find Water and Why Metal Confuses Them. "Waves of sound that hit the surface of still water would generally bounce away, except for those aimed straight downwards. Stefan Greif and Björn Siemers from the Max Planck Institute for Ornithology have found that bats are instinctively tuned to find water using this unique feature." (I love the Max Planck Institute for devoting an entire institute to the study of bats.)
And who could forget this classic news story (with exciting NSFW bat porn video footage!) about certain batty practices that would appall Christine O'Donnell: Holy Fellatio, Batman! Fruit Bats Use Oral Sex To Prolong Actual Sex (no need for an excerpt; it's self-explanatory). We're sure Allyson's innocent little bat friend, Sam, would never, ever, engage in such behavior. *cough* He's a free-tail bat, not one of those pervy fruit bats!
Ed has also written about how Caribbean fruit bats are kind of a mash-up of three different bat species; how wind turbines pose a threat to bats; and how bats share evolutionary common ground with whales when it comes to echolocation. Really, if you Google "science of bats," you can't help but come across something written by Ed on the topic. (Let's start a rumor! Ed Yong: Bat Fetishist.)
But he's not the only one. Here's a lovely rumination on the bat as "Life In Motion" from The Loom's Carl Zimmer. The Featured Creature recently posted OMG-Adorbz! pix of fluffy Honduran baby white bats. Over at PLOS Blogs, Brandon Keim warns about what would happen in a world without bats. And even the San Francisco Chronicle got into the game with a recent article, "Dispelling Flights of Fancy About Bats." So celebrate the release of The Amazing Adventures of Sam The Bat by boning up on a little bat science! (And feel free to order Allyson's book when you're done!)
Sometimes one makes spur-of-the-moment media decisions that do not, in the end, prove worthwhile. A couple of days ago I received the following email from one of the producers of the Mancow Radio Show:
The Mancow Radio Show invites you to join Mancow on the air for a brief guest phone interview to promote your article "Big Game Theory" in Discover Magazine, discuss physicists and poker, and promote your additional work. The Mancow Show is based out of Chicago and nationally syndicated to millions of listeners, with high ratings in top markets across the nation. Mancow is a passionate, opinionated, culture-engaging, and patriotic truth-seeker who thrives on quick-witted conversation with intriguing guests.
Sounds pretty good, right? I love quick-witted bantering, and am happy to do my part to promote Discover and similar publications (including my own books, natch). Now, granted, the reality is that Mancow is pretty much your average right-wing shock jock. The wit is limited to cheap shots about "National Politically Correct Radio", how annoying wives are, and the occasional off-color gag. In between is the usual ranting about "real" America, pulling yourself up by your own bootstraps, how homeless people choose to be that way because they're all drug addicts, and trumpeting one's right to shout as loudly as possible to drown out any potentially opposing views -- and maybe stomp on a few dissenters' heads just for larks. Because, you know, free speech is only for those who agree with you. But what the heck, I was still game -- Mancow's just playing to his core audience. That doesn't mean we can't have a good exchange. It's poker, after all. Everyone loves poker: Republicans, Democrats, Libertarians, and moderate Independents like me. How could that be politically charged? It's always possible to find common ground when everyone's acting in good faith.
Sadly, not everyone acts in good faith. My expectations were pretty low going in, and Mancow didn't even meet that lowest of bars. Here's the gist of our "conversation" this morning, which lasted maybe all of one minute:
Mancow: Hey there, so what's this Discover magazine thing? Related to the credit card company? I subscribe to lots of magazines, never heard of it.
Me: Oh, well, it's a science magazine. You can find it in lots of airports.
Mancow: Uh-huh. So what's this Big Game Theory story??
Me: (gamely attempting to be chipper and upbeat at 6:30 AM) It's about poker-playing physicists! Turns out there's quite a lot of them who are finding that poker is a challenging, intriguing game.
Mancow: So they're, like, card-counting and stuff to make money off the casinos....
Me: Actually, no, poker is more about the probabilities, game theory, human psychology, and physicists find that --
Mancow (interrupting): Oh, guess I'm thinking about blackjack. I'm a blackjack man. That's a MAN'S game. <click>
And he cut me off. Yep, I got the Bill O'Reilly treatment for the crime of Not Fitting the Right-Wing Narrative. Over poker. I have no idea what Mancow thought this was supposed to be about -- he mentioned card-counting, so maybe he was thinking of those MIT geniuses a few years ago who tried to rip off casinos while playing blackjack? Maybe he was annoyed because I didn't play along and kept earnestly assuming this was a real interview (my bad)? It doesn't matter, because the import of his final comment was clear: those high-falutin', pointy-headed intellectual elites, with their fancy math and their strategy and their stupid human psychology and their knowing-what-the-fuck-they're-talking-about -- they are Not Real Men (TM). (Corollary: Real Men (TM) do not read Discover magazine.) At least he didn't try to make lame sex jokes like he did with the previous (male) caller.
Look, it's not like Mancow ruined my day; he just wasted my time, and not very much of it. So I just shrugged and went back to bed to grab an extra hour of sleep. But it was one more encounter with rampant anti-intellectualism that's been popping up in my life over the last few weeks, starting with a long article in the Guardian about "Miracles," the controversial music video by Insane Clown Posse. (I wonder if Real Men (TM) listen to Insane Clown Posse. Can they read the Guardian?) That's the one with the classic lines,
"Fucking magnets/How do they work?"
And I don't wanna talk to a scientist
Y'all motherfuckers lying and
Getting me pissed
Never mind that anyone with access to the Internet and the ability to type keywords into Google can easily discover how magnets work. Or gravity. Or that electromagnetism -- and our ability not only to understand it, but to control it -- is why Insane Clown Posse can make a music video and release it on the Internet in the first place, and why Mancow has a career in radio broadcasting. Maybe what they should be saying is "Thank you, Science, for creating our jobs!"
Understandably, most people with a lick of common sense found these sentiments ridiculous, and weren't shy about saying so. Saturday Night Live even lampooned Insane Clown Posse with this:
The skit makes the point better than I could in 10,000 words. ("Isn't a volcano just an angry hill?" Um, no.) Much has been made of the duo's anti-science rants in the wake of Jon Ronson's entertaining Guardian article -- not to mention the fact that they claim to be evangelical Christians. (Remember in the Gospels, how Jesus went around cussing and inciting his followers to violence, just so he could win their trust and then BAM! bring them to the Lord? Actually, I'd wager their faith runs about as deep as their lyrics.) Apparently, Shaggy 2 Dope and Violent J are very sensitive, prone to depression, and all this mocking has hurt their delicate artistic feelings. They are Misunderstood. See, they thought they were being deep and philosophical and stuff when they wrote "Miracles." Gravity and magnets are cool and all, but don't mess up their sense of wonder with actual understanding!
I've got news for the Insane Clown Posse. All this, "Whoa, dude, check out that giraffe! Aren't the stars awesome?" nonsense? That's not deep. Or philosophical. And it's definitely not wonder. That's -- well, frankly, that's called "being high." ("Fuckin' Ecstasy/How does that work?")
Real wonder is something quite different; it can't be diminished or killed by science; it is only enhanced by science. It reminds me of one of my favorite episodes of House, where Dr. House is debating a magician who has refused to tell House how he achieved a certain trick. "If I tell you, then you'll lose the actual magic," the magician protests. House replies, "Magic is cool. Actual magic is an oxymoron. Possibly only moron." Later in the episode, the magician explains that people come to his shows to feel a sense of mystery and wonder, and knowing how a trick is done would spoil that sense of wonder. It's the same basic message as Insane Clown Posse -- just more articulately phrased. House's rejoinder should become the mantra of everyone who values rationality and critical thinking: "If the wonder is gone when the truth is known, then there never was any wonder." When I learn about the underlying science of something, it makes me wonder more -- not less -- and deepens my appreciation for the beauty and mystery of the world.
Shaggy and J can take comfort in the fact that they're clearly not alone in their sentiments. The anti-vaccination crowd is still misleading well-intentioned parents trying to do the right thing by their kids, urging them to trust their "gut instincts" rather than medical science. The wackjobs at Conservapedia are working hard to prevent right-leaning folks from encountering any uncomfortable "facts" by setting forth their own (laughably ignorant) take on Einstein's special and general relativity -- apparently part of some socialist Obama-genda (snort) -- and bashing the prestigious Fields Medal in mathematics because sometimes the recipients vote Democratic, or hail from communist countries, and therefore must be punished for their blasphemy. New York magazine just ran an article making fun of the titles of various math courses, clearly demonstrating its writers didn't know the first thing about what those courses actually entailed (hint: topology is hardly "math for jocks"). It all adds up to a very vocal minority, fueled by incoherent rage, with the potential to do a great deal of damage -- particularly when it comes to things like vaccines.
It's really been getting on my nerves lately. We had dinner last night with our friend Carol Tavris, a psychologist and author (The Mismeasure of Woman, Mistakes Were Made (But Not by Me)) who writes about denialism, cognitive dissonance, confirmation bias, memory, decision-making, and all the other oh-so-many ways we trick ourselves into reinforcing our most cherished beliefs, because it's just too damned uncomfortable to admit otherwise. She's one of my favorite people: she's tough, scrupulously honest, and has the sharpest bullshit-detector I've yet encountered. Carol will never tell you the comfortable lie you most want to hear -- but she will be compassionate when dealing out the painful truth. I told her how, several months ago, an interviewer asked me why surveys showed that "scientist" is one of the most trusted professions, and yet such strong anti-science rhetoric is sweeping the mainstream media. I didn't have a good answer at the time, but now I do. It's this:
Science has a lot of fair-weather friends. People love science so long as it's wowing them with cool, nifty insights or bringing awesome new gadgets and technology to market. But sooner or later, science -- by its very nature -- is going to tell you something you don't want to hear. It's going to challenge an easy assumption, or a deeply held belief. It's going to make you question the personal reality you've so carefully constructed up to that point. And that's the acid test. That's when you find out if you truly love science, if you're a genuine seeker of Truth and Wonder, or just someone who's content with the cheap ersatz substitutes. You can choose Option A: recoil against the truth and shoot the messenger, metaphorically speaking, by demonizing him or her and everything science stands for. Or you can choose Option B: grudgingly accept that you might be wrong and, if the facts hold up, change your beliefs and behavior accordingly.
Sadly, the vast majority of people choose Option A: it's easier, less work, less discomfiting to never challenge one's assumptions -- and therefore never change. Change is scary. Change is hard. And our brains are hard-wired against accepting new viewpoints once we have a "framework" in place. We live in an amazing era when information is available to just about everyone -- much of it free -- and there's simply no reason to remain willfully ignorant of well-established facts (things that are a matter of opinion are a different beast altogether). Do we take advantage of that tremendous gift? Or do we only seek out sites and facts that reinforce our biases? Do we twist what we read and hear to fit our pre-existing framework, and attack any source that contradicts it? If we're human, our default mechanism is bound to be Option A, unless we're very self-aware and work to counter those natural instincts.
And no, scientists are not immune to this. However, the best means of combating it is, in fact, training in science, logic, and critical thinking. The process of science is carefully structured to account for human bias and remove it from the equation, so to speak. Lots of physicists hated the implications of quantum mechanics when it was first proposed in the early 20th century. But when experiment after experiment confirmed the theory, they had no choice, as scientists, but to accept that yes, at the subatomic level, this really is how Nature works. Because they did so, you are now reading this on a computer monitor or electronic gadget, made possible by quantum mechanics. In the same way, you have to work at being self-aware, conscious of your biases. I work really hard at it, and I'm only partially successful. But at least I'm trying. I'm not taking the path of least resistance. And I work equally hard to share my journey with others, in hopes of inspiring them to do the same. So yes -- it's sometimes depressing to see and hear that cacophony of willful ignorance, day after day. It led to the following exchange with Carol:
Me: So, is there any hope?
Carol: No. (seeing my chagrin) But doesn't that make you feel better? You don't have to keep beating your head against the wall!
See? I told you she doesn't mince words. It didn't really make me feel better -- but she told me the truth. Then I came home and found a couple of lovely emails, from complete strangers, who'd read my work and been inspired to learn more about math and science -- two more candles to offset the darkness of willful ignorance. Thanks to those people for choosing real wonder, and for making me realize my efforts aren't for naught. Sometimes, when you bang your head against the wall, you achieve the occasional tiny crack. It's not much -- but it's enough.
ADDENDUM: Carol admits that she was half-teasing me with the answer "no", and assures me there is always hope, although the challenge is indeed a daunting one: "As Stephen Jay Gould said to me when I asked him a similar question: Don't you ever feel like Sisyphus? Steve: Yes. But how much further down that mountain we would be if we didn't keep pushing the rock upwards."
Exciting news broke earlier this week, at least for fans of Discovery Channel's Mythbusters (and oh yes, Jen-Luc Piquant is a mega-fan!). President Obama will make a special appearance on the December 8 episode of the series -- part of the administration's ongoing efforts to promote science, technology, engineering, and math (STEM) education, starting with the Educate to Innovate campaign launched in 2009. In fact, he made the announcement during the first-ever White House Science Fair. We are currently taking bets on how quickly Faux News and its noisy acolytes will start braying about how all this "promoting science to a broader audience" is really just a commie/Socialist plot to forcibly redistribute the wealth -- er, knowledge -- to the undeserving Ignorant. Or Muslims.
No doubt adding fuel to the fire, the episode in question will revisit the "myth" of the Archimedes "death ray." For those who don't recall the story, the brilliant Greek mathematician Archimedes of Syracuse was also known for building ingenious weapons of war to defend Syracuse from the invading Roman army. There was, for instance, a giant crane capable of capsizing ships, known as the Claw of Archimedes. Another such invention, legend has it, was a large parabolic array of mirrors capable of collecting and focusing the sun's rays onto the Roman ships laying siege to the city from the harbor. The heat caused the ships to catch fire and burn, and Syracuse was saved -- for a while, at least. (Eventually the Romans overcame the city's defenses, and Archimedes was killed in the ensuing chaos. His death set mathematics in Western Europe back a good 700 years, at least.)
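For the mathematically inclined, here's a quick sketch of the geometry behind the legend (the notation is mine, not anything handed down from Syracuse). A parabola with focal length f can be written as

\[
y = \frac{x^{2}}{4f},
\]

and its defining optical trick is that any ray arriving parallel to its axis reflects straight through the focal point (0, f). Every mirror in a parabolic array therefore delivers its captured sunlight to the same spot -- precisely what you want in a death ray.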
Long before the Mythbusters appeared on TV, folks were trying to ascertain the validity of that legend. Skulls in the Stars has a classic post detailing the history of attempts to test similar devices, usually with mixed results. Back in the 18th century, the noted naturalist the Comte de Buffon (who also devised the "Buffon's Needle" puzzle) assembled an array of 40 ordinary mirrors and managed to set a log of tarred beechwood on fire from a distance of 66 feet. The more mirrors he used, the more effective the technique was at setting fires from a distance: with 128 mirrors, he could ignite a plank of tarred fir from 150 feet.
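Since Buffon's Needle came up in passing, a quick digression: it's the classic puzzle where you drop a needle onto a floor ruled with parallel lines and tally how often it crosses one. For a needle of length L no longer than the line spacing t, the crossing probability works out to 2L/(pi t), so you can run the logic backwards and use needle drops to estimate pi. Here's a minimal Python sketch of the simulation (the function name and default drop count are my own, purely for illustration):

import math
import random

def buffon_pi(n_drops=200_000, needle=1.0, spacing=1.0):
    """Estimate pi via Buffon's Needle (assumes needle <= spacing)."""
    crossings = 0
    for _ in range(n_drops):
        d = random.uniform(0, spacing / 2)       # needle center to nearest line
        theta = random.uniform(0, math.pi / 2)   # acute angle with the lines
        if (needle / 2) * math.sin(theta) >= d:  # the needle reaches the line
            crossings += 1
    # P(cross) = 2 * needle / (pi * spacing); solve that for pi
    return (2 * needle * n_drops) / (spacing * crossings)

print(buffon_pi())  # lands somewhere in the neighborhood of 3.14

It converges famously slowly -- quadrupling the drops only halves the typical error -- which makes it a lovely parlor trick rather than a practical way to compute pi.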
There was also an article by one John Scott in the late 19th century, published in the Proceedings of the Royal Society of Edinburgh, assessing the evidence for and against the effectiveness of an Archimedes "death ray." Scott was more skeptical than Buffon, noting that the historical accounts written closest to Archimedes' time -- Livy and Plutarch among them -- say nothing of burning mirrors. But it's a fun story, and I used it to talk about parabolas and finding the area under a curve in the first chapter of The Calculus Diaries. Some legends are worth repeating, whether or not they turn out to be true.
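And since I brought up that chapter: the flavor of calculation it plays with is finding the area under a parabola. For instance,

\[
\int_{0}^{a} x^{2}\,dx = \frac{a^{3}}{3},
\]

a result Archimedes essentially anticipated two millennia before calculus proper, proving by his method of exhaustion that a parabolic segment encloses exactly 4/3 the area of its largest inscribed triangle. (The example here is mine; you'll have to read the book for the fun versions.)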
The Mythbusters have already tackled this challenge twice: once on their own, and a second time with the help of a team of scientists from MIT. Conclusion: it's most likely a myth, although in principle it's feasible. The MIT experiment managed to start a small fire on a wooden ship, although it quickly burned out. Considering the time it would take to set fire to a ship using such a technique -- think of how long it took when you, as a kid, tried to set a piece of paper on fire with a magnifying glass on a hot summer day -- flaming arrows would probably be more efficient. But I guess President Obama asked them to revisit the challenge. Why? Who knows? Maybe he's keen on getting a nifty death ray for the White House. (Cue mass hysteria from the paranoid fringe!)
Personally, I'd like to see the Mythbusters tackle a related challenge: the purported "death ray" that strikes poolside at the newly built Vdara Hotel in Las Vegas. It's part of the CityCenter complex, and the building has a distinctive parabola-like shape. Therein lies the problem. According to recent news reports, a vacationing lawyer was relaxing poolside at the Vdara when he started to feel very warm, and then smelled something burning. It was... his HAIR! HIS HAIR WAS SMOKING! He jumped up from his seat and doused his head in the pool, then repaired to the bar for a stiff drink. The bartender nodded knowingly when he described his plight: "Yeah, we call that the Death Ray." The Las Vegas Review-Journal published this helpful schematic to illustrate the principles at work:
I guess there was a reason no one was sitting in what would otherwise be a prime poolside seat. When the lawyer went back to retrieve his newspaper, he found the plastic in which it was wrapped had melted.
But I'm just a tad bit skeptical. I mean, check out this photo of the alleged newspaper:
It clearly spells out the word "VDARA." I smell a hoax. How did the sun's rays manage to carve out just those letters? Was there some "stencil effect" at work -- say, via one of the curved windows? Inquiring minds need to know! And the Mythbusters are known for their inquiring minds and ingenious experiments. We eagerly await their findings.
UPDATE: Several commenters -- thanks, guys! Knew I could count on you! -- pointed out that the plastic bag itself was stenciled with black letters; the black absorbs the sun's heat faster, and hence the bag melted in just that pattern. Science! Also? We have been called "fluffy" in the comments section. We consider this a compliment. Jen-Luc Piquant humbly suggests that if you're looking for a detailed explication of this effect, with fancy diagrams and equations and all, a blog that proudly calls itself Cocktail Party Physics probably isn't your best bet. Do that sort of thing at a cocktail party and you'll soon find yourself alone in a corner, doodling on a napkin and talking to the catering staff (who are paid to be there and already bored), while the other guests avoid you like the plague. Just sayin'. (Except if it's a cocktail party with physicists, in which case it's good form to provide a white board.) However, you can find just that sort of thing over at the most excellent blog Dot Physics (formerly of SEED Science Blogs, now housed at Wired), and we thank said commenter for the link. Check it out!
The perfect pick-me-up when gravity gets you down.
2 oz Tequila
2 oz Triple sec
2 oz Rose's sweetened lime juice
7-Up or Sprite
Mix tequila, triple sec and lime juice in a shaker and pour into a margarita glass. (Salted rim and ice are optional.) Top off with 7-Up/Sprite and let the weight of the world lift off your shoulders.
Listening to the Drums of Feynman
The perfect nightcap after a long day struggling with QED equations.
1 oz dark rum
1/2 oz light rum
1 oz Tia Maria
2 oz light cream
Crushed ice
1/8 tsp ground nutmeg
In a shaker half-filled with ice, combine the dark and light rum, Tia Maria, and cream. Shake well. Strain into an old fashioned glass almost filled with crushed ice. Dust with the nutmeg, and serve. Bongos optional.
Combustible Edison
Electrify your friends with amazing pyrotechnics!
2 oz brandy
1 oz Campari
1 oz fresh lemon juice
Combine Campari and lemon juice in shaker filled with cracked ice. Shake and strain into chilled cocktail glass. Heat brandy in chafing dish, then ignite and pour into glass. Cocktail Go BOOM! Plus, Fire = Pretty!
Hiroshima Bomber
Dr. Strangelove's drink of choice.
3/4 shot glass Triple sec
1/4 oz Bailey's Irish Cream
2-3 drops Grenadine
Fill shot glass 3/4 with Triple Sec. Layer Bailey's on top. Drop Grenadine in center of shot; it should billow up like a mushroom cloud. Remember to "duck and cover."
Mad Scientist
Any mad scientist will tell you that flames make drinking more fun. What good is science if no one gets hurt?
1 oz Midori melon liqueur
1-1/2 oz sour mix
1 splash soda water
151 proof rum
Mix melon liqueur, sour mix and soda water with ice in shaker. Shake and strain into martini glass. Top with rum and ignite. Try to take over the world.
Laser Beam
Warning: may result in amplified stimulated emission.
1 oz Southern Comfort
1/2 oz Amaretto
1/2 oz sloe gin
1/2 oz vodka
1/2 oz Triple sec
7 oz orange juice
Combine all liquor in a full glass of ice. Shake well. Garnish with orange and cherry. Serve to attractive target of choice.
Quantum Theory
Guaranteed to collapse your wave function:
3/4 oz Rum
1/2 oz Strega
1/4 oz Grand Marnier
2 oz Pineapple juice
Sweet and sour mix (to fill)
Pour rum, Strega, and Grand Marnier into a collins glass. Add pineapple juice and fill with sweet and sour mix. Sip until all the day's superposed states disappear.
The Black Hole
So called because after one of these, you have already passed the event horizon of inebriation.
1 oz. Kahlua
1 oz. vodka
.5 oz. Cointreau or Triple Sec
.5 oz. dark rum
.5 oz. Amaretto
Pour into an old-fashioned glass over (scant) ice. Stir gently. Watch time slow.