A fictionalized Albert Einstein (portrayed by the late Walter Matthau) plays mischievous matchmaker between his egghead niece, Catherine Boyd, and a good-hearted auto mechanic named Ed Walters in the charming 1994 romantic comedy, *I.Q.* Some might object to the considerable liberties taken with historical fact and illustrious personages. After all, a key plot point turns on Einstein essentially committing a form of academic fraud by passing off his own work as Ed's to help the latter impress his niece. But there's a lot to admire in the film, if for no other reason than its inclusion of Einstein's real-life cronies, Kurt Gödel, Boris Podolsky, and Nathan Liebknecht, as supporting characters. And how could you not love this exchange, as Ed introduces Einstein to Frank, one of his co-workers at the garage: "This is Albert Einstein, the smartest man in the world!" Frank shakes hands and intones, in his best Joisey accent, "Hey, how they hangin'?"

This being a mainstream movie, there's very little actual science portrayed, despite the predominance of scientists in the cast. But there is a lovely little scene in a diner, where Catherine tries to explain to Ed the gist of Zeno's Paradox. She explains it thus: if she takes one step forward, and then halves the distance traveled with her next step, then halves it again, and so forth, such that the progression goes on for infinity, she will never be able to reach Ed. The distance between them will get smaller and smaller, but will never reach zero. The astute viewer will note that the subtext here is Catherine's belief that there is no way to bridge the gap between the couple's intellectual differences and social status. Which makes it all the more refreshing for us diehard romantics when the practical-minded Ed simply steps over the imaginary line to close the gap: "So how did I do that?" A confused Catherine stammers, "I... I don't know." But if she knows her calculus (and she should), the "mystery" should be easy to solve.

I've encountered numerous versions and variations on Zeno's Paradox over the years, but -- and it pains me to admit this publicly -- I did not realize it was tied to the essence of calculus. If I get nothing else out of my fledgling experiment in self-instruction regarding calculus, at least I have learned that much. For those who haven't encountered Zeno before, he was a Greek philosopher living in the 5th century B.C., who thought a great deal about motion. Some might argue he thought a bit __too__ hard about motion; the guy was always playing devil's advocate, even with his own arguments, which is how he arrived at his eponymous paradox. He used an arrow flying through the air towards a target to illustrate his points, rather than a young couple in a diner, but the basic idea is the same: to reach the target, the arrow must first cover half the distance, then half the remaining distance, and so on, moving an infinite number of times. By that logic, the distance between the arrow and the target would keep getting smaller and smaller, and yet the arrow could never close the gap completely in order to actually __reach__ the target.
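Zeno's halving series is easy to poke at numerically. Here's a minimal sketch in Python (not from the film or the lectures; the function name is mine) showing the partial sums creeping ever closer to the full distance without any finite number of steps actually reaching it:

```python
def partial_sum(n_steps):
    """Sum the first n_steps terms of Zeno's series: 1/2 + 1/4 + 1/8 + ..."""
    return sum(0.5 ** k for k in range(1, n_steps + 1))

# Each step covers half the remaining distance, so the arrow (or Catherine)
# gets closer and closer but never lands in any finite number of steps.
for n in (1, 2, 5, 10, 20):
    print(n, partial_sum(n))  # 0.5, 0.75, 0.96875, ...
```

The sums approach 1 (the whole distance) as a limit, and calculus is what makes that hand-wave precise -- which is how the arrow gets to hit the target after all.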

There's an equally paradoxical corollary: at any given moment in time, the arrow has a specific fixed position -- it can only be in one place at any given instant -- which means it is technically "at rest" (not moving) at that particular instant, even though, taken all together, those individual points add up to an arrow in motion. Motion, after all, is basically a measure of how an object's position changes over time. But break motion down into infinitely small increments, and you find yourself trying to determine how far the arrow traveled in zero time: instantaneous motion. Ergo, the paradox. Zeno stumbled on a new way of looking at the world. Alas, it would take a couple of millennia before mathematics developed sufficiently to make sense of his logical conundrum. (For starters, mathematicians needed the concept of zero.)

Calculus has a rather formidable reputation. I have always been among those non-mathematical sorts who viewed it with intense trepidation and preferred to keep a safe distance, despite my love for all that science-y stuff. But according to my new virtual instructor, Michael Starbird (University of Texas, Austin), the entire discipline is encapsulated in two fundamental ideas: (1) the derivative, which is a way of measuring instantaneous change (such as finding speed from position); and (2) the integral, which describes the accumulation of tiny pieces that add up to a whole (and can be used, for instance, to determine the distance traveled based on speed). Everything else involved in calculus is just variations on these two themes.
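Starbird's two big ideas can be made concrete with a toy numerical sketch (my own illustrative code, not from the course): a finite difference stands in for the derivative, and a Riemann sum for the integral, using a hypothetical car whose position at time t is t²:

```python
def derivative(f, t, h=1e-6):
    # central-difference approximation to the instantaneous rate of change
    return (f(t + h) - f(t - h)) / (2 * h)

def integral(f, a, b, n=100_000):
    # Riemann sum: accumulate n tiny slices, each of width dt
    dt = (b - a) / n
    return sum(f(a + (i + 0.5) * dt) * dt for i in range(n))

def position(t):
    return t ** 2  # position of our hypothetical car at time t

def speed(t):
    return derivative(position, t)  # speed recovered from position

print(speed(3.0))                 # ≈ 6.0
print(integral(speed, 0.0, 3.0))  # ≈ 9.0: distance recovered from speed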

Naturally, any physicist reading this post already knows this stuff. But it wasn't always the case. Borrowing one of the central concepts of the discipline itself, one could argue that calculus was invented via tiny infinitesimal bits of accrued knowledge that, taken together, added up to a whole. (Isn't metaphor a marvelous thing?) The roots of calculus are usually traced back to ancient Greece and the Pythagorean theorem, a century or so before Zeno. A century __after__ Zeno, Eudoxus developed the "method of exhaustion," which determines the area or volume of a region by breaking it into ever-smaller shapes; around 225 B.C., Archimedes adapted the method to find the areas and volumes of a wide range of geometric objects.

A lot of mathematicians over the subsequent centuries -- in lots of different geographic regions, including India and Japan -- made vital contributions, and a handful "almost" invented calculus, most notably Pierre de Fermat in 1629. But ultimately, the credit for inventing calculus is given jointly to Isaac Newton and Gottfried Wilhelm Leibniz, who independently made their revolutionary discoveries in the 1660s and 1670s.

Newton hardly needs an introduction, being almost universally recognized as the father of modern physics via his work on gravity and the laws of motion (detailed in the massive *Principia*) and the nature of light (*Opticks*). Around 1666, Newton worked out his so-called "theory of fluxions." It's early days yet in my remedial calculus DVD course, and I have yet to find a good layperson's explanation of what Newton's theory entailed, but it seems to correlate pretty well with the notion of differentials. At any rate, it's generally agreed that Newton was the first to state the fundamental theorem of calculus, and the first to apply derivatives and integrals in a single work (although he didn't use those terms).

Leibniz is less well-known to non-scientists. He was born in Germany in 1646, and since his father died when he was 6, was largely raised by his mother. He taught himself Latin and Greek so he could read the great works of Aristotle and other philosophers. He entered the University of Leipzig at age 15, and left two years later with his degree in law (he eventually earned a doctorate in law).

A chance meeting with Christopher Huygens ignited Leibniz's interest in the study of geometry and the mathematics of motion; he described their meeting as "opening a whole new world" to him. He pursued these interests in his spare time, inventing (in 1671, well before Charles Babbage and his Difference Engines) a handy little machine called the step reckoner. A forerunner of the modern calculator, the device could add, subtract, multiply, divide, and even extract square roots. His reasoning: "It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could be safely relegated to anyone else if machines were used."

But it was the problem of motion that most intrigued Leibniz, who published the first account of differential calculus in 1684, followed by a discussion of integral calculus two years later. Newton's work on the subject didn't appear in print until 1687. His procrastination led to one of the most bitter controversies in scientific history. Under present-day conventions, Leibniz would have received credit for the discovery simply because he published first, but at the time, Newton was by far the more famous scientist and a prominent member of the Royal Society. And he wasn't above using his considerable influence to crush the scientific competition. In addition to Leibniz, he battled acrimoniously with John Flamsteed, with Huygens, and with Robert Hooke. In short, Newton was not a "people person"; no wonder he purportedly died a virgin.

The Royal Society sided with Newton in the controversy, crediting him with the discovery of calculus in 1715 after a prolonged dispute. Leibniz wasn't given shared credit until after his death a year later, and even then, it was one of those never-ending controversies that occasionally plague the physics profession. (Get a couple of physicists on opposite ends of the "does centrifugal force really exist" debate, then sit back and watch the fur fly.) Today, the consensus seems to be that the two men represent the two approaches that form the basis of the discipline they co-invented. Leibniz was the more abstract of the two, much like *I.Q.*'s Catherine -- and it's his superior system of notation that modern scientists still use today -- while Newton focused on the more practical applications of calculus, like Ed the science-minded garage mechanic.

So that's the gist of what I learned this week. Calculus is fairly simple and straightforward in concept; the devil is in the details. But essentially it's a way of measuring change, whether that be a change in position, temperature, or what have you. Its power comes from its universality: the same basic concepts can be applied to systems as diverse as a car driving down a road, the stock market, even traffic flow.

Personally, the most striking thing about my first calculus lesson was the notion of the integral, because I hadn't considered the deeper implication of this seemingly obvious statement: viewing objects as being formed via the accumulation of infinitesimally small pieces enables us to see the world as a dynamic rather than static place. A simple medicine ball, viewed through the lens of the integral, can be seen as growing by accretion -- i.e., a form of motion -- rather than just sitting there complacently on the floor of my gym, waiting to be of some use.

It's a whole new way of thinking, but at least I've got the gist of the "What", in the broadest possible brushstrokes -- plus a useful historical context. All those devilish details will be filled in bit by bit in the coming weeks as I begin the hard part: working through the equations and learning how to apply the universal principle to a wide variety of applications. The biggest challenge, for my abstractly-challenged brain, will be grasping the over-arching "Why" -- the contextual framework into which it all fits. Because ultimately, the "why" is everything. In this case, the "why" comes down to the function: the derivative and the integral are flip sides of the same coin, two different ways of looking at the same thing, and the function is the connection between them. Understanding how the two relate to each other, and the significance of that dependence, as expressed in a universal "Fundamental Theorem," is my ultimate goal.

In the meantime, my 14-year-old niece, Kathryn, sent me a little Internet meme in which one can determine one's age using "chocolate math." It's significantly easier than calculus:

1. Pick the number of times a week that you would like to eat chocolate. (This number must be more than 1 but less than 10, much to Jen-Luc Piquant's disappointment, as she would like to have chocolate for breakfast, lunch and dinner, 7 days a week -- provided it's gourmet, organic chocolate, that is; none of that cheap-o Hershey stuff.)

2. Multiply this number by 2.

3. Add 5.

4. Multiply the total by 50.

5. If you have already had your birthday this year, add 1756. If you haven't, add 1755.

6. Now subtract the four-digit year in which you were born.

Ta-da! The result should be a three-digit number. The first digit is the original number you chose, pertaining to how often per week you would like to eat chocolate. And the next two numbers are your age (one assumes anyone capable of taking the quiz is aged 10 or older, otherwise they'd end up with a two-digit number). I can faithfully report that it really does work, a nice little example of "mathemagic," or hidden patterns in numbers. Apparently 2006 is the only year this particular trick will work, although I'd think that altering the numerical values in Step 5 might make it possible to adapt the meme to other years. That doesn't mean I understand to the nth degree __why__ it works, of course, but I'm pretty sure it has nothing to do with any mysterious innately mathematical properties of chocolate (more's the pity).
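Translated into code, the trick is easy to verify by brute force. This little Python sketch (the function name and arguments are my own) simply follows the six steps:

```python
def chocolate_trick(times_per_week, birth_year, had_birthday, this_year=2006):
    """Follow the chocolate-math steps and return the final number."""
    assert 1 < times_per_week < 10       # step 1: pick a number from 2-9
    n = times_per_week * 2               # step 2
    n += 5                               # step 3
    n *= 50                              # step 4
    n += 1756 if had_birthday else 1755  # step 5
    return n - birth_year                # step 6

# Someone born in 1964 who has already had their 2006 birthday is 42:
print(chocolate_trick(7, 1964, had_birthday=True))  # 742 -> "7" and "42"
```

Multiplying by 2 and then by 50 shifts the chocolate number into the hundreds place, and the 1756/1755 constant is rigged so that subtracting the birth year leaves the current age in the last two digits -- which is also why the trick, as written, only works in 2006.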

Teenagers today have unprecedented access to cutting-edge technology, but they rarely use email, cell phones, MP3 players, or computers for much more than entertainment. So I was thrilled to get a mathematical meme (however elementary) from my niece, and not, say, a giggly personality "quiz" or cautionary chain letter, which is more the norm for kids her age. Maybe she'll manage to enjoy her high school calculus class senior year -- unlike her recalcitrant aunt, who skipped out on it altogether. Then again, maybe I'm finally mature enough to appreciate why calculus is so seminal to science.

What a fun essay! It's nice to see calculus through the eyes of a new scientist, and with the words of an experienced writer.

But the math trick isn't hard to figure out, if you know how. It isn't calculus, it's just algebra. Write it out as an algebraic equation and the trick becomes pretty obvious.

Let X be your guess. Then you write out what the steps tell you to do:

((2X + 5) * 50 + 1756) - birth year

I'm just doing the first case of having already had your birthday this year.

Now work the algebra through to get

100X + 250 + 1756 - birth year

to

100X + 2006 - birth year

But (2006 - birth year) is your age! Note: if you haven't had your birthday yet, the age you get is wrong... which is why it says to add 1755 if you haven't had it yet.

100X gives you your number first, plus two zeros. When you add your age to that, you get a three digit number where the first is your guess, and the second two are your age.

So you were right-- 2006 is needed; for next year the numbers will have to change.

You can also play with the numbers a lot. Anything that yields the current year after you play with it will work.

I remember always being surprised by these number games, and then not too long ago my sister asked me about one. I realized it was just algebra and worked it out. Kind of a "duh" moment in retrospect, but I think it was more fun to see how it worked than it was to be amazed at it. I feel that way about everything. It's more fun to understand!

Posted by: Phil Plait, aka The Bad Astronomer | September 30, 2006 at 01:44 AM

You should teach Calculus once you're done learning it.

I think a lot of students who learn calculus learn algorithms for solving problems without really understanding deeply what it's all about. Indeed, I think that even happens with algebra -- students learn in high school how to plug and chug, but I'm not sure they have a deep understanding of what it really means to have a variable representing some unknown quantity, and why many of the various "rules" for what you can do in algebra are reasonable.

A deep understanding of what both of those are all about can only help in remembering and properly using the rules for performing the mechanics of it. But, beyond that, it's necessary if you're really going to use math to help understand science. Being able to do algebraic manipulations is one thing -- by and large, students in my non-majors intro astro course are very good at that. (I don't put too much emphasis on the equations, but there is some.) Where they have trouble is seeing that the equations aren't just a game you play to get the answer to a problem, but that they are a way of expressing the concepts that we're really working on -- and vice versa.

-Rob

Posted by: Rob Knop | September 30, 2006 at 10:25 AM

It's nice to read an essay about calculus from a new scientist, because when I learned calculus a couple of years ago at a community college, I learned it from a teacher who only did math and showed signs of being truly uncomfortable with the scientific mindset. A few of the problems we worked on were, of course, basic physics problems solved with calculus rather than with the kinematic equations. Every time we tried to include the units during discussion or on the marker-board, he'd get all fussy and tell us to wait until the end and then write the units "because then you just get'em for free."

When asked why he was so persnickety about units, he went on a little rant about how math theorems can be proven absolutely, so he's always been more comfortable with math than with physics, and the units bother him.

Basically, he was saying that the units (and including the units in every step of a problem) reminded him of how uncertain truth in the real world is. He was actually telling us that he's uncomfortable dealing with facts and would rather work with the absolute rules on the chalkboard/marker-board than with the approximation process that using physical units, like the real world itself, entails.

He may be a sissy, but at least he realizes that hard truths are not found in reality and accepts it, unlike other professional mathematicians, like William Dembski, who are so attached to the absolutist mindset they've projected onto math that they do not understand that mathematical models that fail to reflect the flexible properties of the real world are not valid reflections of it.

If only my Calc 1 teacher had understood that calculus is a real answer to real questions like Zeno's Paradox. I think he could have been a good physicist.

Posted by: Aerik | September 30, 2006 at 07:55 PM

It's fun to vicariously relive the enjoyment of getting calculus.

I want to throw in a comment on fluxions. They are equivalent to Leibniz's tools, but it's a question of where you place the focus. Leibniz implicitly began in the framework of analytic geometry, where he has coordinates, and thought in terms of changes of those coordinates along graphs of functions.

Newton, by contrast, starts with a line: the path of an object moving through space without external influence. Then he says that if the path is influenced, there is some vector (well, line segment, in his world) that gives the result of that influence, namely the vector between the undisturbed path and the disturbed path. This is the fluxion.

The two need not be perfectly equivalent. You can define a fluxion in a curved spacetime perfectly well (though Einstein told us to forget that: once you've curved the space, just throw the fluxion into the curvature as well and have done). On the other hand, the Leibniz ratios of differentials of coordinates don't always play nicely in such situations. The fluxion is defined globally since it's merely a geometric relation between abstract objects in the space, but you may have to use several coordinate systems patched together depending on where you are in the space if you actually want to do analytic geometry, and so the Leibniz equivalent of the fluxion is not a simple translation.

At this point I cannot recommend Cohen's translation of Principia too highly. I didn't feel like I understood classical mechanics of point particles until I had really internalized the first chapter of that and spent three months staring into space forcing my brain to think that way...and reexpressing all the mathematical results I knew in that visualization.

Posted by: Fred Ross | September 30, 2006 at 11:34 PM

"the entire discipline is encapsulated in two fundamental ideas: (1) the derivative... and (2) the integral... Everything else involved in calculus is just variations on these two themes."

While I agree that this is what mathematicians mean by "Calculus," I would add that Calculus is a subfield of a broader topic called "Analysis," and within Analysis everything -- the derivative, the integral, and a whole lot of other things besides -- is just a variation on one theme: "The Limit." The Limit is where Zeno's arrow strikes its target.

Oh, and another thing. Calculus doesn't go "to infinity and beyond", although it does approach infinity sometimes in The Limit. If you actually want to go to infinity and beyond you need to investigate the theory of transfinite arithmetic - a whole 'nother topic. Ask your supervisor about Cantor.

Good luck with your exploration.

Posted by: Daran | September 30, 2006 at 11:37 PM

Especially helpful comments here -- thanks. I appreciate all the input, and it'll be nice to come back to them as I progress.

Daran, I think we're covering the notion of the Limit in a later lecture. But, FYI: the post title "to infinity and beyond" was a tongue-in-cheek pop culture reference (TOY STORY), not a literal reference to calculus...

Posted by: JenLucPiquant | October 01, 2006 at 12:14 AM

Great essay! Just a few comments: I presume you mean Christiaan Huygens? He had a (crude) wave theory of light competing with Newton's particle theory of light. To show how awful Newton was, I remember two anecdotes: Robert Hooke was a very short man, so when Newton said "If I saw further than others, it is because I stood on the shoulders of giants," this is a direct dig at Hooke. Second, the Royal Society sided with Newton because he was the president of the Royal Society at the time!

Posted by: PK | October 01, 2006 at 05:40 AM

Hi Jen, enjoyed this here post, and, glad to hear you haven't lost a chunk from your knee in your previous post.

So, just for fun: a Concorde flying London to New York could expect to arrive there before it left, and expect to be traveling against time on the return journey. The actual average speed (and time) would be determined by the prevailing currents and winds.

Does an arrow, all things being equal (strength of bow and prevailing wind), travel the same distance whether fired eastward or westward, the earth's spin (and time) having no measurable(?) or practical(?) impact on the distance traveled? Equally so if the arrow is fired northward or southward?

With more powerful and longer distance ballistics, do the measurements become much more 'critical'?

Posted by: Quasar9 | October 01, 2006 at 01:15 PM

Aerik is right! It's a lot of fun to watch someone blogging about learning calculus. Keep it up!

And Aerik's teacher is wrong! You need the units to keep yourself from making mistakes. If your final answer is supposed to be in meters and you're getting kilogram seconds-squared/degree Kelvin, you've done something seriously wrong.

Jen, once you've got Calculus under your belt (oh, and Vector Calculus, too, you don't want to miss that), where to next? Differential Equations? Group theory? Linear Algebra? Complex numbers?

I feel like a kid watching another kid in a candy store.

Posted by: andy.s | October 01, 2006 at 06:38 PM

"For those who haven't encountered Zeno before, he was a Greek philosopher living in the 5th century B.C., who thought a great deal about motion. Some might argue he thought a bit too hard about motion; the guy was always playing devil's advocate, even with his own arguments, which is how he arrived at his eponymous paradox."

Philosophical note: Actually, Zeno had several paradoxes, three of which are famous. His purpose wasn't to play devil's advocate, but to argue for the unreality of time and space, and of the phenomenal world in general. This was a popular position with the ancient Greeks -- see Plato's cave analogy -- but has some adherents even today. For example, the hardheaded 19th century philosopher McTaggart argued for the unreality of time using updated Zenoesque arguments.

What you present as a "corollary" isn't a corollary to Zeno's paradox, but a separate paradox of Zeno's known as the arrow paradox.

The meme that "Calculus solves Zeno's (first) paradox" is one of those items of conventional wisdom that goes unquestioned despite having such a shaky foundation. Calculus may resolve Zeno's paradox, but that's certainly not uncontroversially or straightforwardly true.

Posted by: Jason Stokes | October 01, 2006 at 11:37 PM

Oh, and by the way, speaking of chocolate math, you can also determine the speed of light using chocolate.

(Is there anything chocolate can't do?)

Posted by: andy.s | October 02, 2006 at 12:25 AM

Daran, while you're correct that the standard treatment of calculus shuns actual infinities in favour of limits, there's also a more modern technique using non-standard analysis (not the same as transfinite arithmetic) that goes straight for infinite quantities first. In this approach a derivative really is the slope of an infinitesimal line segment and an integral really is an infinite sum of infinitesimal quantities.

Posted by: Jeremy Henty | October 02, 2006 at 06:18 PM

I'm a little late to this party, but I have to say that I learned calc originally from a guy with a Ph.D. in chemical engineering and 35 years in industry before he went off to become a college prof, and he was the best instructor I've ever had at any level of mathematics (the only one who comes close was my 7th grade algebra instructor, who was an aeronautical engineer... this would seem to suggest something).

Pure mathematicians get too wrapped up in the abstractness of it all, and I've yet to meet one who understood that talking to us like we already knew what they were teaching us was pointless, since if we did, we wouldn't be wasting our time in their class.

And textbooks...written by math Ph.D.s, for math Ph.D.s, and to hell with anyone who doesn't understand.

Don't get me wrong, I love mathematics. I just can't stand most of the people who've ever attempted to teach it to me.

Posted by: Jamie Bowden | October 04, 2006 at 09:22 AM

I'm a little late, but for completeness's sake I thought I should post a link to a textbook on this "non-standard analysis" of which Jeremy Hinty spoke.

http://www.math.wisc.edu/~keisler/calc.html

Posted by: Blake Stacey | October 04, 2006 at 04:49 PM

My apologies --- "Henty", not "Hinty". Stupid fallible fingers. :-/

Posted by: Blake Stacey | October 04, 2006 at 04:50 PM


I too developed an interest in mathematics (number theory, what algebra really means, etc.) late in life (in my 50s). I've since realized that most of the math "teachers" I had in school were either incompetent to begin with, or knew the subject but NOT how to teach! It's a national disgrace that we have allowed this to happen in the US for so many decades, especially in the public schools.

Posted by: MJ DeYoung | April 19, 2008 at 05:28 PM