Alexandre Borovik's post reminded me of the good old days when I was a math undergrad and would pay an annual visit to my 6th grade Gifted & Talented math teacher. She let me teach a lesson, which proved popular enough that the following year I got two periods in a row. The class consisted of that rare breed of American 12-year-olds who are genuinely hungry for mathematical knowledge and can appreciate beauty and aesthetics when they see it. These kids, I remind you, had never seen calculus or trigonometry, and didn't yet have a solid grasp of the real number field.
I would start out by writing three expressions on the board:
(a) 1 + 1/2 + 1/4 + 1/8 + ...
(b) 1 + 1/2 + 1/3 + 1/4 + 1/5 + ...
(c) 1 - 1 + 1 - 1 + ...
and ask them what each series sums to. Most had no difficulty getting (a) right, and many could even give a pictorial argument to support their answer. (b) gave them more trouble, and who can blame them? It's far from obvious, at first glance, how the harmonic series behaves; even using a computer and making plots, one would become frustrated by the slow divergence. I let the cat out of the bag by telling them that it does in fact blow up (a sketch of the standard argument appears below), but that they'd have to wait for college and Baby Rudin to become really convinced. [Incidentally, like all intelligent 12-year-olds, they were fascinated and confused by the concept of infinity. I glossed over it at this point, saying that "infinity" in this case is shorthand for "increasing without bound", that is, becoming as large as you want if you add up enough terms. We'll come back to it shortly!]

But what to do about (c)? One hardly needs a computer simulation or a formal proof to see that something is not kosher about that expression. If (b) can be called a "number" in some extended sense, there is no sense in which (c) denotes a number. So, lesson number one: just because we can write it down, in numbers and symbols, does not necessarily mean it's well-defined or makes sense.
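For readers who want more than a promise of Baby Rudin, here is the classic grouping argument for (b), usually credited to Oresme (this is the standard textbook sketch, not a transcript of the lesson): collect the terms into blocks ending at 1/4, 1/8, 1/16, and so on; each block sums to at least 1/2.

\[
1 + \tfrac{1}{2}
+ \underbrace{\left(\tfrac{1}{3} + \tfrac{1}{4}\right)}_{\ge 1/2}
+ \underbrace{\left(\tfrac{1}{5} + \tfrac{1}{6} + \tfrac{1}{7} + \tfrac{1}{8}\right)}_{\ge 1/2}
+ \underbrace{\left(\tfrac{1}{9} + \cdots + \tfrac{1}{16}\right)}_{\ge 1/2}
+ \cdots
\]

Since we can pile up as many halves as we like, the partial sums eventually exceed any bound, which is exactly what "increasing without bound" means here.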
I would go on to ask if anyone had heard of irrational numbers and could name an example of one. A few hands would go up -- "pi" would be the standard answer. Whether anyone knew why, or could formally define irrational numbers, is another matter. Well, we didn't prove pi's irrationality, but we did define rationals and irrationals, and more importantly, the kids saw a rigorous proof of the existence of irrational numbers! For most of them, this was the first time seeing a formal proof, and by the looks on their faces, I could see that a paradigm shift was taking place. What a novel concept -- you don't just accept facts handed down from authority; you become convinced of them through a watertight argument!
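For readers who haven't seen such a proof, the classic example is the irrationality of the square root of 2, and the standard argument is short enough to sketch here (the textbook version, compressed into one line):

\[
\sqrt{2} = \frac{p}{q} \text{ in lowest terms}
\;\Longrightarrow\; p^2 = 2q^2
\;\Longrightarrow\; p = 2r \text{ for some integer } r
\;\Longrightarrow\; q^2 = 2r^2
\;\Longrightarrow\; q \text{ is even.}
\]

The key observation is that the square of an odd number is odd, so p^2 even forces p even, and likewise for q. But then both p and q are even, contradicting "lowest terms"; hence no such fraction exists.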
That would take about 45 minutes. In the next 45, I'd prove -- again, in full rigor -- Cantor's diagonalization theorem of the uncountability of the reals. There is absolutely nothing in the definition of cardinality or in Cantor's proof that is beyond the grasp of a 12-year-old. They were getting it; I can vouch for that. I have no idea why we make students wait until college to see this.
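For completeness, here is the skeleton of the diagonal argument (again, the standard presentation rather than the classroom version): suppose the reals in (0,1) could be listed in their entirety as r_1, r_2, r_3, and so on; write out their decimal expansions and build a new number by changing the n-th digit of the n-th entry.

\[
\begin{aligned}
r_1 &= 0.\,d_{11}\,d_{12}\,d_{13}\ldots\\
r_2 &= 0.\,d_{21}\,d_{22}\,d_{23}\ldots\\
r_3 &= 0.\,d_{31}\,d_{32}\,d_{33}\ldots\\
&\;\;\vdots
\end{aligned}
\qquad\qquad
x = 0.\,e_1\,e_2\,e_3\ldots,
\quad
e_n =
\begin{cases}
5 & \text{if } d_{nn} \ne 5,\\
6 & \text{if } d_{nn} = 5.
\end{cases}
\]

The number x differs from r_n in the n-th decimal place, so it appears nowhere on the list; since the list was arbitrary, no list of the reals can be complete. (Choosing the digits 5 and 6 sidesteps the minor nuisance of numbers with two decimal expansions, like 0.4999... = 0.5.)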
2 comments:
wow, they must have been geniuses! (well the right spelling of that word anyway!)
I'm going to read that post again when my brain is out of first gear!
Perhaps I should've linked to Cantor's theorem.
If anything in what I wrote above is unclear, please don't hesitate to ask!