On Irrational Numbers and Deep Thinking
March 15th, 2020
Today is March 14th, which to us math nerds is also known as Pi Day, in reference to the first three significant digits of the mathematical constant pi (3.14).
Part of what makes pi interesting is that it’s one of the most famous irrational numbers, meaning that it cannot be expressed as a ratio of two whole numbers (though 22/7 comes pretty close). In honor of Pi Day, I thought I would learn more about the history of irrational numbers, so I turned to one of my bookshelf favorites, The World of Mathematics: a four-volume history of math, edited by James R. Newman and published in a handsome faux-leather box set in 1956. (I picked up my copy at a used book sale five years ago.)
The first volume contains an extended essay (originally a short book) titled “The Great Mathematicians,” written by Herbert Western Turnbull, the late Scottish algebraist. Turnbull dedicates much of the essay to great innovations from ancient Greek mathematics. It was Pythagoras (570 – 495 BC), he notes, who is most often credited with discovering that irrational numbers exist.
We know something of the proof that Pythagoras used due to a later account by Aristotle. The proof is elementary by modern standards (indeed, it’s a common example in undergraduate-level discrete mathematics courses). It goes something like this…
Consider a square with sides of length one. Applying the Pythagorean Theorem (which, of course, was brand new in Pythagoras’ time), it follows that the length of the diagonal of this square is the square root of 1^2 + 1^2, which is the square root of 2. It is this value that was shown to be irrational.
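This diagonal computation is small enough to check directly. A quick Python sketch (not from the essay, just a sanity check of the arithmetic):

```python
import math

# Diagonal of a unit square via the Pythagorean Theorem:
# diagonal^2 = 1^2 + 1^2, so diagonal = sqrt(2).
diagonal = math.sqrt(1**2 + 1**2)

print(diagonal)                      # 1.4142135623730951
print(diagonal == math.sqrt(2))      # True
```

Of course, the floating-point value printed here is only an approximation; the whole point of what follows is that no exact fraction can represent this number.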
To prove this claim, let us assume, for now, that the square root of 2 is a rational number; we can then show that this assumption leads to a logical contradiction. The march towards that contradiction unfolds as a rapid-fire sequence of basic number-theoretic and algebraic claims:
- If the square root of 2 is rational, then it can be written as x/y, for two whole numbers x and y that share no factors in common, which implies…
- that if we square both sides, then 2 = x^2 / y^2, and therefore, x^2 = 2 * y^2, which implies…
- that x^2 is even (since it is 2 times another whole number), which tells us that x must also be even, because squaring an odd number always yields an odd number, which implies…
- that, since x is even, we can write x = 2k for some whole number k, so x^2 = (2k)^2 = 4k^2; substituting this into x^2 = 2 * y^2 and dividing both sides by 2 tells us that y^2 = 2k^2, which implies…
- that y^2 is even, and therefore y is even (by the same argument that we applied to x^2).
We have now shown that both x and y are even. But earlier we assumed they had no factors in common. If they are even, they would have a factor in common (namely, 2). This is a contradiction! Math cannot have contradictions, so our original assumption that the square root of 2 is rational must have been wrong.
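The chain of claims above can also be probed mechanically. Here is a short Python sketch (the search bound of 500 is an arbitrary choice for illustration, not something from the essay) that looks for a fraction x/y in lowest terms whose square is exactly 2 and, just as the proof demands, finds none:

```python
from math import gcd

BOUND = 500  # arbitrary search limit for this illustration

# Search every fraction x/y in lowest terms (gcd(x, y) == 1) with
# numerator and denominator below BOUND, keeping any whose square
# is exactly 2 -- i.e., any x, y satisfying x^2 = 2 * y^2.
solutions = [
    (x, y)
    for x in range(1, BOUND)
    for y in range(1, BOUND)
    if gcd(x, y) == 1 and x * x == 2 * y * y
]

print(solutions)  # [] -- no rational square root of 2 turns up
```

No finite search can substitute for the proof, of course; the contradiction argument is what guarantees the list stays empty no matter how large the bound grows.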
We might think of this proof as easy, but as Turnbull argues, the result should not be dismissed:
“That will ever rank as a piece of essentially advanced mathematics. As it upset many of the accepted geometrical proofs it came as a ‘veritable logical scandal.’”
The discovery of irrational numbers turned out to be more than just a logical scandal; it also caused theological issues. Pythagoras’s cult had been convinced that everything in the universe could be reduced down to whole numbers. The idea that values existed that could not be expressed with whole numbers alone destabilized this understanding of existence, spawning a number-theoretic crisis of faith.
It’s worth briefly visiting this history of irrational numbers because it’s interesting, and because today we celebrate one such value. But for anyone who shares my interest in cultivating a deep life, this story holds a more compelling layer. It reminds us that there was a time, some 2500 years ago, when a select group of lucky ancient philosophers could build an entire system of meaning out of simply thinking hard and then marveling at what they discovered.
The mind, when properly cultivated, really does contain multitudes.