Plimpton 322

While I think my comments below are still interesting, I recommend everyone review TTAndy's excellent reply, which gives a more insightful explanation of the true mathematical content of this story.

I recently came across the Guardian's coverage of Daniel Mansfield's work on an ancient mathematical tablet. I found the coverage deeply disappointing, due in no small part to what I consider the misleading way Mansfield presents the result.

The actual news seems to be that Mansfield has figured out how to read this ancient tablet; the ability to do so had previously been lost to time. That's a legitimate achievement for which Mansfield should be lauded. The translation revealed that this tablet contains a trigonometric table and evidence that the Babylonians understood the famous and deeply useful Pythagorean theorem ($a^2 + b^2 = c^2$). This moves the discovery about 1000 years earlier in the human timeline! Again, a discovery we should celebrate and one that historians of mathematical discovery (and cultural forgetfulness) should take a keen interest in.

My objections stem from Mansfield's claim that this table "is superior to modern technology" and the implication that we can learn something from it and possibly adopt it for applications in areas like surveying. While there is a grain of truth in this statement that I will explain shortly, it is a rather ludicrous claim that I find hard to believe anyone with a basic understanding of trigonometry would make unless their intent was simply to generate clickbait.

Mansfield correctly points out that the Babylonians used a base 60 number system, whereas we use a base 10 (decimal) system. The base of a number system determines how we subdivide the unit. Think of a delicious pie that represents the unit whole, the number 1.0. That's 1.0 in any base, whether 10, 60, or otherwise. Suppose we cut it into 10 equal pieces and I decide to share half the pie with you. I'd give you 5 slices out of ten, which is the fraction $\frac{1}{2}$, or $\frac{5}{10}$, or 0.5 in decimal form.

No problems there, but what if I want to give you $\frac{1}{3}$ of the pie, greedily keeping $\frac{2}{3}$ for myself? We'd have to split some of my $\frac{1}{10}$ slices further, adding a fractional piece to give you precisely $\frac{1}{3}$. In base 10, the decimal form is $0.\bar{3}$. Since the 3 repeats forever, no finite decimal expansion captures the exact value.

Similarly, you'll find that the majority of rational numbers have repeating decimal expansions (e.g. $\frac{19}{60} = 0.31\bar{6}$). Repeating decimals are undesirable because one typically has to round at some point, which by definition means a loss of precision. Working in base 60, however, produces far fewer repeating expansions: $\frac{1}{3} = \frac{20}{60}$ is exactly 0;20 in sexagesimal notation (the semicolon plays the role of the decimal point), and $\frac{19}{60}$ is simply 0;19. In fact, when we consider the simple "everyday" fractions that people encounter in recipes and elsewhere, we mostly divide by halves, thirds, quarters, fifths, and sixths. Beyond a certain point, fractions with larger denominators come up less often in practice. My table saw's finest resolution is $\frac{1}{32}$, and I never try to make cuts with more precision than that.

If we restrict our attention to the kinds of fractions listed above, you'll notice that base 10 decimals are sometimes easy to write (e.g. halves and quarters) but often produce repeating decimals (e.g. thirds and sixths). The reason is that $60 = 2^2 \cdot 3 \cdot 5$: a fraction terminates in a given base exactly when every prime factor of its reduced denominator divides the base, and 60 picks up the factor of 3 that 10 ($= 2 \cdot 5$) lacks. In this way, base 60 really is more expressive. This is absolutely true, and it's not surprising to me that ancient peoples settled on this system for exactly this reason.
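To make the rule concrete, here's a quick Python sketch of my own (nothing from Mansfield's paper, just an illustration of the prime-factor test):

```python
from math import gcd

def terminates(denom: int, base: int) -> bool:
    """1/denom has a finite expansion in `base` exactly when every
    prime factor of denom also divides the base."""
    while (g := gcd(denom, base)) > 1:
        denom //= g  # strip away the prime factors shared with the base
    return denom == 1

for d in range(2, 11):
    b10 = "terminates" if terminates(d, 10) else "repeats"
    b60 = "terminates" if terminates(d, 60) else "repeats"
    print(f"1/{d:2}: base 10 {b10:10}  base 60 {b60}")
```

Running it shows that every denominator from 2 through 10 except 7 terminates in base 60, while thirds, sixths, sevenths, and ninths all repeat in base 10.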

Yet the implication that this is some grand revelation is false. The implication that human beings had totally failed to observe this detail all this time is also deeply false. Should anyone be interested, there's nothing stopping them from working in base 60. Yet almost no one does. Why not?

I think there are a variety of reasons. First and foremost, modern computing. Our modern computers run on base 2 (binary) arithmetic, yet it is trivial for any programmer to write software that works in any base (10, 60, or otherwise). Most, if not all, programming languages present numbers to users in base 10 by default. There seems to be no market demand for a language configured to prefer base 60, despite the fact that a moderately above-average high school student is generally aware of the advantages I described above.
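To back up the "trivial" claim, here's a minimal sketch: ordinary long division spits out the digits of a fraction in whatever base you like. The function name and interface are my own invention for illustration:

```python
def expand(num: int, den: int, base: int, places: int = 8) -> list[int]:
    """Return up to `places` fractional digits of num/den in `base`,
    stopping early if the expansion terminates."""
    digits = []
    for _ in range(places):
        num *= base           # shift one digit's worth to the left
        digits.append(num // den)
        num %= den
        if num == 0:          # the expansion terminated exactly
            break
    return digits

print(expand(1, 3, 10))  # [3, 3, 3, 3, 3, 3, 3, 3] -- repeats forever
print(expand(1, 3, 60))  # [20]                     -- exactly 0;20
```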

The core reason I believe no one is interested in base 60 is that even single-precision floating-point numbers in the IEEE 754 format offer significantly more precision than the vast majority of users require. Double-precision (64-bit) floats are readily available in most programming languages, and even higher-precision computation is available if that's what one needs.
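To illustrate with nothing beyond Python's standard library: double precision already carries roughly 16 significant decimal digits, and arbitrary precision is one import away.

```python
import struct
from decimal import Decimal, getcontext

# 1/3 rounded to IEEE 754 single precision (good to ~7 digits)
single = struct.unpack("f", struct.pack("f", 1 / 3))[0]
print(f"{single:.17f}")  # 0.33333334326744080

# Python's float is IEEE 754 double precision (good to ~16 digits)
print(f"{1 / 3:.17f}")   # 0.33333333333333331

# Higher precision on demand via the decimal module
getcontext().prec = 50
print(Decimal(1) / Decimal(3))  # fifty 3s
```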

Further, human beings are widely trained and educated in the use of base 10 decimals. Decimals are accessible even to individuals who were let down by an education system that failed to provide basic mathematical training in an intelligible way. Everyone gets decimals. Even the most advanced mathematicians would likely introduce more errors into their work if they had to switch between base 10 and base 60 in different situations. Several major tragedies have actually occurred as a result of confusion between metric and imperial units; the loss of the Mars Climate Orbiter is the most famous example. For me, the likelihood of introducing a blunder far outweighs any modest advantage I might gain from using base 60 numbers.

The advantages of base 60 are very real when one does a lot of hand calculations. However, it is increasingly rare for a person not to have a computer readily available to assist. For these and many other reasons, the idea that we'd see broad adoption of a base 60 numbering system is such an outlandish claim that I struggle to see it as genuine.

It's a real shame that the story has this angle. There's a beautiful story here beneath the hype. An ancient text was translated, rewriting our understanding of what ancient peoples knew about mathematics and how they used it. It's also a sad tale about a scientific setback. How much further along scientifically would our species be if the time we spent rediscovering this concept had instead been spent advancing our body of knowledge? That reminds me, I need to go make a donation to Wikipedia. I'm also reminded of the tragic story of the Palimpsest of Archimedes, but that's a topic for a blog on a different day.