- Paul Anthony Jones

# Eleven

A few days ago, a tweet of ours prompted a follower to ask why *eleven* and *twelve* break the pattern of the other teens.

It’s a good question—why do we say *eleven* and *twelve*, but then *thirteen* and *fourteen*? Why not *oneteen* and *twoteen*? Or *threelve* and *fourlve*?

Unsurprisingly, the *–teen* suffix is a derivative of *ten*. *Thirteen* is literally “three and ten”, *fourteen* is “four and ten”, and so on. It’s a fairly ancient formation: *thirteen* was *þreotene* way back in Old English—a straightforward compound of *þreo*, “three”, and *tene*, a form of “ten”. The same goes for *fourteen* (Old English *feowertyne*), and *fifteen* (Old English *fiftene*), all the way up to *twenty*, which was *twentig* in Old English, or literally “two sets of ten”.

But *eleven* was *enleofan* in Old English, which took its initial *en*– from the Old English word for “one”, *ane*. *Twelve*, likewise, was *twelf*, with its initial *twe*– coming from the Old English word for “two”, *twa*. The remaining –*leofan* and –*elf* parts have nothing to do with the “–teen” suffix we know today, but instead represent hangovers from *some ancient, pre-Old English word* probably meaning “to leave over”, or “to omit”.

So *eleven* was literally the number “left over” after you’d counted up to ten, and *twelve* was literally “two left over after ten”.

But why were *eleven* and *twelve* given different names from all the other teens? Why weren’t they just *ane-tene* and *twa-tene*?

The problem is that we’re now hardwired to think of our numbers decimally, in 10s, 100s and 1,000s. There’s a good reason for doing so of course, as 10 is such an easy number to work with. You can count to 10 using your fingers (which is called *dactylonomy*, by the way), and calculations involving 10 are effortlessly simple. 79 multiplied by 10, you say? 790. Easy.

But this decimal way of thinking is a relatively recent invention, spurred on by the spread of Hindu-Arabic numerals and decimal arithmetic from the Middle Ages onwards. Historically, many of our numbering and measuring systems were based around 12, not 10—and hence there are twelve inches to a foot, two sets of twelve hours to a day, and so on.

12 is a much more complicated number to deal with arithmetically, of course (79 multiplied by 12? Give me a minute...) but there’s a very practical reason for counting in terms of 12 rather than 10: 12 is a much more productive number mathematically.

A set of 10, for instance, can only be split equally into two sets of five, or five sets of two. A set of 12, however, can be split into 2, 3, 4 or 6. Likewise a set of 20 can only be divided into 2, 4, 5 or 10, but a set of 24 can be divided into 2, 3, 4, 6, 8 or 12. And even 100 has barely half the number of factors (2, 4, 5, 10, 20, 25, 50) of 144 (2, 3, 4, 6, 8, 9, 12, 16, 18, 24, 36, 48, 72).
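If you want to check those divisor counts for yourself, a few lines of Python will do it (a quick sketch; `proper_divisors` is just an illustrative helper name, not anything standard):

```python
def proper_divisors(n):
    """Return the group sizes n can be split into equally,
    excluding the trivial cases of 1 and n itself."""
    return [d for d in range(2, n) if n % d == 0]

# Compare the numbers discussed above.
for n in (10, 12, 20, 24, 100, 144):
    divs = proper_divisors(n)
    print(f"{n}: {len(divs)} ways -> {divs}")

# 10 splits only into 2s or 5s, while 12 splits into 2, 3, 4 or 6;
# 100 has 7 such divisors, 144 has 13.
```

Running it confirms the tallies in the text: 100 yields 7 ways of splitting (2, 4, 5, 10, 20, 25, 50), while 144 yields 13.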

The fact that 12 could be so conveniently divided in so many different ways made it particularly useful, in everyday terms, in dealing with fractions, proportions, allocations, and measurements. It even led to some separate words for a set of twelve (*dozen*) and a set of twelve twelves (*gross*) entering our language, and to many ancient number systems using a base of 12, not 10.

As a result, *twelve*—and thereby *eleven*—earned themselves names distinct from all those around and above them, and it’s only our modern, decimal-based perspective that makes this seem strange.

Aha—948! Got there eventually...