| Journal | 5 min read
Twenty years ago, Australia was in the throes of a culture war. The prize was the definition of the millennium, and the belligerents were the establishment versus the populace. The former followed history and mathematics, arguing the millennium would begin in 2001. The latter wanted an excuse for fireworks and an epic piss up — and they didn’t want to wait another year. The establishment ended up losing the culture war, partly because epic piss ups and blowing up tonnes of fireworks are good for GDP and tourism.
Fast forward 20 years, and people are making similar arguments about the start of the 2020s. As was the case in 1999, they are wrong, at least from a mathematical standpoint.
The reason is that our calendar doesn’t have a year zero, that is, no 0 AD. We went from 1 BC straight to 1 AD. The chap who created the first version of our calendar didn’t understand the concept of zero, because it hadn’t yet reached Europe in his day. He also miscounted the number of Roman Consuls (which he used to calibrate the years from his own time back to the year he believed Jesus of Nazareth was born)1.
So, as the calendar started in 1 AD, we count as follows:

AD 1, 2, 3, 4, 5, 6, 7, 8, 9, 10
That’s the first decade: ten years. Go ahead and count them on your fingers if you don’t believe me. The second decade unfolds as follows:

AD 11, 12, 13, 14, 15, 16, 17, 18, 19, 20
The 10th decade, rounding out the first century AD, was as follows:

AD 91, 92, 93, 94, 95, 96, 97, 98, 99, 100
So, the second century started in AD 101.
If you have the patience, or a spreadsheet, follow that number line through the centuries: the 20th century began in 1901, the 21st began in 2001, and the 2020s will begin in 2021.
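If a spreadsheet feels like too much work, the number line can be sketched in a few lines of Python instead. With no year zero, the Nth century runs from year 100(N−1)+1 through 100N, so both the century and the decade containing a given year fall out of simple integer arithmetic. (The function names here are my own, purely for illustration.)

```python
def century_of(year):
    """Ordinal century of a year AD, given the calendar starts at AD 1 (no year zero)."""
    return (year - 1) // 100 + 1

def decade_of(year):
    """The ten-year block, counted from AD 1, that contains `year`, as (first, last)."""
    first = ((year - 1) // 10) * 10 + 1
    return (first, first + 9)

print(century_of(1901), century_of(2000), century_of(2001))  # 20 20 21
print(decade_of(2020))  # (2011, 2020)
print(decade_of(2021))  # (2021, 2030)
```

Note that the year 2000 still lands in the 20th century, and 2020 closes out the decade that began in 2011, exactly as the count above shows.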
To break this number line, at some point you must either pretend there was a year zero, or recalibrate the calendar. For example, you could arbitrarily decide there were only 99 years in the 1st century AD, or only 99 years in the 20th.
It’s not rocket science, it’s mathematics. It’s the same logic that makes this the 21st century when the years begin with 20xx.
Part of the problem is the ambiguity of English itself, and how we use it. We use ordinals to describe centuries and millennia, while decades get date-range monikers: the eighties, the nineties and so on. So the years 1990–1999 can reasonably be described as the 90s, even though, mathematically, that isn’t accurate; as I demonstrate above, the last decade of the 20th century began in January 1991 and ended in December 2000. Linguistics aside, we do this for cultural reasons: the Roaring 20s, the Swinging 60s. Each decade has a distinct cultural flavour thanks to the arts (especially music and film), fashion, technology, demographic changes, politics and world events. Even so, there’s considerable overlap; culturally speaking, 2000 wasn’t much different from 1999, 1998 or, for that matter, 1995.
Of course, the real reason is that time is arbitrary. I discovered this when I visited Greenwich in 2009 and straddled the Prime Meridian for a silly holiday photo. As hairless apes, our perception of time is rudimentary and relative to our psychological state. We have a crude circadian rhythm, sensitive to day and night, yet our brains ignore the minutiae of time when we are having fun and exaggerate it when we are bored or miserable.
As for calendars, unless you are a practising Christian or a historian, you likely don’t give a toss about the origins of ours. Calendars exist to remind us that the five-day working week sucks, and that holidays and weekends don’t come around often enough. The passing of a year, decade or century is merely a convenient excuse to get drunk and snog someone while trying to remember the words to Auld Lang Syne.
As a fantasy author and world-builder, I’m fascinated by calendars, particularly those of the medieval and ancient world. The history thereof makes for fascinating reading. For example, although our calendar is essentially Roman, we still divide our day into units invented by the Sumerians, the oldest of civilisations. I think that’s really cool.
The way we measure time tells us a lot about cultures. In England’s Dark Satanic mills, puritanical capitalists browbeat workers with a bible in one hand and a stopwatch in the other. We now measure time in the smallest of units, and our concept of time is inextricably linked to our economy, productivity and the rhythms of everyday life.
Preindustrial societies were far less precise, with daylight and the passage of seasons governing people’s lives. It’s from the Romans we inherit the idea that a day begins at midnight. The Celts and Anglo-Saxons on the other hand marked the beginning of the day at sunset. Very different societies with very different attitudes to time and the natural world.
So, culture trumps mathematics, and popular opinion often flies in the face of reason and scholarship — that’s why urban myths, climate change denialism and fake news can thrive. On Twitter people were quite incensed that I could hold my position in the face of the new cultural norms. I do love stirring the pot!
Yet, debates over when decades and centuries start and end aren’t really important any more. The history of time is one of historical mistakes, mathematical and technological failures and triumphs, and cultural appropriation.
If people want to believe the millennium started on 1 Jan 2000 AD2, or that Jesus was born on 25 Dec 1 AD3, that’s fine. Just accept that these are cultural conventions, not mathematical or historical truths. Otherwise I will argue with you, because I am a pedantic bastard and the scholarly community is on my side.
Regardless, I don’t really care; I just enjoy arguing. As for New Year’s Eve and what decade tomorrow will bring… well, I like a good piss up and an impressive display of pyrotechnics as much as the next person, or did when I was 18. Now I prefer to be in bed at a more civilised hour.
So, for what it’s worth, I wish you all a happy new year. But a happy new decade… that can wait another year.
Cover photo by Fabrizio Verrecchia on Unsplash