Hang on a second: clocks at a Hong Kong clock and watch fair. Photo: Getty

The pros and cons of leap seconds

The slowing pace of the earth’s spin means that occasionally we have to add on a second – but should this practice continue?

The British science minister, David Willetts, wants your input on an issue you’ve probably never even thought about. The question, in essence, is this: would you care if, in 800 years’ time, the sun was at its highest point overhead at 1pm, rather than today’s 12 noon?

There’s an international scientific kerfuffle over this. It is prompted by the changing pace of the earth’s spin. The moon and sun pull on our planet, slowing its rotation and giving us an ever-lengthening day. The effect is tiny – adding less than two-thousandths of a second per day – and it is not consistent. Sometimes, the rotation even speeds up for a while. We’re not sure why but we think it is because interactions between earth’s liquid iron core and the rocky mantle that surrounds it can exert an accelerating effect. Ocean currents also seem to speed up the pace at which the world turns. In the long term, though, we know the days are getting longer. As a result, occasionally, to keep our clocks in sync with when we expect sunrise and sunset to occur, we have to add a “leap second”.

It sounds easy but it’s not. For 14 years, countries have been debating whether the practice of adding a leap second should continue. Shoehorning an extra second into the clocks of computer programs can create software glitches with widespread effects. In 1998, for instance, the insertion of a leap second caused a mobile-phone blackout across the southern United States because different regions were suddenly operating with time differences outside the error tolerances. Then, in 2012, an airline’s booking system went belly-up for hours after a leap second was inserted. The US Department of Defense has argued vociferously that the leap second compromises the “safety and reliability” of certain systems; scaremongers talk of missiles and air-traffic control systems going awry during some future adjustment.
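The kind of mismatch involved can be sketched in a few lines. This is an illustrative example only – not the code of any system mentioned above – but it shows the underlying problem: Unix-style civil time simply has no slot for the inserted second, 23:59:60 UTC, so ordinary clock arithmetic silently drops it.

```python
from datetime import datetime, timezone

# Illustrative sketch only. Standard civil timekeeping has no representation
# for the leap second 23:59:60 UTC -- it cannot even be written down here.
before_leap = datetime(2012, 6, 30, 23, 59, 59, tzinfo=timezone.utc)
after_leap = datetime(2012, 7, 1, 0, 0, 0, tzinfo=timezone.utc)

# Naive arithmetic says one second elapsed across midnight...
elapsed = (after_leap - before_leap).total_seconds()
print(elapsed)  # 1.0

# ...but on 30 June 2012 a leap second (23:59:60) was inserted, so two real
# seconds actually passed. Any two systems that absorb that extra tick
# differently will briefly disagree about what time it is -- the kind of
# mismatch behind the outages described above.
```

Because each operating system and each piece of software handles the extra tick in its own way – repeating a second, freezing the clock, or smearing the adjustment over hours – neighbouring systems can momentarily drift outside each other’s error tolerances.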

One solution to this is to let our clock readings gradually drift away from any association with the position of the sun in the sky. After all, who cares?

Well, you – perhaps. Britain is one of very few nations that have battled to keep the leap second. Most countries are happy to let the clocks drift away from “solar time”. The reason for Britain’s resistance is largely ministerial gut feeling about our sense of cultural heritage: the time of day has always been linked to the position of the sun in the sky, so why should we abandon that just because some programmers can’t do their job properly? In April, the UK government launched a public consultation to find out what you think (full disclosure: I am on the consultation’s oversight committee, checking that the process is fair and frank).

There are potential issues with abandoning the leap second. Human beings have always lived by sunrise and sunset; our biology responds to rising and fading light levels. Without leap seconds, or some other adjustment of time, noon in the year 4000 will occur in total darkness. Also, the sun’s position in the sky plays a role in the timing of certain religious observances. Whether the link to the numbers on a clock face matters in these instances is as yet unknown, hence the consultation. Can we justify dropping the leap second – and maybe redefining “noon” – just because of computer programming problems?

On the other hand, some will argue that we cope with time zones and daylight saving time; why would we care about a second every few years? That’s for you to answer, if you care enough to bother.

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 01 May 2014 issue of the New Statesman, The Islam issue


The one where she turns into a USB stick: the worst uses of tech in films

The new film Worst Tinder Date Ever will join a long tradition of poorly-thought-through tech storylines.

News just in from Hollywood: someone is making a film about Tinder. What will they call it? Swipe Right, perhaps? I Super Like You? Some subtle allusion to the app’s small role in the plotline? Nope – according to the Hollywood Reporter, the film has been christened Worst Tinder Date Ever.

With the exception of its heavily branded title (You’ve Got Gmail, anyone?), Worst Tinder Date Ever follows neatly in the tradition of writers manhandling tech into storylines. Because really, why does it matter if it was a Tinder date? This “rom com with action elements” reportedly focuses on the couple’s exploits after they meet on the app, so the dogged focus on it is presumably just a ploy to get millennial bums on cinema seats.  

Like the films on this list, it sounds like the tech in Worst Tinder Date Ever is just a byword for “modern and cool” – even as it demonstrates that the script is anything but.

Warning: spoilers ahead.

Lucy (2014)

Scarlett Johansson plays Lucy, a young woman who accidentally ingests large quantities of a new drug which promises to evolve your brain beyond normal human limits.

She evolves and evolves, gaining superhuman powers, until she hits peak human and turns first into a supercomputer, and then into a very long USB stick. USB-Lucy then texts Morgan Freeman’s character on his flip phone to prove that: “I am everywhere.”

Beyond the obvious holes in this plotline (this wouldn’t happen if someone’s brain evolved; texting a phone is not a sign of omnipotence), USB sticks aren’t even that good – as Business Insider points out: “Flash drives are losing relevance because they can’t compete in speed and flexibility with cloud computing services . . . Flash drives also can’t carry that much information.”

Star Wars: The Force Awakens (2015)

If you stare at it hard enough, the plotline in the latest Star Wars film boils down to the following: a gaggle of people travels across space in order to find a map showing Luke Skywalker’s location, held on a memory stick in a drawer in a spherical robot. Yep, those pesky flash drives again.

It later turns out that the map is incomplete, and the rest of it is in the hands of another robot, R2-D2, who won’t wake up for most of the film in order to spit out the missing fragment. Between them, creator George Lucas and writer and director JJ Abrams have dreamed up a dark vision of the future in which robots can talk and make decisions, but can’t email you a map.

Willy Wonka and the Chocolate Factory (1971)

In which a scientist uses a computer to find the “precise location” of the three remaining golden tickets sent out into the world by Willy Wonka. When he asks it to spill the beans, it announces: “I won’t tell, that would be cheating.”


Image: Paramount Pictures. 

The film inhabits a world where artificial intelligence has been achieved, but no one has thought to pull Charlie's poor grandparents out of extreme poverty, or design a computer with more than three buttons.

Independence Day (1996)

When an alien invasion threatens Earth, David Levinson (Jeff Goldblum) manages to stop it by hacking the alien spaceship and installing a virus. Using his Mac. Amazing, really, that aliens from across the universe would somehow use computing systems so similar to our own. 

Skyfall (2012)

In the Daniel Craig reboot of the series, MI6’s “Q” character (played by Ben Whishaw) becomes a computer expert, rather than just a gadget wizard. Unfortunately, this heralded some truly cringeworthy moments of “hacking” and “coding” in both Skyfall and Spectre (2015).

In the former, Bond and Q puzzle over a screen filled with a large, complex, web shape. They eventually realise it’s a map of subterranean London, but then the words “security breach” flash up, along with a skull. File under “films which make up their own operating systems because a command prompt box on a Windows desktop looks too boring”.

An honourable mention: Nelly and Kelly Rowland’s “Dilemma” (2002)

Not a movie, but how could we leave out a music video in which Kelly Rowland texts Nelly on a Microsoft Excel spreadsheet on a weird Nokia palm pilot?


Image: Vevo.

You’ll be waiting a long time for that response, Kelly. Try Tinder instead.

Barbara Speed is a technology and digital culture writer at the New Statesman and a staff writer at CityMetric.