Hang on a second: clocks at a Hong Kong clock and watch fair. Photo: Getty

The pros and cons of leap seconds

The slowing pace of the earth’s spin means that occasionally we have to add on a second – but should this practice continue?

The British science minister, David Willetts, wants your input on an issue you’ve probably never even thought about. The question, in essence, is this: would you care if, in 800 years’ time, the sun was at its highest point overhead at 1pm, rather than today’s 12 noon?

There’s an international scientific kerfuffle over this. It is prompted by the changing pace of the earth’s spin. The moon and sun pull on our planet, slowing its rotation and giving us an ever-lengthening day. The effect is tiny – adding less than two-thousandths of a second per day – and it is not consistent. Sometimes, the rotation even speeds up for a while. We’re not sure why, but we think it is because interactions between earth’s liquid iron core and the rocky mantle that surrounds it can exert an accelerating effect. Ocean currents also seem to speed up the pace at which the world turns. In the long term, though, we know the days are getting longer. As a result, occasionally, to keep our clocks in sync with when we expect sunrise and sunset to occur, we have to add a “leap second”.
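To see why these extra ticks come along every year or two, it helps to sketch the arithmetic. A rough back-of-the-envelope calculation, taking the column’s two-milliseconds-a-day figure as an upper bound (the true excess varies from year to year), runs as follows:

```python
# Rough sketch: if each day runs up to ~2 ms longer than the 86,400
# SI seconds our atomic clocks count, the mismatch between clock time
# and sunrise/sunset piles up until it is worth a whole second.
# The 0.002 s figure is the article's upper bound, not a constant.

EXCESS_SECONDS_PER_DAY = 0.002

days_per_leap_second = 1.0 / EXCESS_SECONDS_PER_DAY
print(f"about {days_per_leap_second:.0f} days "
      f"(~{days_per_leap_second / 365.25:.1f} years) between leap seconds")
# -> about 500 days (~1.4 years) between leap seconds
```

That squares reasonably well with the historical record: 25 leap seconds were inserted between 1972 and the time of writing, roughly one every eighteen months on average, though the pace is anything but regular.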

It sounds easy but it’s not. For 14 years, countries have been debating whether the practice of adding a leap second should continue. Shoehorning an extra second into computer clocks can create software glitches with widespread effects. In 1998, for instance, the insertion of a leap second caused a mobile-phone blackout across the southern United States because different regions were suddenly operating with time differences outside the error tolerances. Then, in 2012, an airline’s booking system went belly-up for hours after a leap-second insertion. The US Department of Defense has argued vociferously that the leap second compromises the “safety and reliability” of certain systems; scaremongers talk of missiles and air-traffic control systems going awry during some future adjustment.
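A minimal sketch of the programmers’ predicament, using nothing beyond Python’s standard library: the official UTC broadcast inserts the extra second as the timestamp 23:59:60, a value that many datetime implementations simply cannot represent.

```python
# Illustration (mine, not the article's): the June 2012 leap second
# arrived as the timestamp 23:59:60 UTC, one second beyond the range
# that Python's datetime type, like many others, will accept.
from datetime import datetime

stamp = "2012-06-30 23:59:60"  # the leap-second instant itself
try:
    datetime.strptime(stamp, "%Y-%m-%d %H:%M:%S")
except ValueError as err:
    print("rejected:", err)  # "second must be in 0..59"
```

Every system therefore has to fudge the insertion somehow, by stalling the clock, repeating a second or smearing the extra time across many hours, and each fudge is a fresh chance for two machines to disagree about what time it is.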

One solution to this is to let our clock readings gradually drift away from any association with the position of the sun in the sky. After all, who cares?

Well, you – perhaps. Britain is one of very few nations that have battled to keep the leap second. Most countries are happy to let the clocks drift away from “solar time”. The reason for Britain’s resistance is largely to do with ministerial gut feeling about our sense of cultural heritage: the time of day has always been linked with the position of the sun in the sky, so why should we abandon that just because some programmers can’t do their job properly? In April, the UK government launched a public consultation to find out what you think (full disclosure: I am on the consultation’s oversight committee, checking that the process is fair and frank).

There are potential issues with abandoning the leap second. Human beings have always lived by sunrise and sunset; our biology responds to rising and fading light levels. Without leap seconds, or some other adjustment of time, noon in the year 4000 will occur in total darkness. Also, the sun’s position in the sky plays a role in the timing of certain religious observances. Whether the link to the numbers on a clock face matters in these instances is as yet unknown, hence the consultation. Can we justify dropping the leap second – and maybe redefining “noon” – just because of computer programming problems?

On the other hand, some will argue that we cope with time zones and daylight saving time; why would we care about a second every few years? That’s for you to answer, if you care enough to bother.

Michael Brooks holds a PhD in quantum physics. He writes a weekly science column for the New Statesman, and his most recent book is At the Edge of Uncertainty: 11 Discoveries Taking Science by Surprise.

This article first appeared in the 01 May 2014 issue of the New Statesman, The Islam issue


Haystack in a haystack: travels around the human genome

Siddhartha Mukherjee’s book is a tourist guide to the twenty-first century’s uncharted continent, the human genome.

My favourite quotation from Charles Darwin: “Ignorance more frequently begets confidence than does knowledge.” In that brief sentence, the founder of modern biology unknowingly summarised in advance the history of genetics, from the eugenical ideas of his half-cousin Francis Galton to Bill Clinton’s statement that the human genome sequence was “the most important, the most wondrous map ever produced by humankind”.

The eugenics movement led to disasters known to everyone. It is not yet dead: Francis Crick once claimed that “no newborn should be declared human until it has passed certain tests regarding its genetic endowment”, and our own government’s decision to deny child support to poor people irresponsible enough to have more than two offspring (the agent of the policy has four) is in the same tradition. As a reminder of our ignorance, the DNA chart looks more like a medieval atlas than a modern map – with geneticists, in unconscious parallel to Swift’s words, the geographers who “in Afric maps/With savage pictures fill their gaps,/And o’er inhabitable downs/Place elephants for want of towns”.

Siddhartha Mukherjee’s book is a tourist guide to the new Africa, the human genome. The chart of that continent does indeed have too many metaphorical elephants and a noticeable shortage of productive towns: there are only about 20,000 working genes in the conventional sense, rather than the millions once assumed to exist (and why do tomatoes have more than we do?). They are surrounded by vast numbers of more or less mysterious molecular beasts, some of them parasites that invaded long ago, others the mouldering corpses of once-noble creatures, and yet more – the so-called junk – known more by its anatomy than by what it actually does. Lengthy as this book is (and Mukherjee might have gained from turning to his own account of the genome’s ability to cut out redundant and repetitive sections), it gives a full and lively account of the development of the subject, from its birth in the 19th century to its infancy in the 20th and its uncertain adolescence in the 21st.

Mukherjee begins the book with a melancholy tale of the schizophrenia that attacked two of his uncles and his cousin, and caused his own father to worry that elements of the illness “may be buried, like toxic waste, in himself”. Other family members had blamed the madness of their relatives on the horrors of Partition in India in 1947, which led to millions of deaths. Now, however, it has become clear that a predisposition to the condition, and particularly to the variety known as bipolar disorder (doctors have abandoned the old name “manic depression”), has a strong hereditary component, and Mukherjee confesses that part of the impetus for writing The Gene: an Intimate History was a personal concern about his own offspring. In this it resembles his 2011 work on cancer, The Emperor of All Maladies, which he describes as a biography rather than a work of popular science.

The problem with genetics is that it lends itself too readily to anecdote. When teaching, I begin my own first-year course on the subject by telling the students: “I am a geneticist and my job is to make sex boring.” They look somewhat bemused, but after 20 lectures that fight through pedigrees, linkage mapping, population genetics, inbreeding, heritability, mutation and the like, I can tell that they agree heartily – and I’ve not even started to talk about the mechanics of sequencing or the horrors of bioinformatics, which have turned much of biology into computer science. Instead, to leaven the mix, and much as I secretly regret it, I plunge again and again into the Swamp of Storytelling and revel in colourful and often tragic tales of Sex, Age and Death (a phrase I once planned to use as a book title but made the mistake of mentioning to Bob Geldof, who stole it for one of his albums).

Mukherjee does the same, and often succeeds. I did not know that Gregor Mendel twice failed in his attempts to enter teacher training college; that the most celebrated donor to the notorious “genius” sperm bank of the 1980s, the Nobel prizewinner William Shockley, may well have had autism, another condition with some genetic component; nor that the human genome paper was the longest ever published in Nature. And I learned perhaps more than I needed to know about the sordid disagreements between public and private genome mappers, the latter anxious to make millions, even billions, from the map, the former seeing it as a public good. The good guys won in the end, though the American molecular diagnostics company Myriad Genetics managed to leap in just in time to patent the two genes that can cause breast cancer when they go wrong.

On his trek across the genetical landscape Mukherjee gives an exhaustive account of the development of the modern science of inheritance. He has talked to many of the main players and gives deep insights into their moments of discovery. He does sometimes fall a little too hard for the latest scientific fashion, the most glittering (or tawdry) of which is epigenetics, the interaction between gene and environment. The term was coined by one of my own teachers at Edinburgh, C H Waddington, a student of fruit fly development. He found that a sudden heat shock to the embryos led to the appearance of a few flies with abnormal wings among the adults. By breeding from these, he could obtain stocks that in time produced such flies with no need for a shock, proof that an environmental stress could uncover hidden genetic variation. Unfortunately, the term has been hijacked and turned into a universal bridge between chemistry and biology. It is even used to revive the discredited idea that an organism can pass on characteristics acquired in its own lifetime.

That bridge goes far too far. The idea that genes respond to external stresses can be traced to the first days of molecular genetics, when it became clear that some genes regulate the activity of others when a creature is faced with a shift in food, or temperature, or some other external stress. In part it is a statement of the obvious: go out in the summer sunshine and the average Briton will get a tan, because skin cells respond to an alarm call by a protein that senses cellular damage to summon up dark granules of melanin around the DNA in order to protect it. His or her children, though, will be born pink. Quite why there has been such a fuss about a concept invented 70 years ago is not clear and is made no clearer here.

The book ends where it began, with schizophrenia. That illness is a microcosm of Darwin’s aphorism on ignorance. Freud blamed the condition on “unconscious homosexual impulses”, while others were just as confident that it was brought on by hostile mothers. Then the pendulum swung towards treating it as a genetic disease almost as straightforward as haemophilia. Some cases, like those described in Mukherjee’s opening pages, do indeed run in families, but many more are sporadic and appear among kindred that have no history of the problem. For the latter, the new genetics has revealed hundreds of gene mutations in affected children that are not present in their parents. For the former, the story is not so simple. Certainly, genes that predispose to the condition can be passed on, but various families may inherit different genes yet show similar symptoms, and particular combinations of genes rather than single elements may be responsible for the illness.

As this book puts it, the search for the genes behind mental disorder is not like searching for a needle in a haystack, but for a haystack in a haystack. Even for highly heritable attributes such as height, the quest for genes has been baffling, given that more than a hundred are known to be involved in such variation but together do not account for even a tenth of the number needed to explain the similarity of parents and children. Unpalatable as this may be for us mere Mendelians, almost every human gene, in effect, may influence almost every one of our attributes, which will be no fun for tomorrow’s molecular cartographers. Even so, and tangled as it already is, Mukherjee does a good job of cutting away the web of ambiguity and complexity that scientists have woven since the happy days when Mendel counted the ratio of round to wrinkled peas in the garden of Brno’s abbey.

Another Darwin quotation, this one from The Voyage of the Beagle:

There are several other sources of enjoyment in a long voyage . . . The map of the world ceases to be a blank; it becomes a picture full of the most varied and animated figures. Each part assumes its proper dimensions: continents are not looked at in the light of islands, or islands considered as mere specks, which are, in truth, larger than many kingdoms of Europe. Africa, or North and South America, are well-sounding names, and easily pronounced; but it is not until having sailed for weeks along small portions of their shores, that one is thoroughly convinced what vast spaces on our immense world these names imply. 

Very true, but for his genetical descendants the expedition has only just begun. 

Steve Jones is Emeritus Professor of Human Genetics at University College London and the author of “No Need for Geniuses: Revolutionary Science in the Age of the Guillotine” (Little, Brown)

The Gene: an Intimate History by Siddhartha Mukherjee is published by Bodley Head (608pp, £25)

This article first appeared in the 16 June 2016 issue of the New Statesman, Britain on the brink