
The New Depression

The business and political elite are flying blind. This is the mother of all economic crises. It has barely started and remains out of control.

We are living through a crisis which, from the collapse of Northern Rock and the first intimations of the credit crunch, nobody has been able to understand, let alone grasp its potential ramifications. Each attempt to deal with the crisis has rapidly been consumed by an irresistible and ever-worsening reality. So it was with Northern Rock. So it was with the attempt to recapitalise the banks. And so it will be with the latest gamut of measures. The British government – like every other government – is perpetually on the back foot, constantly running to catch up. There are two reasons. First, the underlying scale of the crisis is so great and so unfamiliar – and, furthermore, often concealed within the balance sheets of the banks and other financial institutions. Second, the crisis has undermined all the ideological assumptions that have underpinned government policy and political discourse over the past 30 years. As a result, the political and business elite are flying blind. This is the mother of all postwar crises, which has barely started and remains out of control. Its end – the timing and the complexion – is unknown.

Crises that change the course of history and transform political assumptions are rare events. The last came in the second half of the 1970s, triggered by the Opec oil price spike and a dramatic rise in inflation, which marked the end of the long postwar boom. Its political consequences were far-reaching: the closure of the social democratic era, the rise of neoliberalism, the discrediting of the state, the embrace of the market, the undermining of the public ethos and the espousal of rampant individualism. For the next 30 years, neoliberalism - the belief in the market rather than the state, the individual rather than the social - exercised a hegemonic influence over British politics, with the creation of New Labour signalling an abject surrender to the new orthodoxy.

The modalities of this present crisis are entirely different. Extreme as they may have appeared to be at the time, the economic travails of the 1970s were progressive rather than cataclysmic. The old system did not hit the wall, but became increasingly mired and ineffectual. What swept the social democratic era away was not the force de frappe of an irresistible crisis, but the steady rise alongside it of a new ideology and political force in Thatcherism - and Reaganism in the United States - and its victory in the 1979 general election.

In contrast, the financial meltdown of 2007-2008 demolished the neoliberal era and its assumptions with a suddenness and irresistibility that was breathtaking. The political class, from New Labour to the Conservatives, is standing naked. They are still clinging to the wreckage of their old ideas while acknowledging in the next breath that these no longer work. The financial crisis is a matter of force majeure; political ideas and discourse change much more slowly, even when it is obvious that the old ways of thinking have become obsolete. Meanwhile, there is no political alternative waiting in the wings, refining its radical ideas in think tanks ready to storm the citadels of power as there was in the 1970s, notwithstanding the fact that think tanks are now far thicker on the ground. Instead, it has been the mainstream which senses that neoliberalism no longer works, fatally undermined by events and, ultimately, the author of its own downfall. This crisis will have the most profound and far-reaching political consequences and will in due course transform the political landscape, but it remains entirely unclear in what ways and when that might be.

In all these senses the financial meltdown has far more in common with the Great Depression than the Great Inflation. When the financial crisis consumed Wall Street in 1929 and proceeded to undermine the real economy, engulfing Europe in the process, it was not accompanied by a radical shift towards Keynesianism, but rather a reassertion of sound finance orthodoxy, followed in due course by the adoption of protectionism. The political mainstream as represented by Labour's Ramsay MacDonald and Philip Snowden and the Conservative Stanley Baldwin all sang from the same hymn sheet. Only Keynes and a faction of the Liberal Party enunciated a plausible alternative. Eventually a programme of fiscal deficits and public works was pursued by Franklin D Roosevelt in the United States, but in Britain Keynesianism was not properly embraced until rearmament and the approach of war. Indeed, it was not until 1945 that the combined legacy of war and the Depression belatedly resulted in a fundamental political realignment and the birth of the social democratic era.

Since the financial meltdown dramatically intensified in September 2008, Gordon Brown has managed to ride the economic storm rather more successfully than the Conservatives, or, for that matter, than Tony Blair would have done. It is Vincent Cable, the Liberal Democrats' economics spokesman, however, who has indubitably emerged as the political sage, unafraid of confronting neoliberalism's shibboleths, demonstrating a clarity of mind and the political courage to tell things as they are, in a way that has escaped all other prominent politicians. Although Brown was the economic architect of the past decade, was responsible, more than anyone else, for its excesses, and was shaping up to be a rather disastrous Prime Minister, he displayed last autumn, at least initially, an agility of mind and nimbleness of foot that defied the expectations of those who believed he was capable of neither. He revelled in the sense of purpose and vision offered by the crisis, seemingly prepared to jettison the thinking that had imbued his previous decade as chancellor.

But Package Part I, widely hailed at the time and imitated elsewhere, proved woefully inadequate, and the financial system remains frozen. Meanwhile the waters are rising up the Good Ship UK, threatening to transform the banking crisis into a fiscal and currency crisis. It seems unlikely that, if that should happen, Brown will survive the next election.

Even if it does not happen, Brown faces a serious problem about his own past role, because Britain’s crisis has been greatly exacerbated by the soft-touch regulation, easy credit, runaway house-price inflation and overexpansion of financial services over which he presided and for which he is accountable. So far he has refused to admit or accept responsibility for his actions – he initially had the temerity (or foolhardiness) to argue that the UK was better placed than other countries to deal with the credit crunch, even though it has become abundantly clear since that the very opposite was the case. So while Brown remains in denial, the plausibility of his new turn, and his understanding of what is entailed, must be seriously doubted.

Indeed, after its initial boldness, the government now seems trapped by its past actions and its former ways of thinking. Brown's failure to accept the need to nationalise the banks suggests the limits of his new-found political courage, and his inability to embrace the logic and imperatives of the new situation. He is still a prisoner of his old timidity and his conversion to the neoliberal cause. It is his good fortune that the Cameron Conservatives have been hugely wanting in their response to the financial meltdown. Having spent his first years as leader of the opposition seeking to reassure the country of his centrist credentials, David Cameron, at the first whiff of gunfire, has turned on his heels, rejected Keynesianism and, at the very moment when events have shown Thatcherism to be deeply flawed and historically out of time, headed back to the Thatcherite womb of sound finance, arguing that a government must balance its books and that deficit financing, Keynesian-style, is reckless and irresponsible.

But all this, it must be said, is the small change of politics. The crisis threatens in time to sweep away the political world as we know it and those who fail to grasp its magnitude and meaning. Far more is at stake than the fortunes of a few leaders, be their name Brown or Cameron. Who knows where things will be this time next month, let alone next year or, indeed, in 2012? The financial meltdown now rapidly plunging the western world into what increasingly looks like a depression is the first great crisis of globalisation. There was plenty of warning. The Asian financial crisis of 1997-98 proved a salutary lesson about the dangers posed by huge capital movements that were subject to precious little regulatory control. Three economies capsized (South Korea, Thailand and Indonesia) and others stood on the brink.

There were other earlier warning signs, notably Mexico in 1995, when GDP fell by 9 per cent and industrial production by 15 per cent, following a run on the peso. These crises were blamed on the immaturity and fecklessness of national governments - in the case of east Asia on so-called crony capitalism (which, incidentally, prompts the question of how we should describe Anglo-American capitalism) - which the International Monetary Fund obliged to engage in swingeing cuts in public expenditure as a condition of their bailouts.

Yet what if such a crisis were to be no longer confined to the peripheries of global capitalism but instead struck at its heartlands? Now we know the answer. The crisis has enveloped the whole world like an uncontrollable virus, spreading from the US and within a handful of months assuming global proportions, at the same time mutating with frightening speed from a financial crisis into a fully fledged economic crisis. In so doing, it has undermined the foundations on which the present era of globalisation has been built, namely scant regulation, the free movement of capital, a bloated financial sector and immense reward for greed, thereby bringing into question the survival of globalisation as we now know it.

Enormous international flows of unregulated capital have capsized the international financial system - with disastrous consequences for the real economy - in a manner akin to the effect of a roll-on, roll-off ferry shipping too much water. We can now see the cost of free-market capitalism and light-touch regulation. Iceland may provide an extreme example of the consequences of the credit crunch but it also illustrates the dangers facing the more vulnerable economies, the UK included, in a deregulated world where the market rules: a small, open economy; a large, internationally exposed banking sector; an independent currency that is not a serious global reserve currency (of which there are only three); and limited fiscal strength. These propositions have constituted the core economic beliefs - from Thatcher and Lawson to Blair and Brown - that have informed policymaking over the past three decades and without which, it was claimed ad nauseam, an economy could not succeed. Heavy-handed regulation and an overbearing state would serve only to frighten off capital and condemn a country to slow growth, stagnation and global marginality. Now we know the fallaciousness of these claims and the consequences of "letting the market decide".

Like Iceland, albeit not as extremely, Britain has been living in a fool's paradise: a failure to regulate the banks and other financial institutions in any meaningful way, which allowed bankers to behave in a grossly irresponsible and avaricious fashion; a boom made possible only by a government-enabled credit binge in which people borrowed recklessly; a bloated financial sector that grew to represent over 8 per cent of the total economy and which was found to have been built on foundations of sand; an overvalued currency that made manufacturing exports uncompetitive and thereby resulted in an unnecessary and counterproductive contraction of the manufacturing sector, which must now be reversed; an absurd belief that boom and bust had been banished for ever, allowing the banks to turn a blind eye to the inflating of various asset bubbles and to display a profound ignorance of the history of capitalism; a chronic current account deficit that can no longer be compensated for by inward capital flows; monstrous salaries for those at the top of the financial and corporate tree, justified in terms of a trickle-down effect that remained a chimera and as the reward for risk that was, in fact, a reward for greed and failure; growing inequality, justified in the name of a more competitive economy, accompanied by declining social mobility in the cause of an open and flexible labour market; and, finally, the mushrooming of what can only be described as systemic corruption on a mega-scale, as the state ignored the gargantuan abuses of those who ran the banks and other financial institutions, while the regulatory authorities willingly colluded in their excesses.

This is the sad story of the New Labour era.

The ultimate cost of this debacle as yet remains unknown. What began as a financial crisis is threatening, as the government seeks to bail out a bankrupt financial sector, to become a currency crisis, with foreign investors concerned about the effects this might have on the value of sterling, and perhaps even worse, ultimately a sovereign debt crisis, with growing doubts about the UK’s financial viability. Until there is some end in sight to the financial crisis, and a line can be drawn under the banks’ indebtedness, we will not know the answer to these questions. One thing is clear, however: whatever the limitations of the social democratic era, it was never responsible for such an all-enveloping and cataclysmic crisis as the one that the neoliberal era – and the Thatcherites and New Labour – have managed to produce. After all the boasting about the virtues of the Anglo-American model of capitalism, the Grim Reaper has finally spoken: a boom pumped up by credit steroids and a bust that takes us back to the 1930s.

There are two key aspects to this crisis: national and global, with the latter promising to be rather more intractable. As far as national solutions are concerned, we are in uncharted territory, with close to zero interest rates, a Keynesian-style fiscal boost that may prove inadequate to the task and could well fail, a hugely indebted financial sector that threatens to leave us with an enormous future tax burden and a greatly expanded national debt. All of this, furthermore, must be addressed in the context of an open-market regime which is very different from those of previous eras, and which could render Keynesian-style national solutions ineffectual. What would greatly assist any national recovery is a co-ordinated global response to the crisis; in other words, global co-operation at the highest level. This cannot be ruled out, but it would be a brave person who would bet on it. It was exactly the lack of international co-operation that bedevilled recovery in the 1930s and eventually led to the Balkanisation of the world into regional currency and trading blocs.

The most important single question in this context is the relationship between the US and China. Will the Obama administration be able to resist the slippery slope of creeping protectionism? Will arguments over the revaluation of the Chinese renminbi be resolved amicably? If the answer is in the negative, then the global outlook will be very bleak indeed and so, also, as a result, will be the prognosis for national recoveries. Indeed, the prospects would look disturbingly like those of the 1930s, with growing international antagonism and friction and a continuingly intractable crisis at a national level, with only the very slowest of recoveries.

Around the world there is growing evidence by the week of a resort to national solutions at the expense of others: measures to subsidise industries that are in severe difficulties; the Buy American clause that was inserted by the House of Representatives into Barack Obama's latest package (though since weakened); the industrial action in Britain against foreign workers; the withdrawal of banks to their national homes; the attack by Timothy Geithner, the US treasury secretary, on China as a currency manipulator. No Rubicon has been crossed but the warning signs are clear. A retreat into protectionism and beggar-thy-neighbour policies will deliver the world into a second Great Depression.

So what will be the political effects of the financial meltdown? Some are already evident. Just as the Great Inflation of the 1970s played to the tunes and concerns of the right, with its invocation of the market, the New Depression suggests the opposite, the inherent limitations of the market and the indispensability of the state. Indeed, the speed with which the neoliberal refrains and invocations have unravelled has been breathtaking. The single most discredited aspect of the social democratic legacy was nationalisation, and yet the government, with the most extreme reluctance, has been obliged to nationalise Northern Rock and partially nationalise the Royal Bank of Scotland and the merged Lloyds TSB and HBOS. Who would have ever imagined, at any point during the past 30 years, that no less than the financial commanding heights of neoliberalism would have ended up in the hands of the state, with precious little opposition from anyone except a few disgruntled shareholders? Even now, however, the Labour government, still trapped in the ideological straitjacket of New Labour and displaying extreme timidity in the face of powerful vested interests, which has always been a New Labour characteristic, is running scared of the inevitable logic of the situation, namely that all the high-street banks should be taken into public hands until the mess is sorted out. Anything else leaves the public responsible for all the debts and risks, while the banks continue to be answerable to the very different interests of their shareholders. But such is the fury and depth of the crisis that this scenario is highly likely.

The state is experiencing an extraordinary revival. The credit crunch is the most catastrophic example of market failure since 1945. It became almost immediately obvious to wide sections of society that there was only one institution that could potentially sort out the mess: the state. Far from being a rational distributor of resources, the market had proved the opposite. Far from bankers and financial traders embodying the public interest, they have been exposed as irresponsible and dangerous risk-takers whose primary motivation was voracious greed. If trade unionists and the nationalised industries were the demons of the 1970s, bankers and the financial sector have assumed the mantle of public enemy number one in the late Noughties. In fact, the irresponsibility of bankers, and the damage they have inflicted on the economy, hugely exceeds anything that the unions could possibly be held responsible for in an earlier era. Meanwhile, the fallen heroes of the pre-Thatcher era, most notably Keynes, are duly being exhumed, restored to their rightful position, and pored over for their ability to throw light on the present impasse and what might be done; if the recession turns into a depression, Marx will once again become required reading.

This political shift is not just a British phenomenon, but a more general western one. The most striking feature of President Obama's inaugural speech was the way in which it embraced and legitimised African Americans for the first time in American history. But it also had another powerful theme, namely its invocation of the public interest and public service. After decades during which American political discourse has been dominated by the language of individualism and the market, it came as a shock to hear a US president articulate a very different kind of philosophy, renouncing private greed in favour of the public good. Obama's election can in part be seen as a response to the failure of the neoliberal era, as well as of Bush's neoconservative agenda; certainly his election represents a remarkable shift to the left in US politics, in contrast not just to Bush, but to every recent US president, including Reagan, Bush Sr and Clinton. That Obama is the first African-American president also represents a remarkable redrawing of the political landscape. There is no more powerful, nor more difficult, way of redefining society, or of embracing a new form of representation, than to include a racial minority that has been excluded.

This brings us finally to what might be the longer-term global consequences of the crisis. Again, we are inevitably stumbling around in the dark because so much depends on whether the recession metamorphoses into a fully fledged depression and in what way and shape the world eventually emerges from the debacle. That said, two key points can be made. First, the credit crunch signals the demise of the Anglo-American, neoliberal model of capitalism, which has exercised a hegemonic influence over western capitalism and been the blueprint for globalisation since 1980. Because of its catastrophic failure there seems very little chance of its resurrection. The process of recovery - whenever that might be - will be accompanied by an overriding concern to ensure that the events of 2007-2009 are not repeated in the future, just as happened in the US in the 1930s with the strict regulatory framework that was introduced for the banks after their comprehensive failure in 1929. This will include the search for a new global regulatory framework that controls and constrains international movements of capital, as well as strict controls over the financial sector at a national level. A new set of political priorities - and with it a new political language - will be born.

Meanwhile, the influence and prestige that the US, and to a far lesser extent Britain, have enjoyed will vaporise in the same manner as their neoliberal model. Their 30-year project has failed and they will be obliged to pay the price in their reputation and the esteem in which they are held. The countries of the former Soviet Union and the casualties of the Asian financial crisis that were forced to swallow the neoliberal medicine will have good reason to feel aggrieved and resentful. The west has been forthright in accusing the non-western world of corruption. The financial meltdown suggests that the west has been guilty of huge hypocrisy. Systemic corruption has lain at the heart of the western financial system. An entirely disproportionate and extortionate level of bonuses has ensured the enormous enrichment of top executives in the financial sector, all in the name of reward for success, when in fact it was the reward for failure. In addition, we have had the collusion of the credit-ratings agencies; a regulatory system characterised by its failure to act as any kind of constraint; and governments that ensured the continuation of this web of relationships and applauded its achievements. The corruption was on a breathtaking scale as evidenced by the size of the bailouts required to rescue the banks. It will be difficult for western governments to make these kinds of accusations of others in the future. That Obama represents such a voice of hope will help to mitigate the inevitable ill-will towards the US, but this should not be exaggerated amid the euphoria surrounding developments in Washington.

The second point is more far-reaching. It is doubtful whether we can still describe ourselves as living in the American era or, indeed, the Age of the West. If not yet quite over, both are certainly drawing to a close, and it seems likely that the effect of the financial meltdown will be to accelerate the rise of China as a global power. The contrast between the situation in China and that in the US could hardly be greater, even though it has been partially obscured by the depressive effect of the western recession on Chinese exports and on China’s growth rate. While the US economy is contracting, China’s grew at roughly 9 per cent in 2008 and is projected to grow at about 6 per cent in 2009. Its banks, far from bankrupt like their US counterparts, are cash-rich. China enjoys a large current account surplus, the government’s finances are in good order and the national debt is small. This is a crisis that emanates from the US and whose impact on China has been essentially indirect, through the contraction of western markets. It is the American model that has failed, not the Chinese.

One of the factors that intensified the Great Depression, and indeed was part cause of it, was Britain's growing inability to continue in its role as the world's leading financial power, which culminated in the collapse of the gold standard in 1931. It was not until after the war, however, that the US became sufficiently dominant to replace Britain and act as the mainstay of a new financial system at the heart of which was the dollar. The same kind of problem is evident now: the US is no longer strong enough to act as the world's financial centre, but its obvious successor, namely China, is not yet ready to assume that mantle. This will undoubtedly make the search for a global solution to the present crisis more difficult and more protracted.

Martin Jacques's new column will be published fortnightly in the New Statesman. His book "When China Rules the World: the Rise of the Middle Kingdom and the End of the Western World" will be published in June (Allen Lane, £25)

The global downturn in numbers

The IMF predicts that global growth in 2009 will be the worst since the Second World War

Up to 40 million people will lose their jobs this year, according to the International Labour Organisation

The G7 and BRICs countries have pledged 1.5% of GDP this year towards fixing the crisis

In the UK, 45,000 homes were repossessed in 2008; another 75,000 are expected to be taken in 2009

Several major global banks collapsed, were sold or were nationalised during 2008

A record number of European companies are expected to fail this year; an additional 62,000 are expected to fail in the United States

UK company failures increased between late 2007 and late 2008

Chinese exports dropped during January

UK interest rates stand at 1% (down from 5% in October 2008). In the US, rates have fallen to between 0 and 0.25%

How the crisis unfolded

9 August 2007 European Central Bank pumps €95bn into the banking market

13 September 2007 Run on Northern Rock begins when it is revealed that the bank has requested emergency support from the Bank of England

21 January 2008 FTSE suffers worst falls since 11 September 2001

February 2008 Northern Rock nationalised

17 March 2008 JP Morgan Chase takes over the US investment bank Bear Stearns

12 July Mortgage lender IndyMac collapses - the second-biggest bank failure in US history

7 September Financial authorities step in to rescue Fannie Mae and Freddie Mac

9 September Bradford & Bingley becomes second British bank to be nationalised

15 September Lehman Brothers files for bankruptcy

16 September AIG, biggest insurance firm in the US, receives $85bn rescue package

3 October 2008 US government announces $700bn Troubled Assets Relief Programme

8 October UK launches its first bank bailout plan, making £50bn available

October 2008 Iceland's banks collapse. IMF extends £1.4bn ($2.1bn) loan a month later

24 November Alistair Darling announces a temporary cut in VAT from 17.5 to 15 per cent

23 January 2009 UK enters recession

28 January US House of Representatives passes Barack Obama's $819bn stimulus package

5 February UK Monetary Policy Committee votes to cut interest rates to 1 per cent - the lowest in over three centuries

Michael Harvey

Martin Jacques is a journalist and academic. He is currently a visiting fellow at the London School of Economics Asia Research Centre and at the National University of Singapore. Jacques previously edited Marxism Today and co-founded the think-tank Demos in 1993. He writes the World Citizen column for the New Statesman. His new book on the rise of China, When China Rules the World, will be published in June.

This article first appeared in the 16 February 2009 issue of the New Statesman, The New Depression


The silver scent of fear

Learning to live with epilepsy.

I was swimming in the cool, still water of the lake. I was 12 and it was my second summer at sleep-away camp. New York City is roasting and humid in July and August and so, like many of my peers, I was lucky enough to be sent off to Maine for eight weeks. The trouble was, I didn’t feel lucky. I hated Camp Fernwood – but my mother had gone there, and I was a nice kid, and I didn’t want to let her down. So I spent a lot of time, during those beautiful summers, feeling very anxious.

One afternoon, a different sort of anxiety came over me as I paddled in Thompson Lake. The memory is crystal clear, or so I tell myself. I was not far from the wooden dock. I was on my own. In an instant – a long instant – everything changed. My body changed, for a start: my heart was pounding and my vision narrowed, as if I were staring down a tunnel. I was inside of myself, and outside of myself, in a way that I had never felt before; and in the back of my throat and up towards the bridge of my nose, there was what I will call an elusive silver scent, distant and clean.

The world became a globe of terror. I wasn’t scared. I wasn’t anxious. I knew what those things felt like, and this was something else. Now I was more frightened than I had ever been. I would learn to know that terror well; nothing would ever alleviate it. Familiarity did not bring peace. My brain was making terror. There would be no escape from that.

None of these words is adequate to describe what happened to me then. Nearly 40 years have passed and I have never found the words to capture the sensation of that first seizure – and every seizure since. That first time, I didn’t know I was having a seizure. I didn’t know the word “epilepsy”. I pulled myself out of the water, somehow getting to the dock and up on to dry land. I didn’t tell anyone, just then. Everyone knew me as a worried, pain-in-the-arse kid, anyhow. Why make things worse?

A few days later, I went to see the camp nurse and told her what had happened to me. (I loved going to the camp nurse. If you were in her little cabin, you didn’t have to play tennis or softball or sing camp songs.) That summer, the nurse had her husband with her, a doctor, who was taking his summer vacation by the lake shore in Maine. She called him into the room with us and he listened. Eventually I saw my own doctor. Not long after that, my mother and I sat in the office of a paediatric neurologist. He was the first person who said epilepsy to me.

It is only now, in retrospect, that I realise how lucky I was that my mother – who was even more anxious than I, in general – did not seem unduly alarmed. At 12, I had no idea that, for many with the condition and their families, epilepsy casts a dark shadow; that a diagnosis carries the legacy of the days when sufferers were not allowed to marry, or were confined to lunatic asylums.




In the United Kingdom, there are about half a million people with epilepsy, although the term can mean many different things. There are more than 40 different kinds of seizure and these can be divided into two broad groups: focal seizures (which are also called partial seizures) and generalised seizures. Figures vary, but roughly two-thirds of those with epilepsy have focal seizures and a third have generalised seizures. Seizures are surges of electrical activity in the brain. The pioneering British neurologist John Hughlings Jackson, who died in 1911, put it succinctly: “Epilepsy is the name for occasional, sudden, excessive, rapid and local discharges of grey matter.”

My episodes involve simple focal seizures that happen in the temporal lobe of my brain. Generalised seizures affect the whole brain and cause a loss of consciousness – the muscles of the body may relax completely, or they may jerk and cause the person to convulse. The latter is perhaps the “classic” idea that most people have of epilepsy, and it is the image that has led to epileptics (a term that is disputed) facing discrimination, throughout history and in many cultures.

In the ancient world, it was sometimes known as the “sacred disease”, but as early as 400BC physicians began to believe that epilepsy might have an organic, rather than a divine, cause. Julius Caesar’s collapse in the heat of battle in 46BC has been attributed to a seizure (though it has recently been argued that he had a series of mini-strokes); Joan of Arc’s visions may have been the result of epilepsy; the visual and auditory hallucinations of Vincent Van Gogh might have been caused by the condition; Dostoevsky has been described as the best-known epileptic in history.

As Colin Grant writes in his fine new book about the condition, A Smell of Burning, people with epilepsy are often presented with a list of this sort, as if it offered encouragement: “Look at Van Gogh, look at Caesar, look at the abolitionist Harriet Tubman – they still got on with their lives.” But this can be cold comfort. Aside from the way in which epilepsy (especially generalised seizures) can limit a person’s life, there is still a great deal of stigma attached to the disease, even in the 21st century.

It is a stigma that Ley Sander has encountered often. Sander, a Brazilian who has lived in the UK for 30 years, is a professor of neurology and clinical epilepsy at University College London; he has been the medical director of the Epilepsy Society since 2012 and also leads the World Health Organisation Collaborating Centre for Research and Training in Neurosciences in London.

He is a charming man, with bright eyes and salt-and-pepper hair. His easy smile and mischievous sense of humour put both patients and journalists at ease – but he is serious when it comes to the treatment of epilepsy and the discrimination that his patients can face. Fellow physicians are often startled that he has chosen to specialise in the disorder. They assume that he must have a personal or familial connection to epilepsy. He does not.

“It’s still a hidden condition,” Sander says. “People don’t have a problem talking about Parkinson’s, or HIV, but epilepsy – not yet. That’s very common in all sorts of societies. It remains in the shadows. I have a number of eminent people who come to my clinic, from all walks of life, and as soon as you talk to them about ‘coming out’, or being a role model, they refuse to be involved.

“I had a situation not long ago, with one very eminent person. I thought I had persuaded this person to speak out. But within two or three hours of our conversation, I had his agent on the phone, saying he was going to sue me for breach of medical confidentiality. I had not done anything – we had only discussed it.”




We are sitting in Sander’s airy office at the Chalfont Centre in the village of Chalfont St Peter, Buckinghamshire. The centre, a complex of nondescript buildings ten minutes’ drive from Gerrards Cross, is much more remarkable than it initially looks, as I discovered when I first visited as a patient in the spring of 2015. After I was diagnosed with epilepsy at 12, I remained on medication until I was in my early twenties, but gradually weaned myself off the tablets when it became apparent that my seizures had disappeared. This is fairly common in juvenile epilepsy. Then, a couple of years ago, without warning, they returned, like a troublesome friend from my youth showing up on Facebook, certain that we’d want to be mates again.

The seizures seemed identical to what I had experienced when I was so much younger – the same, indescribable disorientation and terror. I wish I could better express the way they feel: like being shut out of one world and shoved into another, or like shooting down some kind of wormhole of consciousness.

For about 20 minutes after they occurred, I would lose language. The names of places, or of people I knew as well as my own name, would vanish. In the aftermath, there came a kind of exhaustion that perhaps best resembled a hangover; my husband would tell me that I looked pale and drawn. Because I am a writer, I found the brief aphasia the most upsetting aspect. What if the words never came back? They always did, but that never diminished the fear.

Occasionally I had a seizure in public – while teaching, say, or doing an interview – and I would cover for my sudden silence, my sudden pallor, by saying as soon as I could that I was very tired, that I’d had a bad night, that I was sorry. It was a measure of friendship if I felt that I could tell someone what was going on. I would feel better if I could be touched, if my hand could be held, if I could feel another’s physical presence. Worst of all and most fearful was to be alone, in an empty house. Were you scared when you saw The Shining? Right. Like that.

I looked for a trigger – did they come when I was particularly stressed? When I was especially relaxed? There was no pattern, at least not one I could discern.

My GP sent me to the National Hospital for Neurology and Neurosurgery in Queen Square, London. There I met Professor Sander and his colleagues – and perhaps, if I’m honest, I’d expected them to send me away with reassurances that my seizures were nothing to worry about. Was this because I didn’t wish to acknowledge that my epilepsy had returned? I suppose so, though I had never felt the stigma of the condition, at least consciously. (In 2007 I published a novel called Seizure, which I don’t think I would have done if I’d wanted to keep quiet about the whole business.)

Yet anything that affects the brain in the way that epilepsy does can’t be brushed aside. The doctors at Queen Square took my condition very seriously. I was put back on medication straight away and sent for two days of testing at the Sir William Gowers Centre, which is part of the Chalfont Centre. An NHS facility, it is run by a partnership between University College London Hospitals and the Epilepsy Society. I was affected by the level of care I saw there – from doctors, nurses, support staff. Many patients, more badly affected by epilepsy than I am, were there for many days or even weeks as their condition was monitored.

The unit has 26 beds and offers video-EEG telemetry (in which the electrical activity of the patient’s brain is monitored while he or she is being videoed), advanced MRI scanning, drug-level monitoring, neuropsychiatry and psychology. Each year, it admits over 1,300 patients from all over the UK and Ireland for assessment and treatment. Although its low buildings are nothing special to look at, its comfortable sitting room opens out on to a beautiful view of the Icknield Way, an ancient pathway that runs from Buckinghamshire to Norfolk.

The centre is one of the world’s oldest facilities for the treatment of epilepsy. The National Society for the Employment of Epileptics (now the Epilepsy Society) was founded in London in 1892; its first task was to establish a “colony” where people with the condition could live and work, because this was a time when words such as “degenerate”, “idiot” and “lunatic” were used almost interchangeably with “epileptic”.

On the walls, there are black-and-white photographs of early-20th-century residents shoeing horses, ironing and playing golf or football. Back in those days, when the place was primarily residential, rather than diagnostic, there were as many as 450 people living there. Now there are just 90 permanent residents, Sander tells me. They must be severely affected by the disorder to qualify for admission.

But understanding the condition – even in the 21st century, when it seems that medicine is so advanced – is extremely difficult. Sander, one of the leading experts in the field, confesses that offering treatment too often feels like firing “a blunderbuss”. Drugs are designed to work for a wide variety of conditions; as he tells me, drug companies want a product that works as broadly as possible, because that will bring in the most income. If you have to develop drugs that are designed for a small number of patients, that’s very expensive.

Furthermore, the causes of epilepsy – like so much else about the workings of the brain – are still little understood. Seizures happen when there is a sudden interruption in how the brain normally works but what provokes this is often a mystery, unless fits are brought on by brain injury or a tumour. Epilepsy may be hereditary but this, too, can be hard to discern, as the condition was often kept secret in families.

“I myself feel like a shaman at times,” Sander says, “because you are working in the dark and you hope that what you do will work. Dear Mary, I say, or dear John, I know you have this seizure type; we’ll try this drug and it may work. We don’t know why, if it does; and in the best-case scenario I can offer a 50 per cent chance that it will work. So I could say that even if I tried herbal tea with that person, I might get the same outcome.”

Sander told me that he didn’t expect to see or find anything in the tests I had at Chalfont: a 24-hour EEG, an MRI scan, memory and psychological tests. But, he said, at least if something about my condition changed for the worse in the future, we would have a baseline from which to work.




Even when drug treatment is successful, there can be problems. Colin Grant’s book is not only a history of epilepsy and the way it has been perceived and treated across cultures and centuries; it is also the story of his younger brother Christopher, who died as a result of epilepsy nearly a decade ago. A Smell of Burning paints a portrait of Christopher as a vivid and original young man who resisted treatment for his condition because the drugs he was given left him, as neurologists say, “knocked off”: dulled, sedated, his sense of self disrupted.

“Many people I spoke to said they would rather risk the occasional fit, or seizure, and be fully, 100 per cent alive and articulate than have a life that was – well, living at only 80 per cent,” Grant tells me when we meet. “I think that’s a very human response. But with Christopher, it drove his doctors and my siblings and my parents mad. They couldn’t understand it.”

It is Sander’s hope that the blunderbuss approach that Christopher resisted will change in the next decade or so. “It’s very important to put epilepsy in context,” he says. “Epilepsy is not a disease on its own. It’s a symptom, really a complex of symptoms. So in the old days, for instance, anaemia was a symptom complex” – that is, the aggregate of signs associated with the whole picture of a disease – “[but] it’s now just a symptom. We wouldn’t assess someone saying, ‘We’re going to find out why you have anaemia.’ We want to know what the anaemia is a symptom of, and then have a treatment for the cause. We have not reached that stage with epilepsy. Things will change in the next five or ten years, with progress in genomics – and then we’ll have a much better diagnosis.”

Yet even today, without such developments, when it comes to finding out the causes of epilepsy and how it might best be treated, the Sir William Gowers Centre offers a high level of sophistication. Magnetic resonance imaging (MRI) uses strong magnetic fields and radio waves to produce detailed images of the inside of the body; many hospitals have this technology but, as Sander explains, imaging departments may have to do heads, fingers and livers, all in a day. “So you might not be able to do as many protocols for imaging as you can in a place that specialises. Our scanner is set up to do epilepsy only. A good analogy is with an orange: if you slice an orange in two planes, you’re likely to miss a seed, especially if you do your slices 5mm apart. But if you do a scan in several planes, and you do it to half a millimetre, you’ll find the seed.”

Some forms of epilepsy can be treated with surgery and the Chalfont Centre is the main facility in the UK for those who undergo these procedures. Sander sounds a note of caution. “Many patients, when they arrive, have spoken to Dr Google, and so they hear that this treatment is out there. But often [they have] very unrealistic [expectations]. More often than not, I have to tell them, ‘Sorry, you are not a candidate for this.’ Or someone is a good candidate, but they’re afraid.”

The neurosurgeon Henry Marsh echoes Sander’s sentiments. There is “no reliable data” on the percentage of patients who are suitable for such surgery, “partly because it is a question of judgement as to when epilepsy is judged ‘refractory’ – ie, not responding adequately to drug treatment – and also how early on you should consider surgery in such cases. Probably fewer than 5 per cent of people with epilepsy will be considered for surgical treatment,” he says.

Deciding to operate – as Marsh writes in his memoir, Do No Harm – is always a hugely complex, if not the most complex, part of the process. To come to such a decision, “You need an epilepsy neurologist, a neurosurgeon, a psychologist, a neurophysiologist and a neuro-radiologist. You need to find where the epilepsy is coming from. It is not always coming from an abnormality seen on the brain scan. You may need to insert electrodes into the brain, or on to the surface of the brain, to try to trace where the fit starts. You then need to decide whether it is safe to remove that part of the brain.”

Colin Grant observes this caution directly when, in the course of researching A Smell of Burning, he attends a review meeting at Queen Square of the kind that Marsh describes. Six cases are discussed; none is put forward for surgery. The team, he writes, “had erred on the side of ‘bad brain is better than no brain’”.

For the rest, such as myself, there is the prospect of a lifetime on anti-epileptic drugs. This works for about 70 per cent of patients, according to the Epilepsy Society. I am fortunate that my treatment has been successful and smooth. My seizures have stopped completely and I can sense – I don’t quite know how – that I won’t have one. I realised that, after my seizures returned (and before I went back on medication), they were always in the offing, even if I wasn’t having one. This is hard to explain, but now that I’m on medication, I just know the seizures aren’t “there”. I now see Professor Sander as a patient only once a year.

There are, however, complications to treating epilepsy other than the problems of non-compliance and the risks of surgery. Cultural attitudes to the condition vary widely and, as both Grant and Sander relate, even today there are many people who believe that epilepsy is a result of spirit possession or a curse. Grant’s family members were devout churchgoers and belonged to a Pentecostal congregation. When Christopher was 19 he had a seizure one Sunday morning. Grant writes that he arrived at church to find the congregation “weeping and wailing whilst the two elders called upon God to free Christopher from the devil’s grip”.

This is a situation that Sander confronts more often than you might think. He tells me the story of a young man who works in the City. “He has epilepsy, and he’s my patient. It was very difficult to convince him about drugs until I found out I could say, ‘Well, this drug – djinns don’t like it.’ He comes from an Asian background and his aunties [and] his mother would say, ‘This a djinn,’ when he had a seizure. So I promised him that the djinns don’t like this drug. And he came back and said: ‘You were right.’ But one of my registrars at the time argued that this was unethical, to engage with this belief. I said to the registrar that I’m only with the patient for 15 or 20 minutes. He will go back to his mother, his aunties; they will carry on talking for the next six months about the djinns. So I don’t stand a chance unless I do, too.”

Grant says almost exactly the same thing to me about his own mother. “My way of thinking would jar with her. She has a way of understanding that’s developed over many, many years. You can’t disabuse someone of that overnight.”




I understand the resistance to the term “epileptic”. It implies that the condition is definitive; that the whole person – my whole person – is folded inside the experience of seizure. Those with the condition have fought hard, over centuries, over millennia and into the present day, to live ordinary lives, to hold down jobs, to marry, to have children.

Yet I accept the term, too. I know that I would not choose to be without it. Certainly, I would not be who I am, who I consider myself to be, without it. I think it was what made me a writer: not only because I have tried and failed, over and over again, to describe what is going on inside my skull when I have a seizure, but also because I feel it has given me a profound understanding of the subjective nature of consciousness.

Confronted with the great difficulty that so many with epilepsy face, I know this seems like speaking my privilege, as the saying goes. Yet this is the truth of my experience. Maybe, I find myself thinking, it is the truest thing about me.


Erica Wagner is a New Statesman contributing writer and a judge of the 2014 Man Booker Prize. A former literary editor of the Times, her books include Ariel's Gift: Ted Hughes, Sylvia Plath and the Story of “Birthday Letters” and Seizure.

This article first appeared in the 29 September 2016 issue of the New Statesman, May’s new Tories