
Where the bodies are buried

Whether you’re alive or dead, Sue Black knows who you are – as dozens of murderers and war criminals have discovered.

Even before she became an anatomy student, Sue Black was used to death. From the age of 13 she had worked every Saturday at a local butcher’s shop. On cold days, she would rush to pick up the livers from the incoming vans, the fresh organs warming her hands in the cold Scottish winter.

By the time she arrived at the University of Aberdeen, having lied to her worried parents that she had secured a full grant, she was already familiar with bones, blood and flesh. But what she saw inside David – the nickname she gave to the cadaver she was instructed to dissect – was very different.

She calls the inside of the human body an “amazing world”, a life story written in skin and tissue. Stretching out her pale forearms – she is red-haired and “tans as well as a snowball” – she shows me her freckles. Their position was decided in her mother’s womb: the cells settled in a layer of skin called the basal lamina, waiting to be activated by sunlight. “If you stay indoors and you never go outside,” she says, “well, you’ll always remain pale and interesting.”

Black, now 54, has made her career painstakingly learning to read these human stories. She is now Professor of Anatomy and Forensic Anthropology at the University of Dundee and one of Britain’s leading experts in human identification. She sees bodies that betray their owners – the veins on a paedophile’s hand, for example, which are more distinctive than a fingerprint – and bodies whose marks and scars become testimonies to murders and war crimes. She cannot help looking at the world as an anatomist: it always annoys her that political cartoonists put the gap in Tony Blair’s teeth in the wrong place.

Three deaths influenced Sue Black’s childhood and set the pattern for her career. The first was that of her grandmother – a tough old woman who, when she knew she was dying, told the young Susan that whenever she needed advice, she could turn to her own shoulder and talk to her. (She still does.) The second was a young mother called Renee MacRae, who went missing in 1976 with her son Andrew near Inverness, where Black grew up. “I can remember the police coming round and asking my father to look in the outhouses,” she says, her hands cradling a cup of tea in her university office.

The officers found no trace of MacRae and her son, there or anywhere else. The case remained dormant until 2004, when a new chief constable decided that there was enough intelligence to excavate a local quarry. Black was involved with the search but after the police moved tonnes of earth, they uncovered only a few bones – which belonged to a rabbit. The disappearances are now Scotland’s longest-running missing persons case. “Those kinds of things get under your skin,” she says. “You think there’s a family sitting with their life, in part, in a stutter. They just want their sister back . . . Whoever killed her is the only person, I suspect, who knows where she is.”

The final death that changed the young Sue’s life was that of a rat, beaten to death by her father, who had found it scavenging outside the hotel that he ran on the shores of Loch Carron. She remembers its eyes, its teeth, its tail, its fevered thrashing as it died. It left her with a fear of rodents, so she was stumped when, on reaching the fourth year of her anatomy degree, she was told to dissect the brains of hamsters and mice. She convinced her tutor to let her study human bones instead – and never looked back.

***

What Sue Black does is easy to explain but sometimes difficult to accomplish: she finds out who people are or, more often, were. After training as an anatomist, she was employed by the Foreign and Commonwealth Office and travelled to Kosovo, Sierra Leone, Thailand and Iraq to help identify the bodies of those killed in natural disasters and massacres. Her first big mission came in 1999, when a colleague, Peter Vanezis, was asked to collect evidence in Kosovo for a possible war crimes tribunal. He arrived at a barn in the village of Velika Kruša, in the west of the country, and found it filled with 42 decomposing bodies. He told his superiors that he needed help. He needed Sue Black.

She was by then the mother of three children, aged 15, five and three. With her husband working full-time and her parents living 120 miles away, she hired a nanny and got on the next plane. It was not a hard decision. “The girls have grown up knowing that we adore them but they also know that their dad has a life and their mum has a life, the same as they will have a life – or they do have a life now, because they’re much, much older.”

What she found in Kosovo was a scene of horror. There was a survivor from the barn massacre – a man who had made it to the corner of the room and had been shielded by his friends as Serbian troops sprayed the men with bullets, then tried to set the barn on fire. He lay still under their bodies until it was safe to emerge, many hours later. Black’s job was to see if the physical evidence corroborated his story.

That involved sifting through the remains with her fingertips, working on bodies that had been burned and partly eaten by local dogs and were now a boiling mass of maggots. There was no running water on site and there were snipers in the hills. There were also no toilets. On the first day, one of the police officers on the mission returned from the tree that the team had been using as a makeshift loo, beaming from ear to ear. He had found himself urinating on an explosive device. It had a tripwire that would have triggered if anyone walked down the road away from the barn, killing or severely injuring them. But the man was thrilled: at his age, he had managed to stop mid-flow as soon as he saw it.

During her time in Kosovo, Black took on the role of the team’s surrogate mother. “Everybody kicks in to a professional mode the minute you get into the car and you’re heading out to an event,” she says. “But when you’re in your lodgings at night, when people are being people rather than being professionals, there’s a different dynamic that goes on.” In that role, she says, she could tell them to stop drinking, have a proper meal, or go to bed. “And those buttons are ones that a mother can hit. What becomes quite disruptive within a team is when you have single, available, attractive women and you have men.”

She also helped the rest of the team deal with the emotional demands of the job. Once, she was conducting a post-mortem in a field. The subject was a toddler, still in red booties and a sleepsuit. Soldiers had chased the village children into the field and then used their heads for target practice while the adults were made to watch. Pausing for a moment from her work, she looked up and saw a line of policemen’s boots. One of the officers had broken down – he had a toddler at home – and his colleagues were sheltering him until he could continue. Black, however, was having none of it. She stood up and threw her arms around him, allowing him to cry in the open. Then she told him that he had to keep his work and home life separate.

When she is working on a difficult case, she has a mantra: “You didn’t cause this, you didn’t do this, you’re not responsible.” She keeps her professional life in the “work box” and, because of this, she professes never to have had a sleepless night as a result of the things she has seen. The crime writer Val McDermid, who has known Black for 20 years, says that she is “very good at compartmentalising . . . It’s that ability to not bring her work out of the building that makes it possible for her to survive.”

***

For the first half of her career, Black was mostly concerned with identifying the dead. But it can be just as important to identify the living – as in the case of Scotland’s largest paedophile ring.

Some time between 2005 and 2007, a man called Neil Strachan, who worked as an engineer with Crown Paints in Edinburgh, attached a personal hard drive to a computer at work. He forgot all about it, until one day the computer was sent away for repair. On the hard drive, the technician found a sexually explicit photograph of a child.

That discovery set off a chain of raids and arrests, leading to the trial of a group of men who had met online to swap indecent images and boast about their exploits. One of Strachan’s contacts, a man called James Rennie, had an email address beginning “kplover”, standing for “kiddie porn lover”. When the case was coming to trial, though, the police faced a challenge. Strachan had sent messages to Rennie indicating that he was not only looking at child sex abuse images but abusing children. “I might have found us a contact with two boys, two and four, willing to share,” he wrote once. Another time, he boasted of “having fun” with an 18-month-old boy; police found a picture of a man abusing a child roughly that age around New Year, which became known as the “Hogmanay image”. They desperately wanted to know if Strachan was the man in the photograph, because the penalties for making child pornography are far greater than those for merely viewing it.

But how? The images didn’t show the man’s face. For some unknown reason, however, the defence counsel had taken images of Strachan’s thighs – and although his legs were entirely unremarkable, in one of the images he was holding the photographic scale. And there, on his thumb, was the mark that betrayed him. He had a deformation of the lunula, the crescent-shaped white area at the base of the nail. So did the man in the Hogmanay image. The evidence went to court and in 2009, Strachan was convicted of the attempted rape of the 18-month-old and sentenced to life.

Black and her team now examine dozens of similar images every year and in 80 per cent of the cases they work on, their identification of an anatomical feature convinces the defendant to change his plea to guilty. She is the only member of the team who has children and again the mantra – “This is not something you caused . . .” – helps her, as does her day job in the dissecting room. “When you’ve worked in anatomy, where you spend your life with the deceased, when you then work in forensic anthropology, where you see individuals in all sorts of circumstances, whether it’s in burnings, whether it’s in explosions, whether it’s in murder, suicide, whatever it may be, all of these serve to help you find that ability to retain a detachment.”

Some of Black’s opinions are unexpected, such as her belief that defendants in rape and child abuse cases should not be named unless they are found guilty. “I can’t think of anything worse for a man than to be wrongly accused of being a child abuser,” she says. “Once that label’s been put on you . . . even though you’re found innocent, in the public’s mind there is still always this: ‘Is there no smoke without fire?’” She is wary, too, of investing too much in cases and feeling tempted to overegg the science or her certainty. “It’s incredibly important that we only say things that are backed up by research, because to put the wrong person on the wrong side of bars is unacceptable. That’s not justice working, that’s injustice.”

In almost all of her work, the forensic evidence is just part of a larger case built by the police. This can have unexpected consequences, as in an early case that used vein pattern analysis. “The very first one we did was a case of alleged child abuse where the girl alleged that her biological father was abusing her and she – bless her – had her Skype camera on her computer. And I don’t know if you know, but if you run it in night mode, it goes into infrared, so you had infrared capture through the night. And a picture was picked up on the camera at about half past four in the morning of a hand coming in and interfering with the girl under the covers.”

The infrared camera picked up the perpetrator’s hand and, from her years in the lab, Black knew that the veins that were visible were very distinctive. Her team compared the blood vessels in the images with the defendant’s. They matched. “But what I had no research on – and didn’t present [in court] – was what the likelihood was of anybody having the same veins, because we simply didn’t know,” she says.

After some back and forth between the judge, the prosecutors and the defence, the vein match was ruled admissible. “So the jury heard it. The jury then went away and they came back with a not guilty verdict.”

Black and her team wondered what they had done wrong, so they sent a note to ask whether the jury had not been convinced by the untested technique. “They said, ‘Oh, no, we had no problem with the science, that was fine.’” The trouble was that the members of the jury did not believe the girl, whom they had found to be too composed in the witness box. She sighs. “She was a young teenager. Who else would be in her room at half past four in the morning? But, you know, that’s not our case.”

***

Since then, Black and her team have discovered that the veins in the hand are, as they suspected, highly distinctive – even in identical twins. (Earlier, she told me with relish: “That’s the wholly wonderful thing about identical twins – that the one thing that they are not is identical.”)

This new information provides police with a more reliable method of identification than many of the better-known forms. In Scottish courts now, for instance, fingerprint matches are treated as matters of opinion rather than fact. This follows an inquiry into an eyebrow-raising case in which a police detective called Shirley McKie was suspended, then sacked, then charged with perjury, after her fingerprint was apparently found on a door frame at a murder scene, although she denied ever visiting it. Her father, a retired detective, took up the case and McKie was eventually acquitted and awarded £750,000 in compensation. It seems likely that although her prints matched those at the scene on all the points that had been sampled, they were not identical.

“It took her many, many years to prove that, in fact, the way in which fingerprints were being assessed was fundamentally flawed, so that all cases where convictions relied on fingerprints were now in jeopardy,” Black says. Other staples of forensic science, such as gait analysis, now face similar questions. “In America at the moment, they’re having horrendous problems – and we’re not surprised – with bite marks.”

She is also dismissive of iris identification, because it is possible to make a good-quality replica of an eyeball on acetate and print it on a contact lens. “If you can spoof the biometric, then ultimately it’s not a very good biometric. And they’ve now been able to spoof irises. Spoofing of fingerprints is child’s play now.”

Such concerns are why Black talks about a “crisis” in forensic science. For many years, DNA evidence has been a kind of deus ex machina in criminal cases – the DNA has spoken: that guy did it – but matches are based on probability rather than certainty and the modern techniques used to isolate very small strands of DNA are open to contamination.

Other types of evidence are prone to misunderstanding. In February 2014, she brought together a group of forensic scientists to discuss the limitations of their work. Without the scientists’ knowledge, Black also asked several senior judges and lawyers to attend. “We have two key players in the forensic world who only ever meet in an adversarial position, so they’re never, ever going to understand each other,” she says. “So, by the scientists being open and honest and not realising the judges were in the room, the judges were going, ‘Oh, my goodness, this is what the scientists think. Ooh!’”

The result of the meeting was that the scientists and lawyers agreed that 40 evidence types needed attention. “And that went from DNA, fingerprints, footwear marks, gunshot residue, bite marks – you go through the whole list – that said either we’ve got a problem in detecting it, or recognising it, or comparing it, or evaluating it, or communicating it.”

The scientists are now producing primers, written in simple English, to help juries and judges better understand the science they are being asked to weigh up. “That’s probably the biggest ever project attempted in public engagement with science, if you think that’s taking science into every single courtroom in the land, every single day.”

***

Alongside these grand plans, Sue Black’s attention in the past few years has been on a project closer to home. When I visit Dundee on a wind-whipped December day, the department is humming with quiet industry: there are students (95 per cent of them female), mortuary assistants and colleagues in Christmas jumpers. And there are bodies.

When Black arrived at Dundee in 2005, anatomy departments were in decline – they were either closing down altogether, or moving to “prosection”, in which an instructor dissects a cadaver in front of the class. But she is an evangelist for the importance of hands-on experience, and the department receives 80 new bodies every year for its students to cut into and explore.

Val McDermid was one of a group of crime writers who agreed to help Black raise the funds for a new mortuary a few years ago. They asked their fans to vote for a room to be named after them and to pay a pound to do so. It’s clear who won, as Sue Black guides me into the “Val McDermid Mortuary” and then to the “Stuart MacBride Dissecting Room”. The other eight writers each got their name on an embalming tank, with the exception of Lee Child, who decided to use that of his lead character Jack Reacher instead. “We realised early on we couldn’t have the Child Mortuary,” says Black dispassionately.

The dissecting rooms are cool, and – to my surprise – smell of very little, not even disinfectant. The air-conditioning draws the air downwards and the new Thiel embalming method stops the bodies from decomposing. This has been Black’s pet project for the past half-decade, as formalin, the old embalming fluid, is known to be carcinogenic and leaves dead bodies stiff and unyielding. Other departments tried “fresh frozen” – dismembering a cadaver and defrosting each section as it was needed. Black thought that this was “incredibly wasteful of the gift”, because each body part has a usable life of just a few days, and wasteful of money, too, because limbs and organs had to be bought in from abroad. “You could have 12 legs come in, shipped into Heathrow. They would carry a health certificate that they’re free from everything – I’m sorry, but I’d want to check – and then they’d go off and be dissected. Incredibly expensive.”

Black’s preferred alternative is the Thiel method, named after the Austrian anatomist Walter Thiel, which involves soaking bodies in a mixture of salts, chemicals and a smaller measure of formalin. It keeps the bodies soft and pliable, which Black says works better for everyone except trainee neurosurgeons and colorectal specialists (a living gut has more tension). McDermid says that the Thiel cadavers “look like people – albeit slightly strange, with no hair or fingernails. For the students, that’s a huge advantage, because it gives them a sense of what they are going to be working with in a way the old bodies didn’t.”

Downstairs, two of the department’s mortuary assistants, Claire and Sam, are dressed in scrubs and wellies, preparing a body using the Thiel method. The cadaver is propped up, almost upright, on a table, with tubes running into the top of his head and out of his thigh. He looks peaceful; the scene is not in the least Gothic. “I do tend to talk to them,” Claire says. “I applaud them if they have very good veins.” What’s the difference between picking up a live patient and a dead body? “The bodies are heavier, because they’re not helping you,” Sam says.

Black and her PA, Vivienne McGuire, meet many of the cadaver donors while they are still alive, offering them a cup of tea in her office, which is spangled with plaques and knick-knacks. (“To save time, let’s assume I know everything,” reads one slogan. “My job is secure – nobody wants it,” offers another.) There’s a skeleton in the corner, which might eventually be replaced with Black: she has said that she would be delighted to become a teaching aid in her old department one day.

There are many reasons why people agree to donate their bodies. For some, it is as simple as wanting not to burden their families with the £3,600 that the average funeral costs. Others want to pay back the medical profession, or hope to train doctors to cure the disease that killed them. As they leave her office, Black tells the donors, “Now, don’t take this the wrong way, but we really don’t want to see you soon.”

She takes me upstairs and shows me the book of remembrance: the donors for 2014 included Shelagh, James, Irene and Angus. On the first Wednesday of May every year, the department holds a memorial service for donors’ families, attended by the staff and students. “I found it quite moving to go into the mortuary and see the cadavers,” says McDermid. “There is a sense of respect for the people who have donated their bodies. This is not Doctor in the House. There’s no larking about in Sue’s mortuary.”

Throughout her career, Black has worked close to death, often in the most traumatic of circumstances. Yet she is one of the most serene, untroubled people I have ever interviewed; serious when the occasion demands it but ready to laugh. “Her students are utterly devoted to her,” McDermid says. “It’s extraordinary. They’d walk on hot coals for her.”

Perhaps the cliché is true: contemplating death really does make you feel more alive? “It’s my view that we have, as a society, removed ourselves from death,” Black says. “We’ve built a wall around it that makes us uncomfortable, whereas if you go back just a few generations, when Granny died she was in the coffin in the front room. It was viewed as just as natural as birth.”

On my way out of the building, I think: I wouldn’t mind if my final resting place were Sue Black’s mortuary. I pull my coat around myself, happily, and walk out into the cold winter sunshine. 

Helen Lewis is deputy editor of the New Statesman. She has presented BBC Radio 4’s Week in Westminster and is a regular panellist on BBC1’s Sunday Politics.

This article first appeared in the 21 January 2016 issue of the New Statesman, The Middle East's 30 years war


Why Jeremy Corbyn is a new leader for the New Times

In an inspired election campaign, he confounded his detractors and showed that he was – more than any other leader – in tune with the times.

There have been two great political turning points in postwar Britain. The first was in 1945 with the election of the Attlee government. Driven by a popular wave of determination that peacetime Britain would look very different from the mass unemployment of the 1930s, and built on the foundations of the solidaristic spirit of the war, the Labour government ushered in full employment, the welfare state (including the NHS) and nationalisation of the basic industries, notably coal and the railways. It was a reforming government the like of which Britain had not previously experienced in the first half of the 20th century. The popular support enjoyed by the reforms was such that the ensuing social-democratic consensus was to last until the end of the 1970s, with Tory as well as Labour governments broadly operating within its framework.

During the 1970s, however, opposition to the social-democratic consensus grew steadily, led by the rise of the radical right, which culminated in 1979 in the election of Margaret Thatcher’s first government. In the process, the Thatcherites redefined the political debate, broadening it beyond the rather institutionalised and truncated forms that it had previously taken: they conducted a highly populist campaign that was for individualism and against collectivism; for the market and against the state; for liberty and against trade unionism; for law and order and against crime.

These ideas were dismissed by the left as just an extreme version of the same old Toryism, entirely failing to recognise their novelty and therefore the kind of threat they posed. The 1979 election, followed by Ronald Reagan’s US victory in 1980, began the neoliberal era, which remained hegemonic in Britain, and more widely in the West, for three decades. Tory and Labour governments alike operated within the terms and by the logic of neoliberalism. The only thing new about New Labour was its acquiescence in neoliberalism; even in this sense, it was not new but derivative of Thatcherism.

The financial crisis of 2007-2008 marked the beginning of the end of neoliberalism. Unlike the social-democratic consensus, which was undermined by the ideological challenge posed by Thatcherism, neoliberalism was brought to its knees not by any ideological alternative – such was the hegemonic sway of neoliberalism – but by the biggest financial crisis since 1931. This was the consequence of the fragility of a financial sector left to its own devices as a result of sweeping deregulation, and the corrupt and extreme practices that this encouraged.

The origin of the crisis lay not in the Labour government – complicit though it was in the neoliberal indulgence of the financial sector – but in the deregulation of the banking sector on both sides of the Atlantic in the 1980s. Neoliberalism limped on in the period after 2007-2008 but as real wages stagnated, recovery proved a mirage, and, with the behaviour of the bankers exposed, a deep disillusionment spread across society. During 2015-16, a populist wave of opposition to the establishment engulfed much of Europe and the United States.

Except at the extremes – Greece perhaps being the most notable example – the left was not a beneficiary: on the contrary it, too, was punished by the people in the same manner as the parties of the mainstream right were. The reason was straightforward enough. The left was tarnished with the same brush as the right: almost everywhere social-democratic parties, albeit to varying degrees, had pursued neoliberal policies. Bill Clinton and Tony Blair became – and presented themselves as – leaders of neoliberalism and as enthusiastic advocates of a strategy of hyper-globalisation, which resulted in growing inequality. In this fundamental respect these parties were more or less indistinguishable from the right.

***

The first signs of open revolt against New Labour – the representatives and evangelists of neoliberal ideas in the Labour Party – came in the aftermath of the 2015 election and the entirely unpredicted and overwhelming victory of Jeremy Corbyn in the leadership election. Something was happening. Yet much of the left, along with the media, summarily dismissed it as a revival of far-left entryism, as if his supporters were for the most part no more than a bunch of Trots. There is a powerful, often overwhelming, tendency to see new phenomena in terms of the past. The new and unfamiliar is much more difficult to understand than the old and familiar: it requires serious intellectual effort and an open and inquiring mind. The left is not alone in this syndrome. The right condemned the 2017 Labour Party manifesto as a replica of Labour’s 1983 manifesto. They couldn’t have been more wrong.

That Corbyn had been a veteran of the far left for so long lent credence to the idea that he was merely a retread of a failed past: there was nothing new about him. In a brilliant election campaign, Corbyn not only gave the lie to this but also demonstrated that he, far more than any of the other party leaders, was in tune with the times, the candidate of modernity.

Crises, great turning points, new conjunctures, new forms of consciousness are by definition incubators of the new. That is one of the great sources of their fascination. We can now see the line of linkage between the thousands of young people who gave Corbyn his overwhelming victory in the leadership election in 2015 and the millions of young people who were enthused by his general election campaign in 2017. It is no accident that it was the young rather than the middle-aged or the seniors who were in the vanguard: the young are the bearers and products of the new, they are the lightning conductors of change. Their elders, by contrast, are steeped in old ways of thinking and doing, having lived through and internalised the values and norms of neoliberalism for more than 30 years.

Yet there is another, rather more important aspect to how we identify the new, namely the way we see politics and how politics is conceived. Electoral politics is a highly institutionalised and tribal activity. There have been, as I argued earlier, two great turning points in postwar politics: the social-democratic era ushered in by the 1945 Labour government and the neoliberal era launched by the Tory government in 1979.

The average Tory MP or activist, no doubt, would interpret history primarily in terms of Tory and Labour governments; Labour MPs and activists would do similarly. But this is a superficial reading of politics based on party labels which ignores the deeper forces that shape different eras, generate crises and result in new paradigms.

Alas, most political journalists and columnists are afflicted with the same inability to distinguish the wood (an understanding of the deeper historical forces at work) from the trees (the day-to-day manoeuvring of parties and politicians). In normal times, this may not be so important, because life continues for the most part as before, but at moments of great paradigmatic change it is absolutely critical.

If the political journalists, and indeed the PLP, had understood the deeper forces and profound changes now at work, they would never have failed en masse to rise above the banal and predictable in their assessment of Corbyn. Something deep, indeed, is happening. A historical era – namely, that of neoliberalism – is in its death throes. All the old assumptions can no longer be assumed. We are in new territory: we haven’t been here before. The smart suits long preferred by New Labour wannabes are no longer a symbol of success and ambition but of alienation from, and rejection of, those who have been left behind; who, from being ignored and dismissed, are in the process of moving to the centre of the political stage.

Corbyn, you may recall, was instantly rejected and ridiculed for his sartorial style, and yet we can now see that, with a little smartening, it conveys an authenticity and affinity with the times that made his style of dress more or less immune from criticism during the general election campaign. Yet fashion is only a way to illustrate a much deeper point.

The end of neoliberalism, once so hegemonic, so commanding, is turning Britain on its head. That is why – extraordinary when you think about it – all the attempts by the right to dismiss Corbyn as a far-left extremist failed miserably, even proved counterproductive, because that was not how people saw him, not how they heard him. He was speaking a language and voicing concerns that a broad cross-section of the public could understand and identify with.

***

The reason a large majority of the PLP was opposed to Corbyn, desperate to be rid of him, was because they were still living in the neoliberal era, still slaves to its ideology, still in thrall to its logic. They knew no other way of thinking or political being. They accused Corbyn of being out of time when in fact it was most of the PLP – not to mention the likes of Mandelson and Blair – who were still imprisoned in an earlier historical era. The end of neoliberalism marks the death of New Labour. In contrast, Corbyn is aligned with the world as it is rather than as it was. What a wonderful irony.

Corbyn’s success in the general election requires us to revisit some of the assumptions that have underpinned much political commentary over the past several years. The turmoil in Labour ranks and the ridiculing of Corbyn persuaded many, including on the left, that Labour stood on the edge of the abyss and that the Tories would continue to dominate for long into the future. With Corbyn having seized the political initiative, the Tories are now cast in a new light. With Labour in the process of burying its New Labour legacy and addressing a very new conjuncture, the end of neoliberalism poses a much more serious challenge to the Tories than it does to Labour.

The Cameron/Osborne leadership was still very much of a neoliberal frame of mind, not least in their emphasis on austerity. It would appear that, in the light of the new popular mood, the government will now be forced to abandon austerity. Theresa May, on taking office, talked about a return to One Nation Toryism and the need to help the worst-off, but that has never moved beyond rhetoric: now she is dead in the water.

Meanwhile, the Tories are in fast retreat over Brexit. They held a referendum over the EU for narrow party reasons which, from a national point of view, was entirely unnecessary. As a result of the Brexit vote, the Cameron leadership was forced to resign and the Brexiteers took de facto command. But now, after the election, the Tories are in headlong retreat from anything like a “hard Brexit”. In short, they have utterly lost control of the political agenda and are being driven by events. Above all, they are frightened of another election from which Corbyn is likely to emerge as leader with a political agenda that will owe nothing to neoliberalism.

Apart from Corbyn’s extraordinary emergence as a leader who understands – and is entirely comfortable with – the imperatives of the new conjuncture and the need for a new political paradigm, the key to Labour’s transformed position in the eyes of the public was its 2017 manifesto, arguably its best and most important since 1945. You may recall that for three decades the dominant themes were marketisation, privatisation, trickle-down economics, the wastefulness and inefficiencies of the state, the incontrovertible case for hyper-globalisation, and bankers and financiers as the New Gods.

Labour’s manifesto offered a very different vision: a fairer society, bearing down on inequality, a more redistributive tax system, the centrality of the social, proper funding of public services, nationalisation of the railways and water industry, and people as the priority rather than business and the City. The title captured the spirit – For the Many Not the Few. Or, to put it another way, After Neoliberalism. The vision is not yet a full answer to the question of what comes after neoliberalism, but it represents the beginnings of one.

Ever since the late 1970s, Labour has been on the defensive, struggling to deal with a world where the right has been hegemonic. We can now begin to glimpse a different possibility, one in which the left can begin to take ownership – at least in some degree – of a new, post-neoliberal political settlement. But we should not underestimate the enormous problems that lie in wait. The relative economic prospects for the country are far worse than they have been at any time since 1945. As we saw in the Brexit vote, the forces of conservatism, nativism, racism and imperial nostalgia remain hugely powerful. Not only has the country rejected continued membership of the European Union, but, along with the rest of the West, it is far from reconciled with the new world that is in the process of being created before our very eyes, in which the developing world will be paramount and in which China will be the global leader.

Nonetheless, to be able to entertain a sense of optimism about our own country is a novel experience after 30 years of being out in the cold. No wonder so many are feeling energised again.

This article first appeared in the 15 June 2017 issue of the New Statesman, Corbyn: revenge of the rebel

Martin Jacques is the former editor of Marxism Today. 
