Is Labour abolishing illness?

The new rules on incapacity benefit stake everything on a major gamble: that a large proportion of claimants, present and to come, are fit enough to work

Incapacity benefit has become one of this year's favourite scare stories. Hardly a day passes without a new headline deploring its soaring costs and the rising numbers of claimants who get "something for nothing", at the expense of decent, hardworking taxpayers. We are told that we are footing an outrageously escalating bill for 2.4 million people, a million of whom shouldn't be on the benefit at all, and each successive work and pensions minister vows to be more ruthless than the last.

The true picture is somewhat different. The unreported version, which can be culled from Department for Work and Pensions (DWP) data, is that only 1.4 million of the 2.4 million actually receive any payment, the rest get national insurance credits only, and numbers have been falling since 2003. The basic benefit is worth barely £3,000 a year. After two small rises in the first year there is no further increase, other than index-linking. All those who get the benefit have to pass a rigorous "personal capability assessment" (PCA) with doctors appointed by the DWP; and they can be re-examined at any time. The audited estimate of fraud is under 1 per cent - the lowest of any part of the social security system.

Nonetheless, the 2007 Welfare Reform Act is now being implemented across the country. It replaces support, as of right, for illness/disability (one of the planks of our rapidly disappearing welfare state) with a new, conditional employment and support allowance. Claimants are held on a basic allowance until it is confirmed that their capability for work is limited. This is determined by a "work capability assessment" tougher than the old PCA. Those deemed capable of one day returning to work (and the arbiters are health professionals rather than doctors) must engage in a series of "work-focused" interviews and activities. These include, among other things, "condition management", which in practice is likely to consist of group sessions loosely based on cognitive behavioural therapy. All this brings an additional slice of benefit that can, however, be cut for those who do not engage in it without "good cause" - a potential loss of 40 per cent of income. Ultimately, any whose capability for work remains limited through failing to follow medical advice, or "any prescribed rules of behaviour", face a period of disqualification. (A further provision of the act, to be piloted in nine areas, is that people served with Asbos - antisocial behaviour orders - can face cuts in their housing benefit for refusing local authority offers "to help address any problem behaviour".)

A main selling point of the reform was the great savings it would bring. It would staunch the outflow of benefits and get many people into jobs where they would pay tax and provide for their old age. This government's cherished goal is an employment rate of 80 per cent of the working-age population - though it is difficult to find any reasoned argument in support of this since our present rate of 75 per cent is, with Canada's, the highest in the world. The government accepts that employers must be paid to take on people with an illness record and, for the time being, it has pledged not to cut the benefits of existing claimants. Any immediate savings, therefore, can only come from bumping as many as possible off the benefit, shaving future benefit levels (already well in hand), and making it harder for newcomers to get it in the first place. Delivery is being farmed out to private agencies paid by results - which means, of course, the setting of targets. The next few years will be a bad time to have a crippling accident or succumb to a serious disease, particularly a psychiatric or neurological one that does not have obvious outward symptoms.

Blaming the "cheats"

The reform of incapacity benefit has been over ten years in the making, leaving in its wake a dense trail of commissioned reports. A curious thing about this voluminous material is how little information it contains on the actual health conditions for which benefit is paid. This is no accident, for the reformers long ago made up their minds that claimant numbers are too high, therefore a large proportion - usually put between a third and a half, but lately upped to 70 per cent in some quarters - must be spurious. An appeal to history is repeated like a mantra: back in 1979, only 700,000 people claimed the old sickness/invalidity benefits. Since then, money has been poured into the NHS while health care, living standards and longevity have improved beyond all expectations. People must be healthier, which proves that huge numbers are exploiting a slack and obsolete system. Who is to blame, apart from outright cheats? It can only be the self-indulgent, who fancy themselves sicker than they really are, and complacent GPs who let them think they are too ill to work.

Crucially, the reformers bracketed illness with disability. The disability lobby had long argued that "disability" was a discriminatory label imposed by society, and it was bent on removing the barriers to work that excluded those so labelled and kept them in poverty. But the bracketing brought confusions - for those with disabilities may be extremely fit (consider the disabled athlete), whereas the able-bodied can be extremely ill. More confusion arises with conditions such as "stress", "anxiety" and "chronic fatigue" that sound trivial. As for "back pain", how unreasonable is it to take time off sick for something best dealt with by a stiff upper lip and the odd aspirin? It is easy for those in good health to pooh-pooh such things, agreeing with the government that "Work is the best therapy".

The government's declared mission is to "liberate" claimants, to bring them into its "reformed, coherent welfare state for the 21st century". It seeks to overturn a culture based on the "medical model" of illness that allows them to "drift" on to long-term benefits without realising that "symptoms, feeling unwell, sickness and incapacity are not the same" - hence the appeal of cognitive behavioural therapy, which it understands as a treatment that will talk the sick into believing they can lead normal lives.

Doctors - so often the refuge of desperate people trying to find out what is wrong with them - should as far as possible be excluded from the process. Even those working for the DWP have opinions that are "unfounded, of limited value and counter-productive", while GPs are "unaware of the importance of work, the absence of which leads to depression, poor health, higher rates of suicide and mortality, poverty, and social exclusion". (The quotations are from a 2005 study from the Unum Provident Centre for Psychosocial and Disability Research at Cardiff University, whose ideas and rhetoric infuse the reform. Unum Provident is an American firm, the largest disability insurance company in the world, which is currently in litigation in different countries for refusing to pay out on some of its policies.) A private agency has now taken over the running of its first GP surgery here, and doctors dealing with disability living allowance are advised not to invite patients to explain how their condition affects them.

Features of the reform are familiar from other policy areas. First, a demonisation of a needy or vulnerable group, followed by a rebranding: so claimants become not even "clients" but "customers" (as in the just published "Commissioning Strategy" document); incapacity benefit becomes employment and support allowance; sick notes are redrafted for doctors to certify, not what patients can't but what they can do. Next come "partnerships", on an unchallenged assumption that the public sector has failed. The new system is farmed out to for-profit or non-profit-making agencies paid by results. This entails targets, and where targets are set, sanctions follow, for any who "fail to recover".

There are features of the new programme that look intelligent and humane, doubtless owing much to the efforts of the disability lobby. They include a longer and more flexible bridging period (and a back-to-work grant) between benefits and work, and a broader view of "work-focused" activities. The crunch will come with those described as not able or prepared to engage "because [of] the nature and severity of their health condition, or more a matter of attitudes, perceptions and expectations which may or may not be accurate . . . It is a question of what the claimant cannot do vs what they will not do."

For the reform stakes everything on a gamble: that a large proportion of claimants, present and to come, are fit enough to work. There seems no way of proving or disproving this, other than trying it out, at the risk of much waste of public money, and much personal grief. Deliberate rejection of the "medical model" deprives us of all we might have learned (from the wealth of data available) of the impact of illness on our society.

I have scratched my head long and hard over this reform (among other things sending out lengthy submissions to all concerned during the long consultation phase in 2005-2006) because so much in its theory and rhetoric contradicts my own experience: of chronically and seriously ill family members and friends, of several years as a Mind volunteer, and further years of peripheral involvement in action groups for chronic fatigue conditions. All this has indelibly impressed me with the courage of many who live with horrible complaints, the sheer hard work involved in their day-to-day coping, their relentless search for any amelioration, let alone cure, often at costs hard to spare from limited resources.

I have witnessed, too, and at close quarters, the hurt and stress of living difficult lives as people have to do, in a perpetual culture of disbelief and threat, where some of the most valiant are blamed for their conditions and conflated with the alleged "can't work, won't work" unemployed. For the message of the reform that comes across, for all its fashionable rhetoric, is that a person is valued only as a productive unit. Compassionate cases aside, those too ill to work are outside society and money spent on them is wasted. Sickness, disablement and inability to work have no place in a modern society - they can't and shouldn't be afforded.

No one pretends that illness is not a blight, imposing personal and social costs going far beyond the financial; but - pace the government - no one as yet knows how to remove it from the human condition. Why waste valuable time and resources on an ill-founded reform, when they could instead be used to further understanding of the real impact of illness on our society?

Alison Ravetz is a professor emeritus of Leeds Metropolitan University who writes on housing policy and welfare reform

This article first appeared in the 05 May 2008 issue of the New Statesman, High-street robbery


How nostalgic literature became an agent in American racism

From To Kill a Mockingbird to Gone With The Wind, literary mythmaking has long veiled the ugly truth of the American South.

Alongside this summer’s debate over the meanings of the Confederate flag, sparked by the Charleston shootings, another story about the history of American racism flared up. Harper Lee’s Go Set a Watchman, an early draft of To Kill a Mockingbird, revealed further discomfiting truths, this time about the background to one of the nation’s most beloved fictional standard-bearers for racial equality.

In Watchman, Atticus Finch is discovered, twenty years after the action of Mockingbird, fighting against desegregation during the civil rights era. The shock readers have felt is akin to finding an early draft of Nineteen Eighty-Four in which Orwell defends surveillance in the name of national security – but stumbling upon a segregationist Atticus Finch should not surprise readers who know their history. In the end, he proves just as problematic a symbol as the Confederate flag, and for exactly the same reasons: both flew in the face of civil rights to defend white prerogative in the South. And both were gradually romanticised through the process of revision, wrapping America in comforting white lies.

Like our euphemistic habit of calling Jim Crow laws “segregation” rather than apartheid, all of this partakes in a time-honoured communal practice of deracination through whitewashing, through popular fictions of innocence such as Mockingbird and ­popular histories such as folk tales about the innocence of the Confederate flag. America keeps willing its own innocence back into being, even as racist ghosts continue to haunt our nation’s dreams of the pastoral South, a society whose fantasy of gentility was purchased at the cost of brutal chattel slavery and its malignant repercussions.

An essential aspect of the history of American racism is also the history of its deliberate mystification. Many Americans continue to accept an idealised view of the antebellum and Jim Crow South, which emerged as part of a national romance known as the Lost Cause, a legend that turned terrorism into a lullaby. In the aftermath of the South’s crushing defeat in the civil war, southerners began trying to reclaim what they had lost – namely, an old social order of unquestioned racial and economic hierarchies. They told stories idealising the nobility of their cause against the so-called War of Northern Aggression, in which northerners had invaded the peaceful South out of a combination of greed, arrogance, ignorance and spite. Gentle southerners heroically rallied to protect their way of life, with loyal slaves cheering them all the way. This Edenic dream of a lost agrarian paradise, in which virtuous aristocrats and hard-working farmers coexisted peacefully with devoted slaves, pre-dated the war: merging with Jeffersonian ideals of the yeoman farmer, it formed the earliest propagandistic defences of slavery. One of the most powerful broadsides against this spurious fantasy was launched in 1852 by Harriet Beecher Stowe, a north-eastern preacher’s daughter who had lived in border states and seen what really happened to ­fugitive slaves. The South responded to Uncle Tom’s Cabin with a furious denunciation of Stowe’s knowledge and motives, insisting on the benevolence of their “peculiar institution” – an especially sordid ­euphemism that recast plantation slavery as an endearing regional quirk.

Once institutional slavery was irrevocably destroyed, nostalgia prevailed, and southerners began producing novels, ­poems and songs romanticising the lost, halcyon days of the antebellum era. Reaching their apotheosis in Margaret Mitchell’s Gone with the Wind in 1936, and its film version in 1939, they have become known as the “moonlight and magnolia” school of plantation fiction, exemplified by the novels of Thomas Nelson Page, whose books, such as In Ole Virginia (1887) and Red Rock (1898), helped establish the formula: devoted former slaves recount (in dialect) their memories of an idyllic plantation culture wantonly destroyed by militant northern abolitionists and power-crazed federalists. A small band of honourable soldiers fought bravely on the battlefield and lost; the vindictive North installed incompetent or corrupt black people to subjugate the innocent whites; southern scalawags and northern carpetbaggers descended to exploit battle-ravaged towns. Pushed to the limits of forbearance, the Confederate army rose again to defend honour and decency – in the noble form of the Ku Klux Klan.

This thoroughly specious story is familiar to anyone who knows Gone with the Wind, but Mitchell learned it from Page and other novelists, including Mary Johnston and Thomas Dixon, Jr. Dixon wrote a series of books celebrating the rise of the Ku Klux Klan, of which the most notorious remains The Clansman (1905), for the simple reason that ten years later a director named D W Griffith decided to adapt it into a film he called The Birth of a Nation. Released exactly a century ago, it faithfully follows the same narrative blueprint. This was neither an accident nor a coincidence: Griffith was himself the son of a Confederate colonel (memorably known as “Roaring Jake”) and had listened to legends of the Lost Cause at his father’s knee. Reading The Clansman, Griffith said, brought back “all that my father had told me . . . [the film] had all the deep incisive emotionalism of the highest patriotic sentiment . . . I felt driven to tell the story – the truth about the South, touched by its eternal romance which I had learned to know so well.” That “truth” was a story in which the cavalry that rode to the rescue of an imperilled maiden was the KKK, saving her from a fate worse than death at the hands of a white actor in blackface. Griffith had considered hiring black actors for the film, he told an interviewer in 1916, but after “careful weighing of every detail concerned”, none of which he imparted, “the decision was to have no black blood among the principals”, a phrase that says it all by using the so-called one-drop rule to justify racist hiring practices in a film glorifying racism.

The Birth of a Nation was a national phenomenon, becoming the first film ever screened at the White House, in March 1915, for President Woodrow Wilson. Born in Virginia, raised in Georgia and South Carolina, Wilson was the first southern president since before the civil war. He attended Johns Hopkins University with Thomas Dixon, who became a political supporter of his; Wilson also appointed Thomas Nelson Page his US ambassador to Italy. Many members of his administration were equally avowed segregationists; their influence reverberated for generations. Wilson’s history books were quoted verbatim by Griffith in some of The Birth of a Nation’s titles, including: “‘The white men were roused by a mere instinct of self-preservation . . . until at last there had sprung into existence a great Ku Klux Klan, a veritable empire of the South, to protect the Southern country.’ – Woodrow Wilson”.

Wilson does not, however, actually seem to have said the quotation most frequently attributed to him, that The Birth of a Nation was “like writing history with lightning, and my only regret is that it is all so terribly true”. The source for this appears to be Griffith, who told a 1916 movie magazine: “The motion picture can impress upon a people as much of the truth of history in an evening as many months of study will accomplish. As one eminent divine said of [moving] pictures, ‘they teach history by lightning!’” They can teach myths the same way, as Griffith’s film regrettably proved. Northern white audiences cheered it; black audiences wept at the malevolence it celebrated; it prompted riots and racist vigilante mobs in cities across America, and at least one racially motivated murder. Together, Griffith and Dixon worked to elevate a regional legend, designed to save face after a catastrophic loss, into a symbolic myth accepted by much of the nation.

 

***

The Birth of a Nation helped to invent the feature film. It also helped to reinvent the Ku Klux Klan, which enjoyed a huge resurgence following the film’s release, spreading up through the north-east and Midwest. The Klan had a strong presence on Long Island in the 1920s, for example, which is why F Scott Fitzgerald made Tom Buchanan a believer in “scientific racism” in The Great Gatsby, a story set there in 1922, the same year in which Thomas Nelson Page died while working on a novel about the Klan called The Red Riders.

Four years later, a young writer named Margaret Mitchell began her own tale of the Lost Cause, a story that follows the same pattern, to the extent of describing Scarlett O’Hara on its first page in coded terms as a woman with “magnolia-white skin”. Gone with the Wind became a global sensation when it was published in 1936. Mitchell’s ideas of southern history were deeply influenced by Dixon’s racist ideology, although she was considerably more knowledgeable about the workings of plantation slavery than is often recognised. In fact, the great majority of antebellum southern planters (some 75 per cent) were what historians class as yeoman farmers (growing primarily for domestic use, selling only a small portion of their crop), not aristocratic owners of vast plantations. Such farmers often did not even own a slave, but rented or borrowed one or two during harvest time. But although the Hollywood version dispensed entirely with Mitchell’s sense of history (and her considerably more interesting take on historical sexism), it retained her racism, despite the stated intentions of its producers. The film of Gone with the Wind contributed greatly to the gradual displacement and dislocation of the historically specific time and place that had given rise to the antebellum plantation myth. Ben Hecht’s famed floating preface to the 1939 big-screen version labours at making the story timeless, a legend dissipating in the mists of history:

There was a land of Cavaliers and Cotton Fields called the Old South . . . Here in this pretty world Gallantry took its last bow . . . Here was the last ever to be seen of Knights and their Ladies Fair, of Master and of Slave . . . Look for it only in books, for it is no more than a dream remembered. A Civilisation gone with the wind . . .

as history recedes into ellipses.

Yet even this feudalist imagery has a specific history of its own, forming another crucial part of the popular culture of the Lost Cause. It was Mark Twain, whose works remain one of our most powerful literary antidotes to the toxin of moonlight and magnolia, who most famously named its source: the novels of Sir Walter Scott. Scott “did measureless harm”, Twain insisted in his 1883 Life on the Mississippi, “more real and lasting harm, perhaps, than any other individual that ever wrote”. Scott’s bestselling romances filled southerners’ heads with enchanted “dreams and phantoms . . . with the sillinesses and emptinesses, sham grandeurs, sham gauds and sham chivalries of a brainless and worthless long-vanished society”. In fact, Twain concluded acidly, “Sir Walter had so large a hand in making Southern character, as it existed before the war, that he is in great measure responsible for the war.” This is only slightly exaggerated: Scott’s novels were as popular in early-19th-century America as were the films of The Birth of a Nation and Gone with the Wind a century later. It is certainly thanks to the popularity of novels such as Ivanhoe and Waverley that the cult of the “clan” was distorted into the Ku Klux Klan, with its bogus orders and knights. Scott’s novels may even have given rise to antebellum America’s use of the medieval word “minstrel” to describe itinerant blackface musicians; the OED offers no logic for the coinage, but its earliest citation is from 1833, when the mania for Scott’s work in the American South was at its highest. The feudal fantasy of “gallantry”, “cavalier knights” and “ladies fair” is a powerful one, not least, doubtless, because it gives America a sense of a much older past than it has, merging our history with a faux-feudalism in which slaves are rewritten as serfs, bound by devotion to the land and the family they serve.

***

But even as Gone with the Wind was in its ascendancy, another national narrative was slowly gaining traction. Black writers including W E B DuBois, Zora Neale Hurston, Langston Hughes and Richard Wright were vigorously challenging the dominant racist narrative, while white southern writers such as Ellen Glasgow and, especially, William Faulkner, also began debunking and complicating this facile tradition. Faulkner took from the Lost Cause legends that he, too, had heard as a child in Mississippi, some much darker truths about memory, history, distortion, perspective. Faulkner’s Absalom, Absalom! is probably his greatest meditation on the processes of myth-making and how they intersect with national history, writing a Homeric epic of America. Absalom came out in 1936, the same year as Gone with the Wind, and in many ways can be read as a corrective to it, shaking off the dreams of moonlight and magnolia to show the Gothic nightmare underneath.

Just twenty years later, a young woman who had been born in the year Margaret Mitchell began to write Gone with the Wind, named Nelle Harper Lee, produced her own version of the story of America’s struggles with its racist past. To Kill a Mockingbird takes place between 1933 and 1935 (just before Gone with the Wind was published), during which time Atticus Finch tells his young daughter, Scout, that the Ku Klux Klan was “a political organisation” that existed “way back about 1920” but “couldn’t find anybody to scare”, relegating the Klan to ancient history, instead of a mere dozen years before Lee’s story.

Nor is it any coincidence that Mrs Dubose, the racist old lady addicted to morphine, forces Jem Finch to read Walter Scott’s Ivanhoe to her as a punishment: there are more encoded traces of Lost Cause history in Mockingbird than many readers have recognised. Mockingbird resists this history on balance, but it also participates in collective myth-making, most significantly in its depiction of lynching as a furtive, isolated practice in the dead of night. It would be pretty to think so. In Lee’s version, a lynch mob is dispersed by the innocent prattle of the child Scout, who unwittingly reminds its members of their basic “decency”. In fact, by the 1920s, lynching in the South was often publicly marketed as a tourist attraction, in a practice historians now refer to as “spectacle lynching”, which took place in the cold light of day, with plenty of advance warning so people could travel from outlying areas for the fun. There were billboards and advertisements letting tourists know when and where the lynching would take place; families brought children and had picnics; postcards were sold and sent (horrific images of which are easy to find online). Burning at the stake was common; victims were often tortured or castrated, or had limbs amputated, or were otherwise mutilated first; pregnant women were burned to death in front of popcorn-crunching crowds. This was so well established that in 1922 the National Association for the Advancement of Colored People took out a full-page advertisement in the New York Times with the headline “The shame of America”, demanding, in underscored outrage: “Do you know that the United States is the Only Land on Earth where human beings are BURNED AT THE STAKE?” It went on to outline a few salient facts: between 1889 and 1922, 3,436 people had been lynched in America. Of these, “only 571, or less than 17 per cent, were even accused of rape”. Eighty-three of the victims were women, which further undermined the myth that rape led to lynching.

Lee’s picture of lynching in To Kill a Mockingbird is not merely sentimental, but an active falsification: it participates in a wider American story seeking to minimise, even exonerate, the reactions of white southerners to African Americans’ claims to equal rights under the law. It functions as a covert apology, suggesting that lynching was anomalous, perpetrated by well-meaning men who could be reminded of common humanity – when, in fact, members of lynch mobs in the 1920s posed for photographs in front of their victims. But Atticus Finch stands up to racism, and Scout persuades Cunningham and the rest of the lynch mob to go home. This is why it matters that an earlier version of Atticus advocated segregation: it shows that the process of revising Mockingbird was a process of idealisation, promising readers that systemic racism could be solved by the compassionate actions of noble individuals.

This is the consolatory promise of individualism, that the nation can be redeemed collectively by isolated instances of benign action. The romance of individualism has always been how America manages injustice and unrest, how it pastes over irre­concilable differences and squares imaginary circles: the heroic figure who temporarily, occasionally, overcomes all structural and social impediments is taken as communal evidence that these obstructions don’t exist, or aren’t very obstructive. It is, of course, a deeply Christian idea: the redemptive individual who expiates collective sin, the original lost cause, sacrificed for the good of all.

The doctrine of individualism is further tangled up in yet another exculpatory logic: that of states’ rights. The remnants of this rationale also surface at the end of Go Set a Watchman: Atticus Finch argues that segregation is really a matter of states’ rights, and Jean Louise, his now adult daughter, though nauseated at his racism, accepts the legitimacy of that argument. What right has the federal government to insist that the people within its borders adhere to its laws, or even to the principles (truth, justice, equality, democracy) they purport to uphold? Some might conclude from reading this passage that, like paradise, causes are invented to be lost. The idea of the Lost Cause redeems a squalid past; it is an act of purely revisionist history, disavowing the notion that slavery or racism had anything to do with the civil war and its vicious, lingering aftermath. Glorifying that history as “gallantry” is not merely dishonest: it is ruinous.

Dismissing all of this as ancient history, or mere fiction, is not the solution, it is part of the problem. As Faulkner famously observed and this brief account shows, the past isn’t dead; it isn’t even past. The word “legend” comes from the Latin legere, to read. Many insist that the stories we consume endlessly are harmless entertainment, but that, too, is part of the mystification, pretending that these legends did not arise precisely to veil an ugly truth with moonlight, to smother with the scent of magnolia the stink of old, decomposing lies.


An American in London, Sarah Churchwell is an author and professor of American literature. Her latest book, “Careless People: Murder, Mayhem and the Invention of the Great Gatsby”, is published by Virago.

This article first appeared in the 20 August 2015 issue of the New Statesman, Corbyn wars