A bitter pill to swallow

The sketchy evidence for the effectiveness of homoeopathic medicine has no scientific basis, and its use poses a real danger to patients

There was an outcry in September when we learned that children in Scotland were being given a homoeopathic "MMR vaccine", a product that offered no protection against the serious dangers posed by measles, mumps and, for pregnant women, rubella. This had echoes of the discovery a few years ago by Sense About Science, Simon Singh and Newsnight that some pharmacists were offering homoeopathic pills for protection against malaria to people travelling to Central Africa. Such practices may be disturbing, but they occur because we tend to think there is no harm in indulging the clamour to maintain the alternative health market.

Reading the 11 October issue of the New Statesman, I was shocked by an advertisement in the accompanying supplement, "Social Care: Who Pays?", referring to me and my work. Rarely had I seen an advert so inaccurate and borderline libellous in a respected publication. The advert, which appeared to breach the British Code of Advertising, was by a lobby group called Homeopathy: Medicine for the 21st Century (H:MC21). It contained unjustified attacks on me and my colleagues, including statements that gave a dangerously false impression of homoeopathy's therapeutic value.

As the advert questioned my own competence, I should address this first. I started my medical career in a homoeopathic hospital, where I was trained in homoeopathy for several months. Many years later, it became my job to apply science to this field and I felt I had a duty to keep an open mind - open but not uncritical.

A critical mind would notice that the two basic principles of homoeopathy fly in the face of science, logic and common sense. The first assumption is that "like cures like". For instance, if onions make my eyes and nose water, homoeopathic remedies derived from onions can be used to treat my patients' hay fever, which sometimes causes runny eyes and noses. The second assumption proposes that diluting remedies homoeopathically makes them not less but more potent, even if the final preparation no longer contains a single molecule of any active substance. These theories are not based on anything that remotely resembles fact. Like does not cure like, and endlessly diluting remedies certainly does not render them stronger, but weaker. But is there some entirely new energy to be discovered that we do not yet comprehend? Not understanding homoeopathy does not necessarily mean that it is useless.

The best way to find out is to determine whether homoeopathic remedies behave differently from placebos when patients use them. In other words, we need clinical trials.

Data gap

About 150 such studies (mostly conducted by homoeopaths) and well over a dozen syntheses of this research are available. Their results are sobering: the totality of the most reliable evidence fails to show that homoeopathic remedies work better than placebos. So, after about 200 years of research, there is no good data to convince non-homoeopaths that homoeopathic remedies are any different from pure sugar pills. Pro-homoeopathic lobby groups such as the one that placed the advertisement therefore have to employ propaganda to try to convince consumers who may not know better. This is perhaps understandable, but surely not right.

What of patients' experience, some might ask. Thousands of people across the world swear by homoeopathy. Are they all deluded? Clearly not. People undoubtedly do get better after seeing a homoeopath. There are many observational studies to show that this is true. Homoeopaths therefore keep telling us that their treatments work, regardless of the implausibility of homoeopathy's principles and the largely negative trial evidence.

When we rationally analyse this apparent contradiction of evidence versus experience, it quickly dissolves into thin air. The empathic encounter with a homoeopath is just one of many factors that provide ample explanation for the observation that patients can improve even when they receive placebos. A case in point is Bristol Homoeopathic Hospital's 2005 study, cited in the offending advert. The 6,500 chronically ill patients might have improved because of the concomitant use of conventional treatments, or because of the attention they experienced, or because of their own expectation to improve, or because the disease process had come to an end. In fact, they might have improved not because of, but despite, the homoeopathic remedies they were given.

Still, some people ask what is wrong with using placebos as long as they help patients feel better. The answer is that it prevents clinicians telling the truth to patients. Being honest would defeat any placebo effect: if I tell my patient, "Take this remedy; it contains nothing and the trial data shows nothing," she is unlikely to experience a placebo response. Hence, homoeopaths, knowingly or unknowingly, deprive patients of informed consent. This paternalistic approach is recognised as unethical. Also, placebo effects are unreliable and normally short-lived; they happen occasionally but often do not. Even if placebo responses are generated, they are usually small - certainly too small to compete with effective therapies.

Twin-track effect

Endorsing homoeopathic placebos would mean that people might use them for serious, treatable conditions. In such circumstances, homoeopathy can even cause (and has caused) the death of patients. Furthermore, if we allow the homoeopathic industry to sell placebos, we must do the same for "Big Pharma". Imagine a world where pharmaceutical companies could sell us placebos for all sorts of conditions just because some patients experience benefits through a placebo response.

Crucially, and paradoxically, we don't need placebos to generate placebo effects. If I, for instance, prescribe an antihistamine for a patient suffering from hay fever, with empathy, time and understanding, that patient benefits from a placebo effect as well as the pharmacological action of the antihistamine. If, by contrast, I prescribe a homoeopathic remedy, I deprive her of the latter, crucial benefit. It is difficult to argue, as most homoeopaths try to, that this approach would be in the interest of my patient.

What follows is straightforward: there is no good evidence that homoeopathy does more good than harm. This is not just my conclusion after 17 years of researching the subject, but a fact based on the best available evidence, which is supported by virtually all experts who are not homoeopaths. The recent decision by the coalition government to continue homoeopathy on the NHS is thus puzzling, to say the least.
The advertisement that prompted this article is misleading about the work of experts, which has conclusively shown that homoeopathy can have no place in evidence-based medicine. It is an insult to our intelligence.

Edzard Ernst is professor of complementary medicine at the Peninsula Medical School, University of Exeter, and co-author, with Simon Singh, of "Trick or Treatment? Alternative Medicine on Trial" (Corgi, £8.99)

Here comes the non-science

Homoeopathy was developed in 1796 by the German physician Samuel Hahnemann. He based his treatments on the twin ideas that "like cures like" and "less is more". The latter notion was implemented by taking a substance and diluting it over and over again, so that the final product generally contains not a single molecule of the original active ingredient.

Homoeopaths accept that most of their remedies are devoid of pharmacologically active principles, but they argue that the pills contain a "memory" of the original ingredient. The memory is supposedly imprinted in the diluting agent, which is used to moisten sugar pills.

Although homoeopathy defies the laws of physics, chemistry, biology and therapeutics, there have been numerous attempts to test its impact on patients through clinical trials. In 2005, Aijing Shang and seven colleagues from the University of Berne published an analysis of the best trials in the Lancet.

Their findings confirmed those of many other such published assessments. In the paper, they wrote: "This finding is compatible with the notion that the clinical effects of homoeopathy are placebo effects." An accompanying editorial entitled "The end of homoeopathy" said: "Doctors need to be bold and honest with their patients about homoeopathy's lack of benefit."

This article first appeared in the 08 November 2010 issue of the New Statesman, Israel divided

The English question

The political community that is England is neither stable nor settled. But something is stirring among Chesterton’s secret people.

From the late 18th century to the early 20th, Britain’s political class wrestled with an Irish Question: how could the British state govern “John Bull’s Other Island” in a way that kept the native Irish quiescent, without jeopardising its own security? When Ireland was partitioned in 1921 the question disappeared from the British political agenda – only to reappear in another guise during the Troubles in Northern Ireland half a century later. It was not laid to rest until the Belfast Agreement of 1998. More recently, politicians and commentators on both sides of the border have had to come to terms with an increasingly intractable Scottish Question: how should the ancient and once independent Scottish nation relate to the other nations of the United Kingdom and to the Westminster parliament? As the convoluted debate provoked by the coming EU referendum shows, a more nebulous English Question now looms in the wings.

Like the Irish and Scottish Questions, it is the child of a complex history. England became a united kingdom in Anglo-Saxon times. It faced external enemies, notably invading Danes, but its kings ruled their own territory with an iron hand. The Norman Conquest substituted francophone rulers and a francophone nobility for these Anglo-Saxon kings; the new elite spoke French, sent their sons to France to be educated and polished and, in many cases, owned territory in France. Simon de Montfort, once credited with founding the English parliament, was a French nobleman as well as an English one. But the kingdom remained united. The Celtic people who had once inhabited what is now England were driven out by the Anglo-Saxons; Lloegr, the Welsh word for England, means “the lost land”. It stayed lost after the Conquest; and indeed, the Norman rulers of England pushed further into Wales than their Anglo-Saxon predecessors had done.

United did not mean peaceful or stable. Henry II, William the Conqueror’s great-grandson, ruled a vast Continental empire stretching from the English Channel to the Pyrenees, as well as England. Inept kings, uppity barons, an aggressive church, restive peasants, a century-long war with France and bitter dynastic rivalries undermined his achievement. But there was no English equivalent to the powerful, de facto independent duchies of Burgundy or Aquitaine in what is now France, or to the medley of principalities, city states and bishoprics that divided Germans and Italians from each other until well into the 19th century. That was still true after the Welshman Henry Tudor defeated Richard III at the Battle of Bosworth in 1485 and seized the English crown as Henry VII. His son (who became Henry VIII) was not content with keeping England united. Having broken with the Catholic Church when the Pope refused to annul his first marriage, he made himself head of the Church in England and proclaimed that the realm of England was an “empire”, free from all external authority.

From the upheavals of Henry’s reign and the subtle compromises of his daughter Elizabeth’s emerged the Church of England – an institutional and theological third way between the Catholicism of Rome, on the one hand, and the Protestantism of John Calvin’s Geneva and Martin Luther’s Germany on the other. The Church of England has spoken to and for the English people ever since. Sometimes it has spoken feebly and complacently, as in the 18th century. At other times it has been outspoken and brave, as in the Second World War, when William Temple was the archbishop of Canterbury, and during the 1980s, when a Church of England commission excoriated the Thatcher era’s “crude exaltation” of “individual self-interest”. Despite (or perhaps because of) the subtle compromises embodied in it, the Anglican Church has been prone to schism. “High Church” Anglicans have stressed its Catholic inheritance; followers of the “low” Church have insisted on its Protestantism. Two charismatic High Anglican priests – John Henry Newman and Henry Edward Manning – converted to Catholicism and ended as cardinals.

Yet these schisms did not affect the laity or diminish the Church’s role in English life. From the end of the English civil wars in 1660 to the late 19th century, England was ruled by the Anglican landed class, the most relaxed and confident governing class in Europe. A bien-pensant, easygoing and undogmatic latitudinarianism shaped relations between church and state. Doctrinal precision was tiresome, even a little vulgar. Wherever possible, differences were fudged: the very Thirty-Nine Articles of the Anglican Church are a fudge. There were exceptions. Gladstone’s restless, sometimes tormented religiosity and baffling combination of high ideals with low cunning could hardly have been less easygoing. And as the 19th century wore on, Protestant dissenters, Catholics and even Jews and unbelievers were slowly incorporated into the political nation. Joseph Chamberlain, who did more to make the political weather than any other leader in the late 19th and early 20th centuries, and contrived to split both the Liberal and the Conservative parties, was a Unitarian, contemptuous of fudge.

However, the style and mood of English governance were still quintessentially Anglican. Fudge prevailed. Trollope’s political novels are a hymn to fudging. Disraeli, ethnically Jewish, though baptised into the Church of England, was a fudger to his fingertips. In his low-cunning moods, even Gladstone was not above fudging. After the Act of Union between England and Scotland in 1707 the monarchy itself rested on a mountain of fudge: the monarch was an Anglican in England, but a Presbyterian in Scotland. The English and Scottish parliaments were merged into a British parliament, but because England was far more populous and far richer than Scotland, it was the English parliament writ large, and embodied English constitutional doctrine. Equally, the Scots became junior partners in a new British empire, ultimately controlled by the Anglican elite. It won the race for empire against France, but the stiff-necked, pernickety legalism of successive London governments drove its colonies on the seaboard of what is now the United States into revolt and eventual independence.

The Anglican elite learned their lesson. Thereafter, imperial governance was English governance writ large. From an early stage the colonies of settlement, later known as the “white dominions”, were, in effect, self-governing. At first sight, India, “the brightest jewel in the British crown”, was an exception. It was acquired by force and maintained, in the last resort, by force. The Great Rebellion of 1857, once known as the Indian Mutiny, was brutally suppressed. In the Amritsar Massacre of 1919, Brigadier General Dyer ordered his troops to fire on an unarmed and peaceful crowd; they went on firing until their ammunition was exhausted. But the most astonishing feature of the British Raj is that a tiny sliver of British soldiers and administrators somehow managed to govern a subcontinent populated by roughly 250 million subjects. Force alone could not have done this. The Raj depended on indirect rule, on adroit accommodation to local pressures. It would not have survived without the collaboration of Indian elites, and the price of collaboration was a willingness to temper the wind of imperial power to the shorn lamb of Indian hopes and fears.

***

The Anglo-British story echoed the Indian story. The political, administrative and financial elites in Westminster, Whitehall and the City of London viewed the kingdom they presided over through an Indian lens. British subjects in the mother country were treated like Indian subjects in the Raj. Force lurked in the background, but most of the time it stayed in the background. The Peterloo Massacre of 1819, in which mounted cavalry charged into a crowd of as many as 80,000 people demonstrating for greater parliamentary representation at St Peter’s Field in Manchester, was a paler precursor of the Amritsar Massacre; the Rhondda township of Tonypandy, where hussars helped crush a “riot” by striking miners in 1910, lived on in the folk memory of the labour movement for decades. Yet these were exceptions, just as Amritsar was an exception.

Co-option, accommodation and collaboration between the governing elites and lesser elites beyond them were the real hallmarks of British governance. The French saying that there is more in common between two deputies, one of whom is a communist, than there is between two communists, one of whom is a deputy, also applied to Britain. In the cosy Westminster village, insurgent tribunes of the people, from the popular radical John Bright to the fulminating socialist Michael Foot, slowly morphed into grand and harmless old men. Outside the village, subjects were inescapably subjects, not citizens, just as their Indian counterparts were. Sovereignty, absolute and inalienable, belonged to the Crown-in-Parliament, not to the people. And the whole edifice was held together by layer upon layer of fudge.

Now the fudge is beginning to dissolve. The Raj disappeared long ago. The fate of steelworkers in South Wales depends on decisions by an Indian multinational whose headquarters are in Mumbai. The empire on which the sun never set is barely a memory. Unlike her great-great-grandmother Queen Victoria, the present Queen is not an empress; she has to make do with leading the Commonwealth. In law, the Crown-in-Parliament remains absolutely sovereign and the peoples of the United Kingdom are still subjects, not citizens. But legal principles and political realities diverge. The Anglo-British state whose capital is London and whose parliament stands on the fringes of the Thames is no longer the sole institution that shapes and reflects the political will of the people over whom it presides. There are now four capital cities, four legislatures, four governments and four political systems in the United Kingdom.

The devolved administrations in the non-English nations of the kingdom control swaths of public policy. The parties that lead them vary enormously in ideology and history. The Scottish National Party, which has governed Scotland for nearly nine years, stands for an independent Scotland. In Wales, Labour has been the strongest party since devolution, but it and Plaid Cymru (the “Party of Wales”) have already formed one coalition and may well form another after the elections to the Welsh Assembly next month. No great changes are likely. Almost certainly Wales will continue to be a social-democratic candle in a naughty world. Since the Belfast Agreement, Northern Ireland has been governed by a power-sharing executive, representing both the republican tradition, embodied in Sinn Fein, and the loyalist tradition, embodied in the Democratic Unionist Party. The sovereign Westminster parliament has the legal right to repeal the devolution statutes, but doing so would amount to a revolution in our uncodified constitution and would destroy the Union.

England is a stranger at the feast. It towers above the others in wealth, in population and in political clout. It has almost 84 per cent of the UK population. Scotland has just under 8.5 per cent, Wales just under 5 per cent and Northern Ireland less than 3 per cent. Yet there is no English parliament or government. In times past, English people have often treated the words “English” and “British” as synonyms, but devolution to Scottish, Welsh and Northern Irish legislatures and administrations has made a nonsense of this lazy conflation.

***

England and the English now face the primordial questions that face all self-conscious political communities: “Who are we?”, “Who do we want to be?” At bottom, these questions are philosophical, in a profound sense moral, not economic or institutional. They have to do with the intangibles of culture and sentiment, not the outward forms that clothe them. In stable and settled political communities they are rarely discussed. They don’t need to be. But the political community that is England is neither stable nor settled. Fuelled in part by resentment of the alleged unfairness of the devolution process and in part by the psychic wound left by the end of the Anglo-British empire, an inchoate, grouchy English nationalism is now a force to be reckoned with. St George’s flags flying on 23 April; the extraordinary rise of Ukip; David Cameron’s panic-stricken attempt to “renegotiate” Britain’s role in the European Union – all tell the same story: the “secret people of England”, as G K Chesterton called them, are secret no longer.

But that is not an answer to my questions. It only shows that they are urgent. At the moment, two answers hold the field. The first – the answer embodied in the Cameron government’s “Project Fear” over the UK’s membership of the EU – is essentially deracinated. For the globetrotting super-rich, the financial services sector, the Bank of England and the managers of the Union state, England consists of London and the more salubrious parts of the south-east. The answer to the English Question is that there is no such question. The notion that the English have to decide who they are and who they want to be is a backward-looking fantasy. Globalisation has overwhelmed the specificities of English culture and experience. The English buy and sell in the global marketplace and they face global threats. Membership of an EU made safe for market fundamentalism offers the best available route to security and prosperity in an ever more globalised world.

The second answer – the answer implicit in Eurosceptic rhetoric – is romantically archaic. At its heart is a vision of England as a sea-girt and providential nation, cut off from the European mainland by a thousand years of history and a unique constitutional arrangement. It harks back to Shakespeare’s hymn to England as a “jewel set in the silver sea”; to Henry Newbolt’s poem “Drake’s Drum”, evoking the memory of gallant English mariners driving the top-heavy galleons of the Spanish Armada up the Channel to their doom; and to Nelson dying gloriously at Trafalgar at the climax of his greatest victory. It fortified Margaret Thatcher during the nail-biting weeks of the Falklands War; it inspired Enoch Powell’s passionate depiction of post-imperial England as the reincarnation of the England of Edward the Confessor: an England whose unity was “effortless and unconstrained” and which accepted the “unlimited supremacy of Crown-in-Parliament so naturally as not to be aware of it”. As Powell saw more clearly than anyone else, this vision rules out EU membership.

No one with progressive instincts can possibly be satisfied with either of these answers. The great question is whether there is a better one. I think there is, but I can’t pretend that it is easy or comfortable. It is republican in spirit – which does not entail getting rid of the monarchy, as the many Continental monarchies show. It embodies a tradition stretching back to England’s brief but inspiring republican experiment during the civil wars of the 17th century, and before that to Renaissance Italy and Republican Rome. Central to it is the notion of “neo-Roman liberty”: of liberty as freedom from domination, from dependence on another’s will. John Milton was its most eloquent English exponent, in prose and verse, but it also inspired Tom Paine’s contempt for hereditary rule and the “foppery” that went with it. In the 20th century its most engaging champion was R H Tawney, the ethical socialist, economic historian and foe of the “religion of inequality”, its “great God Mumbo-Jumbo” and the “servile respect for wealth and social position” it inculcated.

The goal is clear: a republican England in a republican Britain and a republican Britain in a republican Europe. The obstacles are formidable. As the founders of the American republic discovered, republican liberty entails federal union, combining diversity at the base with unity at the centre; and for that there are few takers. But Gramsci was right. Pessimism of the intellect should go hand in hand with optimism of the will. There is all too much pessimism of the intellect on the British left. It is time for some optimism of the will.

David Marquand’s most recent book is “Mammon’s Kingdom: an Essay on Britain, Now” (Allen Lane)

This article first appeared in the 08 April 2016 issue of the New Statesman, The Tories at war