
After God: What can atheists learn from believers?

Why do religious stories continue to mean so much to so many of us, even to the self-described “new, new atheists”?

Jonathan Derbyshire writes: Jeremy Bentham, his disciple John Stuart Mill once wrote, would always ask of a proposition or belief, “Is it true?” By contrast, Bentham’s contemporary Samuel Taylor Coleridge, Mill observed, thought “What is the meaning of it?” was a much more interesting question.

Today’s New Atheists – Richard Dawkins, Sam Harris, Daniel Dennett and the late Christopher Hitchens principal among them – are the heirs of Bentham rather than Coleridge. For them, religion – or the great monotheistic faiths, at any rate – is a bundle of beliefs (about the existence of a supernatural being, the origins of the universe and so on) whose claims to truth don’t stand up to rational scrutiny. And once the falsity of those beliefs has been established, they imply, there is nothing much left to say.

The New Atheists remind one of Edward Gibbon, who said of a visit to the cathedral at Chartres: “I paused only to dart a look at the stately pile of superstition and passed on.” They glance at the stately pile of story and myth bequeathed to humanity by religion and quickly move on, pausing only to ask of the benighted millions who continue to profess one faith or another that they keep their beliefs to themselves and don’t demand that they be heard in the public square.

Lately, however, we have begun to hear from atheists or non-believers who strike a rather different, less belligerent tone. These “New, New Atheists”, to borrow the physicist Jim Al-Khalili’s phrase, are the inheritors of Coleridge. They separate their atheism from their secularism and argue that a secular state need not demand of the religious that they put their most cherished beliefs to one side when they enter public debate; only that they shouldn’t expect those beliefs to be accepted without scepticism.

They treat religious stories differently, too – as a treasure trove to be plundered, in the case of Alain de Botton, or, in the case of the self-described “after-religionist” Richard Holloway, as myths that continue to speak to the human condition.


We have too often secularised badly

Alain de Botton

There is so much talk of the god-shaped hole, it is easy to forget that the challenge of our times is not to measure it, but to try to fill it – by which I mean, to import a range of ideas and practices from religion into the secular realm. Atheists should learn to rescue some of what is beautiful, touching and wise from all that no longer seems true. What is good within the faiths belongs to all of mankind, even the most rational among us, and deserves to be reabsorbed selectively by the supernatural’s greatest enemies. Religions are intermittently too useful, effective and intelligent to be abandoned to the religious alone.

There are three elements in particular that I believe we should “steal” from religion and reinvent for our times:

1. New priest

For centuries in the west, there was a figure in society who fulfilled a function that is likely to sound very odd to secular ears. The priest didn’t fulfil any material need; he was there to take care of that part of you called, rather unusually, “the soul”, by which we would understand the seat of our emotions and of our deep self.

Where have our soul-related needs gone? What are we doing with the material we used to go to a priest for? The deep self has naturally not given up its complexities and vulnerabilities simply because some scientific inaccuracies have been found in the tales of the five loaves and two fishes.

The most sophisticated response we have yet come up with is psychotherapy. It is to psychotherapists that we bring the same kind of problems as we would previously have directed at a priest: emotional confusion, loss of meaning, temptations of one kind or another and anxiety about mortality.

From a distance, psychotherapists look like they are already well settled in priestlike roles and that there is nothing further to be done or asked for. Yet there are a number of ways in which contemporary psychotherapy has failed to learn the right lessons from the priesthood and might benefit from a more direct comparison with it. For a start, therapy remains a minority activity, out of reach of most people: too expensive or simply not available. There have been laudable efforts to introduce therapy into the medical system, but progress is slow and vulnerable. The issue isn’t just economic. It is one of attitudes. Whereas Christian societies would imagine there was something wrong with you if you didn’t visit a priest, we usually assume that therapists are there solely for moments of extreme crisis – and are a sign that the visiting client might be a little unbalanced, rather than just human.

There is also, in a serious sense, an issue of branding. Therapy is hidden, unbranded, depressing in its outward appearance. The priests had far better clothes, and infinitely better architecture.

Modern psychotherapists’ understanding of how human beings work is immensely more sophisticated than that of priests. Nevertheless, religions have been expert at creating a proper role for the priest, as a person to talk to at all important moments of life, without this seeming like an unhinged minority activity. There is a long way to go before therapy fully plugs the gap opened up by the decline in the priesthood.

2. New gospels

When religious belief began to fracture in Europe in the early 19th century, the hope was that culture could replace religion as a tool to guide, humanise and console.

Claims that culture could stand in for scripture – that Middlemarch could take up the responsibilities previously handled by the Psalms, or the essays of Schopenhauer satisfy needs once catered to by Saint Augustine’s City of God – still have a way of sounding eccentric or insane in their combination of impiety and ambition.

Nevertheless, the proposition is not so much absurd as it is unfamiliar. The very qualities that the religious locate in their holy texts can often just as well be discovered in works of culture. Novels and historical narratives can adeptly impart moral instruction and edification. Great paintings do make suggestions about our requirements for happiness. Philosophy can usefully probe our anxieties and offer consolation. Literature can change our lives. Equivalents to the ethical lessons of religion lie scattered across the cultural canon.

So, why does the notion of replacing religion with culture, of living according to the lessons of literature and art as believers live according to the lessons of faith, continue to sound so peculiar to us? The fault lies with academia. Universities are entirely uninterested in training students to use culture as a repertoire of wisdom – a source that can prove of solace to us when confronted by the infinite challenges of existence, from a tyrannical employer to a fatal lesion on our liver.

We are by no means lacking in material that we might call into service to replace the holy texts; we are simply treating the material in a non-instrumental way. In other words, we are unwilling to consider secular culture religiously enough, in this sense, as a source of guidance.

3. New churches

You sometimes hear it said that art museums are our new churches. But, in practice, art museums abdicate much of their potential to function as new churches (places of consolation, meaning, community and redemption) through the way they handle the collections entrusted to them. While exposing us to objects of importance, they nevertheless seem unable to frame these in a way that links them powerfully to our inner needs.

What if modern museums of art kept in mind the example of the didactic function of Christian art? A walk through a museum of art should amount to a structured encounter with the ideas that are easiest for us to forget but most essential and life-enhancing to remember. The challenge is to rewrite the agendas for our art museums so that collections can begin to serve the needs of psychology as effectively as, for centuries, they served those of theology. Curators should attempt to put aside their deep-seated fears of instrumentalism and once in a while co-opt works of art to an ambition of helping us to get through life. Only then would museums be able to claim that they had completely fulfilled the excellent but as yet elusive ambition of becoming substitutes for churches in a secularising society.

The challenge facing atheists is how to separate many ideas and rituals from the religious institutions that have laid claim to them but don’t truly own them. Many of our soul-related needs are ready to be freed from the particular tint given to them by religions – even if, paradoxically, it is the study of religions which often holds the key to their rediscovery and rearticulation. Secularism is not wrong. It is just that we have too often secularised badly – inasmuch as, in the course of ridding ourselves of unfeasible ideas, we have surrendered unnecessarily many of the most useful and attractive parts of the faiths.

Our age has properly defined what the god-shaped hole is. We now need to fill it. This means no longer adding to the already daunting pile of books about atheism, but starting instead to try to make some practical things happen in the world.

Alain de Botton is the author of “Religion for Atheists” (Penguin, £9.99)

The world cannot be disenchanted

Francis Spufford

When Thomas Paine was dying in Greenwich Village in June 1809, two Presbyterian ministers popped by to suggest that he would be damned if he didn’t affirm his faith in Jesus Christ. “Let me have none of your popish stuff,” he said firmly. “Good morning.” Score one to Paine for exiting the world without compromising his convictions, yet what he said had made, on the face of it, no sense.

Faith in Christ as the path to salvation isn’t “popish” in the sense of being particular to Roman Catholicism. Paine was speaking to a pair of impeccable Protestants. What he was doing here was to act as a very early adopter of a perception that would influence later atheist understandings of the world enormously. He was suggesting, in one charged and revealing insult, that the original Protestant critique of Catholicism should be extended to the whole of historic Christianity. All of it should be reformed away; all of it, absolutely all of it, deserved the contempt that zealous Puritans had once felt for indulgences and prayer beads and “priestcraft”.

This post-Christian puritanism, largely oblivious now of its history, is highly visible in the New Atheism of the 1990s and 2000s, and especially in Richard Dawkins’s The God Delusion. Strange indifference (except at the margins) to all religions except Christianity? Check. Sense of being locked in righteous combat with the powers of darkness? Check. Puritanism, it turns out, can float free of faith and still preserve a vehement world-view, a core of characteristic judgements. The world, it says, is afflicted by a layer of corrupting gunk, a gluey mass of lies and mistakes that purports to offer mediation between us and meaning but actually obscures it and hides the plain outlines of that truth we so urgently need. Moreover, this hiding, this obscuring, is wilful and culpable, maintained on purpose for the benefit of hierarchs, bullies, men in golden hats everywhere. It is our duty to take up the wire wool of reason and to scrub, scrub, scrub the lies away. For no mediation is necessary. We may have – we must have – a direct vision of the essential state of things. We must see the world as if through pure, clear water, or empty air.

It is reassuring, in a way, to find this ancient continuity at work in the sensibility of Dawkins, Sam Harris, Daniel Dennett and Jerry Coyne. It kind of makes up for their willed ignorance of all the emotional and intellectual structures of faith (as opposed to the will-o’-the-wisp “popery” in their heads). Dawkins may be showing indifference to every word ever written about the differences between polytheism and monotheism when he declares that Yahweh is the same as Odin, and that all he wants “is one god less” – but he is also keeping up a 400-year-old campaign against idolatry. That distant sound you hear is Oliver Cromwell applauding.

However, the project is impossible – as impossible for the New Atheists as for every previous builder of a purified New Jerusalem. Direct, unmediated apprehension of truth is not available, except in the effortful special case of science. That gunk the New Atheists scrub at so assiduously is the inevitable matter of human culture, of imagination. People secrete it, necessarily, faster than it can be removed. Metaphors solidify into stories wherever the reformers’ backs are turned. We’ll never arrive at the Year Zero where everything means only what science says it should. Religion being a thing that humans as a species do continuously, it seems unlikely that we’ll stop, any more than we’ll stop making music, laws, poetry or non-utilitarian clothes to wear. Imagination grows as fast as bamboo in the rain. The world cannot be disenchanted. Even advocacy for disenchantment becomes, inexorably, comically, an enchantment of its own, with prophets, with heresies and with its own pious mythography.

I think our recent, tentative turn away from the burning simplicities of The God Delusion (and the like) represents a recognition of this. Alain de Botton’s discovery in religion of virtues and beauties that an atheist might want is an anti-puritan move, a reconciliation of unbelief with the sprouting, curling, twining fecundity of culture. I don’t expect the puritan call will lose its appeal to the young and the zealous, but maybe we are entering a phase of greater tolerance in which, having abandoned the impossible task of trying to abolish religion, atheists might be able to apply themselves to the rather more useful task of distinguishing between kinds that want to damn you and kinds that don’t.

Francis Spufford is the author of “Unapologetic: Why, Despite Everything, Christianity Can Still Make Surprising Emotional Sense” (Faber & Faber, £8.99)


Believing in a god is fine by me

Jim Al-Khalili

As a scientist, I have an unshakeable rationalist conviction that our universe is comprehensible; that mysteries are mysteries only because we have yet to figure them out. There is no need for a supernatural being to occupy the gaps in our understanding, because we will eventually fill them with new knowledge based on objective scientific truths: answers that are not based on mythologies, or cultural/historical whims, or personal biases, but arrived at by examining hypotheses, testing our theories to destruction and being prepared to abandon them if they conflict with empirical data. Scientists are constantly subjecting our world-view to scrutiny. This is the opposite of blind faith.

Such a sweeping statement is a little unfair, given that not all scientists are so prepared to abandon a dogmatic stance when proved wrong, and not everyone with religious faith follows it blindly – to think that they do is naive and insulting to the many people who constantly question their faith. If you hold a strong conviction that there is some deeper significance to the universe or a spiritual meaning to your life that is important to you, who am I to try to convince you otherwise?

Believing in a god is fine by me, if it is important to you. If you firmly believe this as an ontological truth, then it is rather pointless having a theological debate about it. But what I, and many other atheists, take issue with is the arrogant attitude that religious faith is the only means of providing us with a moral compass – that society dissolves without faith into a hedonistic, anarchic, amoral, self-gratifying decadence. This is not only nonsense, but intellectually lazy.

We still have a long way to go if we are to rid the world of the bigoted attitudes held and injustices carried out in the name of religion. But the tide is turning. I would argue that to be an atheist in Britain today is so mainstream that we can afford to become less strident in our criticism and more tolerant of those with a faith. I say this not because I am less committed to my secular views or because I have weaker conviction than others, but because I believe we are winning the argument. We should not have to defend our atheism any longer.

Don’t get the impression that I am arguing for complacency. It is just that here in the west we are now in a stronger position to change attitudes, to correct discriminatory laws and to make for a fairer society in which religion does not give one group an advantage or special privileges.

Our society is no longer predominantly religious. Atheists are the mainstream. This is precisely why we should set out our stall to be more tolerant and inclusive. There are many issues on which we cannot afford to be complacent or conciliatory, such as the evil intent of religious fanatics, the wrong-headedness of creationists or the many injustices carried out against women or minority groups in the name of barbaric medieval laws, but we can often be more effective in getting our message across with a softer approach. The New Atheists have laid the foundations; maybe it is time now for the “New, New Atheists”.

I am well aware that some other atheists would call me an accommodationist. However, this patronising term needs to be replaced, so I have thought long and hard in search of an alternative – a more appropriate one to define my brand of atheism – until I realised it has been under my nose all the time: it is called being a humanist.

Jim Al-Khalili is the president of the British Humanist Association and the author of “Quantum: a Guide for the Perplexed” (Phoenix, £10.99)

The biblical God is a starter kit

Karen Armstrong

Most of us are introduced to God at about the same time as we hear about Santa Claus, but over the years our views of Santa mature and change, while our notion of God often gets stuck at an infantile level.

As a result, “God” becomes incredible. Despite our scientific and technological brilliance, our religious thinking in the west is often remarkably undeveloped, even primitive, and would make Maimonides and Aquinas turn in their graves. They both insisted that God was not another being and that you could not even say that He (ridiculous pronoun!) existed, because our experience of existence is too limited. God, said Aquinas, is Being itself (esse se ipsum).

The biblical God is a “starter kit”; if we have the inclination and ability, we are meant to move on. Throughout history, however, many people have been content with a personalised deity – not because they “believed” in it but because they learned to behave, ritually and ethically, in a way that made it a reality. Religion is a form of practical knowledge, like driving or dancing. You cannot learn to drive by reading the car manual or the Highway Code; you have to get into the vehicle and learn to manipulate the brakes. The rules of a board game sound obscure and dull until you start to play, and then everything falls into place. There are some things that can be learned only by constant, dedicated practice. You may learn to jump higher and with more grace than seems humanly possible or to dance with unearthly beauty. Some of these activities bring indescribable joy – what the Greeks called ekstasis, a “stepping outside” the norm.

Religion, too, is a practical discipline in which we learn new capacities of mind and heart. Like premodern philosophy, it was not the quest for an abstract truth but a practical way of life. Usually religion is about doing things and it is hard work. Classical yoga was not an aerobic exercise but a full-time job, in which a practitioner learned to transcend the ego that impeded the ekstasis of enlightenment. The five “pillars” or essential practices of Islam are all activities: prayer, pilgrimage, almsgiving, fasting and a continual giving of “witness” (shahada) in everything you do that God (not the “gods” of ambition and selfishness) is your chief priority.

The same was once true of Christianity. The Trinity was not a “mystery” because it was irrational mumbo-jumbo. It was an “initiation” (musterion), which introduced Greek-speaking early Christians to a new way of thinking about the divine, a meditative exercise in which the mind swung in a disciplined way from what you thought you knew about God to the ineffable reality. If performed correctly it led to ekstasis. As Gregory of Nazianzus (329-90) explained to his Christian initiates: “My eyes are filled and the greater part of what I am thinking escapes me.” Trinity was, therefore, an activity rather than a metaphysical truth in which one credulously “believed”. It is probably because most western Christians have not been instructed in this exercise that the Trinity remains pointless, incomprehensible, and even absurd.

If you don’t do religion, you don’t get it. In the modern period, however, we have turned faith into a head-trip. Originally, the English word “belief”, like the Greek pistis and the Latin credo, meant “commitment”. When Jesus asked his followers to have “faith”, he was not asking them to accept him blindly as the Second Person of the Trinity (an idea he would have found puzzling). Instead, he was asking his disciples to give all they had to the poor, live rough and work selflessly for the coming of a kingdom in which rich and poor would sit together at the same table.

“Credo ut intellegam – I commit myself in order that I may understand,” said Saint Anselm (1033-1109). In the late 17th century, the English word “belief” changed its meaning and became the intellectual acceptance of a somewhat dubious proposition. Religious people now think that they have to “believe” a set of incomprehensible doctrines before embarking on a religious way of life. This makes no sense. On the contrary, faith demands a disciplined and practical transcendence of egotism, a “stepping outside” of the self which brings intimations of transcendent meaning that make sense of our flawed and tragic world.

Karen Armstrong is the author of “The Case for God: What Religion Really Means” (Vintage, £9.99)

The word to grasp here is myth

Richard Holloway

No matter how they answer the God question, generous-minded people could profit from adopting an attitude of critical sympathy towards religion and maybe even taking the odd dip into it – provided they heed Canon William Vanstone’s warning that the Church is like a public swimming pool, where most of the noise comes from the shallow end.

Most religions have two main departments of thought. The first calls itself “natural” theology because it recognises that it is in the nature of human beings to ask ultimate questions about the universe in which they find themselves.

Apart from being more hopeful about finding positive answers to these questions than less committed searchers, natural theologians go over the same ground as philosophers and are no better at arriving at absolutely convincing conclusions than the philosophers are, which is why the exercise usually ends up at a kind of graded agnosticism that stretches from almost-atheism to almost-theism but never absolutely nails down either.

If you need personalities to define the gradations, Richard Dawkins fits the almost-atheism end and Roger Scruton the almost-theism end. Incidentally, it is worth remembering that both of these thinkers are subtler in the positions they hold on these complex matters than most people give them credit for.

So far, so inconclusive. It is the next move in the religious enterprise that gets interesting. This is where theologians introduce the idea of revelation. “Revealed” theology is the department where we try not to figure out whether there is a god, but to work out the meaning of the messages that the god has sent us from beyond to answer the questions we are unable to answer. This is where sacred texts come into play, as well as the institutions that accrete round them to protect and promote them. Revelation is what you get when you go to the synagogue or church or the mosque – all those instructions from God to do this or abjure that – and it is where things can get both frustrating and interesting for unbelievers.

The big frustration is how to deal with the circularity of the claims that are made by the exponents of revealed theology. If you ask them how they know that the words they quote came from God and not just another human being, the answer comes back, “Because the Bible or the Quran or the Whatever tells us so” – and we are no further on.

A good approach here is not to try to stop the revelation argument from going round and round but to ask a different question, thus: given that there probably is no God, where did all this stuff come from? To which the obvious answer is that it came from us. All these sacred texts are creations of the human imagination, works of art crafted by us to convey meaning through story.

So it’s a mistake to do what most unbelievers usually do at this point, which is to dismiss them as fairy tales and thereby deprive themselves of a rich resource for exploring the heights and depths of the human condition. The word to grasp here is myth: a myth is a story that encodes but does not necessarily explain a universal human experience.

The wrong question to ask of a myth is whether it is true or false. The right question is whether it is living or dead, whether it still speaks to our condition. That is why, among all the true believers in church this Easter, there will be thousands of others who are there because they need, yet again, to express the hope that good need not always be defeated by evil.

Richard Holloway was the bishop of Edinburgh from 1986 to 2000. He is the author of “Leaving Alexandria: a Memoir of Faith and Doubt” (Canongate, £8.99)

This article first appeared in the 25 March 2013 issue of the New Statesman, After God


Q&A: What are tax credits and how do they work?

All you need to know about the government's plan to cut tax credits.

What are tax credits?

Tax credits are payments made regularly by the state into bank accounts to support families with children, or those who are in low-paid jobs. There are two types of tax credit: the working tax credit and the child tax credit.

What are they for?

To redistribute income to those less able to get by, or to provide for their children, on what they earn.

Are they similar to tax relief?

No. They don’t have much to do with tax. They’re more of a welfare thing. You don’t need to be a taxpayer to receive tax credits. It’s just that, unlike other benefits, they are based on the tax year and paid via the tax office.

Who is eligible?

Anyone aged over 16 (for child tax credits) and over 25 (for working tax credits) who normally lives in the UK can apply for them, depending on their income, the hours they work, whether they have a disability, and whether they pay for childcare.

What are their circumstances?

The more you earn, the less you are likely to receive. Single claimants must work at least 16 hours a week. Let’s take a full-time worker: if you work at least 30 hours a week, you are generally eligible for working tax credits if you earn less than £13,253 a year (if you’re single and don’t have children), or less than £18,023 (jointly as part of a couple without children but working at least 30 hours a week).

And for families?

A family with children and an income below about £32,200 can claim child tax credit. It used to be that the more children you had, the more you were eligible to receive – but in his most recent Budget George Osborne limited child tax credit to two children.

How much money do you receive?

Again, this depends on your circumstances. The basic working tax credit payment for a single claimant, or for a joint claim by a couple, is £1,940 for the tax year. You can then receive extra, depending on your circumstances. For example, single parents can receive up to an additional £2,010 on top of the basic £1,940 payment; people who work more than 30 hours a week can receive up to an extra £810; and disabled workers up to £2,970. The average tax credit award is £6,340 per year. Child tax credit claimants get £545 per year as a flat payment, plus £2,780 per child.
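As a rough illustration, the elements above stack into a maximum award. This is a hypothetical sketch using only the figures quoted in this article; a real award depends on many more factors and is also reduced ("tapered") once income passes a threshold:

```python
# Hypothetical illustration using the per-element figures quoted above
# (£ per tax year). Not the real HMRC calculation.
ELEMENTS = {
    "basic": 1_940,        # single claimant, or a joint claim by a couple
    "lone_parent": 2_010,  # up to this much extra for single parents
    "30_hours": 810,       # up to this much extra for 30+ hours a week
}

# Maximum working tax credit for a single parent on 30+ hours a week,
# before any income-based tapering:
max_award = sum(ELEMENTS.values())
print(max_award)  # 4760
```

So a single parent working full-time could, before tapering, receive up to £4,760 in working tax credit under these figures.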

How many people claim tax credits?

About 4.5m people – and the vast majority of them (around 4m) have children.

How much does it cost the taxpayer?

The estimate is that they will cost the government £30bn in the 2015/16 tax year. That’s around 14 per cent of the £220bn welfare budget, which the Tories have pledged to cut by £12bn.

Who introduced this system?

New Labour. Gordon Brown, when he was Chancellor, developed tax credits in his first term. The system as we know it was established in April 2003.

Why did they do this?

To lift working people out of poverty, and to remove the disincentives to work believed to be embedded in the welfare system. The tax credit system made it more attractive for people depending on benefits to work, and gave those in low-paid jobs a helping hand.

Did it work?

Yes. Tax credits’ biggest achievement was lifting a record number of children out of poverty since the war. The proportion of children living below the poverty line fell from 35 per cent in 1998/9 to 19 per cent in 2012/13.

So what’s the problem?

Well, it’s a bit of a weird system in that it lets companies pay wages that are too low to live on, with the state topping them up. Many also criticise tax credits for allowing the minimum wage – also brought in by New Labour – to stagnate (i.e. not keep pace with inflation). David Cameron has called the system of taxing low earners and then handing them some money back via tax credits a “ridiculous merry-go-round”.

Then it’s a good thing to scrap them?

It would be fine if all those low earners and families struggling to get by were given support in place of tax credits – a living wage, for example.

And that’s why the Tories are introducing a living wage...

That’s what they call it. But it’s not. The Chancellor announced in his most recent Budget a new minimum wage of £7.20 an hour for over-25s, rising to £9 by 2020. He called this the “national living wage” – it’s not, because the current living wage (which is calculated by the Living Wage Foundation, and currently non-compulsory) is already £9.15 in London and £7.85 in the rest of the country.

Will people be better off?

No. Quite the reverse. The IFS has said this slightly higher national minimum wage will not compensate working families who will be subjected to tax credit cuts; it is arithmetically impossible. The IFS director, Paul Johnson, commented: “Unequivocally, tax credit recipients in work will be made worse off by the measures in the Budget on average.” It has been calculated that 3.2m low-paid workers will have their pay packets cut by an average of £1,350 a year.

Could the government change its policy to avoid this?

The Prime Minister and his frontbenchers have been pretty stubborn about pushing on with the plan. In spite of criticism from all angles – the IFS, campaigners, Labour, The Sun – Cameron has ruled out a review of the policy in the Autumn Statement, which is on 25 November. But there is an alternative. The chair of parliament’s Work & Pensions Select Committee and Labour MP Frank Field has proposed what he calls a “cost neutral” tweak to the tax credit cuts.

How would this alternative work?

Currently, if your income is less than £6,420, you will receive the maximum amount of tax credits. That threshold is called the gross income threshold. Field wants to introduce a second gross income threshold of £13,100 (what you earn if you work 35 hours a week on minimum wage). Those earning a salary between those two thresholds would have their tax credits reduced at a slower rate on whatever they earn above £6,420 up to £13,100. The percentage of what you earn above the basic threshold that is deducted from your tax credits is called the taper rate, and it is currently at 41 per cent. In contrast to this plan, the Tories want to halve the income threshold to £3,850 a year and increase the taper rate to 48 per cent once you hit that threshold, which basically means you lose more tax credits, faster, the more you earn.
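The taper mechanics described above can be sketched numerically. This is an illustrative simplification, not the real HMRC calculation: it uses only the thresholds and taper rates quoted in this article, assumes a notional maximum award, and ignores Field’s proposed second threshold:

```python
def tapered_award(income, max_award, threshold, taper_rate):
    """Reduce a maximum award by taper_rate of every pound
    earned above the gross income threshold (floored at zero)."""
    excess = max(0, income - threshold)
    return max(0.0, max_award - taper_rate * excess)

# A claimant with a notional maximum award of £6,000 earning £10,000 a year:
current = tapered_award(10_000, 6_000, 6_420, 0.41)   # current: £6,420 threshold, 41% taper
proposed = tapered_award(10_000, 6_000, 3_850, 0.48)  # proposed: £3,850 threshold, 48% taper
print(round(current, 2), round(proposed, 2))  # 4532.2 3048.0
```

Under the proposed figures this hypothetical claimant would lose roughly £1,500 more of the award than under the current scheme – the direction of the effect the IFS calculations above describe: a lower threshold and a steeper taper mean you lose more tax credits, faster.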

When will the tax credit cuts come in?

They will be imposed from April next year, barring a U-turn.

Anoosh Chakelian is deputy web editor at the New Statesman.