
A last chance

Leaders meet in Washington on 15 November for a summit to attempt to resuscitate a world finance system

The giant video screen at 745 Seventh Avenue, in Manhattan, is still lit up: only now, instead of the old Lehman Brothers promo, with its tossing oceans and desert sunsets, it projects the ice-blue bling of Barclays Capital, five-storeys high. The problem is, though the lights are still on for finance capital, ideologically there’s nobody home.

Lehman's bankruptcy marked the end of a 20-year experiment in financial deregulation. But it was Alan Greenspan's congressional testimony, a month later, that marked the collapse of something bigger: the neoliberal ideology that has underpinned it all.

It was Greenspan who had begun ripping away restrictions on financial speculation and investment banking in 1987. Last month, he said: "I have found a flaw. I don't know how significant or permanent it is. But I have been very distressed by that fact . . . Those of us who have looked to the self-interest of lending institutions to protect shareholders' equity, myself especially, are in a state of shock and disbelief."

The belief in self-interest as the guiding principle of commerce is as old as Adam Smith. What happened with the Anglo-Saxon model of capitalism was something different: the principle of rational self-interest was elevated to replace regulation and the state. Selfishness became a virtue. Inspired by Ayn Rand's credo - "I will never live for the sake of another man, nor ask another man to live for mine" - the giants of global finance revelled in amoralism. Morgan Stanley boss John Mack's legendary trading-floor motto - "There's blood in the water, let's go kill somebody" - sums up the era.

But the theory was flawed. Instead of safeguarding the property of shareholders, self-regulation drove the system to the point of collapse. Trillions of dollars worth of capital has been destroyed. "My view of the range of dispersion of outcomes has been shaken," Greenspan conceded. That's a logical response when the range of outcomes is clustered around the collapse of the savings system, the evaporation of global credit and the bankruptcy of most banks.

But selfishness was not the only tenet of neoliberalism. Any definition of the term would include: a belief in the market as the only guarantor of prosperity and democracy; the futility of state intervention in pursuit of social justice; the creative destruction of cherished institutions and stable communities; the shrinkage of the state to regulatory functions only, and then as minimal as possible.

And the problem for the G20 leaders who will assemble at the Washington summit on 15 November is this: every single one of them has, to a greater or lesser extent, bought into the neoliberal ideology. It has dictated the direction of travel even in economies such as Brazil, India, Indonesia and China, classified as "mostly unfree" on the neoliberal league table.

The summit's most pressing task is to come up with a co-ordinated crisis response: for all the rhetoric, this is a firefighting operation not a second Bretton Woods. In the end, the route to a Bretton Woods-style settlement may be impassable for the weakened, multi-polar capitalism represented by the G20. But, even to begin that journey, there must be an honest reckoning with neoliberalism.

An ideology does three things: it justifies the economic dominance of a ruling group; it is transmitted through that group's control of the media and education; and it describes the experience of millions of people accurately enough for them to accept it as truth. But it does not have to be logical. For this reason, picking logical flaws in neoliberalism has been an exercise with diminishing returns.

For example, Milton Friedman's assertion that free-market capitalism and democracy are mutually reinforcing always looked a non-sequitur after he hotfooted it to Chile in 1975, personally urging General Pinochet to inflict a neoliberal economic "shock", even as the secret police were administering electric shocks to the genitals of oppositionists. But his theories continued to inspire policymakers.

Instead of logic, any balance sheet of neoliberalism has to begin from its outcomes. I will list five negative outcomes for countries following the Anglo-Saxon model:

In the first place, rising inequality. Between 1947 and 1973 the income of the poorest fifth of US families grew 116 per cent, higher than any other group. From 1974 to 2004 it grew by just 2.8 per cent. In the UK, the share of national income received by the bottom 10 per cent fell from 4.2 per cent in 1979 to 2.7 per cent in 2002.

Second, the replacement of high wages by high debt. The real wages of the average American male worker are today below what they were in 1979; and for the poorest 20 per cent, much lower. In 1979, personal household debt was 46 per cent of America's GDP; now it is 98 per cent. In the UK, real household incomes grew more slowly than in the postwar boom until early this decade, and have since fallen. The debt pattern, however, followed the US: 30 years ago British households were in debt to the tune of 20 per cent of GDP; now it is 80 per cent.

Third is the redistribution of profits from non-financial companies to the finance sector. In 1960s America, the pretax profits of financial firms made up 14 per cent of corporate profits; now they make up 39 per cent. Most of this profit is not generated from financing productive business: the world's total stock of financial assets is three times as large as global GDP. In 1980, it was about equal to GDP.

The new power of finance capital not only creates asset bubbles, as with the dotcom, housing and commodity bubbles of the past decade, but it allows speculative capital to descend on individual companies, countries and industry sectors, smash them and move on. I present the current economic plight of Hungary as Exhibit A.

Fourth is the growth of personal and financial insecurity, the destruction of social capital and the resulting rise in crime. If you want data, then the four stark pages of membership graphs at the end of Robert Putnam's celebrated book Bowling Alone show the decline of almost every voluntary association in America during the neoliberal age. If you prefer qualitative research, walk the streets of any former industrial city at night.

Fifth is the relentless commercialisation of all forms of human life: the privatisation of drinking water that provoked the people of Cochabamba, Bolivia to revolt in 2000; the creation of a private army of 180,000 military contractors in Iraq, unaccountable to international law. In these and many other instances, the functions of the state have been turned over to private companies to the financial detriment of taxpayers, the material detriment of consumers and the loss of democratic accountability.

But there is a plus side. Since 1992, there has been stability and growth across the OECD countries and beyond, albeit lower than the average growth achieved during the postwar boom years. There has been a marked fall in absolute poverty, with the number of people living on less than $2 a day falling by 52 per cent in Asia and 30 per cent in Latin America (though rising by 3 per cent in Africa) between 1982 and 2002. And though the data is mixed, many of neoliberalism's critics accept that inequality declines as per-capita GDP growth improves.

There has been a huge movement of humanity from the farm to the factory, and 200 million people have migrated from the poor world to the rich. Access to the financial system has brought rising liquidity: access to homeownership and overdrafts for families on low pay was real, whatever its macroeconomic outcome. And above all, the musty cultural and institutional barriers that made life a misery for the young in the 1960s and 1970s are largely gone; the flipside of commoditisation has been the decline of dependency and paternalism in social life.

And this has been the source of neoliberalism's strength as an ideology: borrow big-time, negotiate your own salary, duck and dive, lock your door at night. That is the new way of life for the world's workforce. My father's generation, the generation of organised workers which saw industry and social solidarity destroyed in the 1980s, could never really accept it. But hundreds of millions of people under the age of 40 know nothing else. And if you live in a Kenyan slum or a Shenzhen factory, you have seen your life chances rise spectacularly higher than those of your father's generation, even if the reverse is true in, say, Salford or Detroit.

Until 15 September 2008 (the day Lehman Brothers filed for Chapter 11 bankruptcy protection, the largest bankruptcy in US history), the left and the right were engaged in a political debate that revolved around the balance of these positive and negative impacts. Today that debate is over: we now know that neoliberalism nearly crashed the whole financial system. I will repeat, because the adrenalin rush has subsided and it is easy to forget: neoliberalism brought the world to the brink of an economic nuclear winter. Not by accident but because of a flaw in its central mechanism. It is for this reason that President Sarkozy (once labelled "Monsieur Thatcher" by the French left) declared it dead - not flawed - but dead. "The idea that markets were always right was mad . . . The present crisis must incite us to re-found capitalism on the basis of ethics and work . . . Laissez-faire is finished. The all-powerful market that always knows best is finished," he said.

So what comes next? Though governments are scrambling to deploy Keynesian anti-crisis measures - from George Bush's tax cut to Gordon Brown's borrowing hike - it is axiomatic that the developed world cannot return to the way things were before the 1980s: the Keynesian model broke down spectacularly, and could cure neither high inflation nor economic stagnation.

It is clear, from the sheer level of pain and trauma inflicted by the changes of the past 20 years, that we have lived through the birth of something. Its founding ideology was neoliberalism; its most tangible result was globalisation; and it was achieved through class struggle by the rich against the poor. But none of these facts can encompass the scale of change.

Because none of them allows for the most fundamental change - that information has become a primary factor of production. Computing power has doubled every 24 months; the internet and mobile telephony have, in the past ten years, altered the patterns of human life more profoundly than any single economic policy. Info-capitalism has been inadequately theorised: call it "post-Fordism", techno-capital or the knowledge economy - whatever the label, it remains the central fact of the early 21st century.

If you accept this, then the experience of neoliberalism looks less like the dawn of a free-market empire, more like the period between the invention of the factory system and the passing of the first effective factory legislation: between the establishment of Arkwright's mill at Cromford in 1771 and the Factory Act of 1844.

For much of that period, the pioneers of industrial capitalism believed that any regulation would kill the dynamism of the system. They too had a celebrity economist to justify their actions, namely Nassau Senior, the author of the theory that all profits were made in the last hour of the working day. The fate of capitalism, quipped reformer William Cobbett, depended on 300,000 little girls in Lancashire: "For it was asserted, that if these little girls worked two hours less per day, our manufacturing superiority would depart from us."

Child labour was abolished; minimal standards of order and humanity were imposed on the factories. But capitalism did not die - it took off. It is no accident, incidentally, that 1844 was also the year Britain, traumatised by recurrent financial panics, enshrined the supremacy of the central bank and the gold standard in legislation. If the parallel is valid, then the new regulations and institutions under discussion in Washington stand a chance not of killing info-capitalism but of unleashing it.

What are the intellectual sources for the system that will replace neoliberalism? Most of the prophets of doom in advance of the credit crunch were survivors from the Keynesian era: Paul Krugman, Joseph Stiglitz, George Soros, Nouriel Roubini, Morgan Stanley economist Stephen Roach. But with the partial exception of Stiglitz, they remain dislocated from the grass-roots opposition to neoliberalism. In turn, this opposition, dominated by the principles of anarchism and charity, has revelled in its own diversity and lack of engagement with state-level solutions.

As for the world's policymakers they, for now, resemble the Hoover administration in 1930, or if you are feeling really unkind, Chamberlain's British government in 1940. They are confronted by a crisis they did not think would happen. They are approaching it with the only tools they have - but they are the old tools: the old alliances, the old experts, the unreconstructed ideas and plans: Doha, Basel II, the Lisbon agenda. The IMF's conditions for bailing out eastern Europe - public spending cuts, interest-rate rises, privatisations - confirm the pattern.

The aim, made explicit during a speech on 28 October by Catherine Ashton, the EU's new trade commissioner, is to enact crisis measures while explaining to the public that "interventions and excessive use of public subsidies - while attractive today, will damage us tomorrow". This does not match the rhetoric coming out of Paris and Washington about the "end of trickle-down" and the death of laissez-faire; and it tends to ignore the fact that the most fundamental problem created by neoliberalism was not deregulation but the replacement of high wages by high debt. In other words, it is not the policy framework that is in trouble, it is the growth model.

There are three possible ways out. First, the revival of neoliberalism in a hair shirt: less addicted to the celebration of greed; with government spending temporarily replacing consumer debt as the driver of demand; and with some attempt at co-ordinated re-regulation. That is the maximum that can come out of the Washington summit.

Second, the abandonment of a high-growth economy: if it can't be driven by wages, debt or public spending then it can't exist. And if it can't exist in America, then Asia's model of high exports and high savings does not work, either. In previous eras the proposal to revert to a low-growth economy would have been regarded as simply barbarism and regression. Yet there is a strong sentiment among the anti-globalist and deep-green activists in favour of this solution, and it has found echoes in mass consciousness and micro-level consumer behaviour as the world has come to understand the dangers of global warming. Even a mainstream corporate economist, such as Morgan Stanley's Roach, has called for "a greater awareness of the consequences of striving for open-ended economic growth . . . This crisis is a strong signal that [high-growth] strategies are not sustainable."

The third alternative is the Minsky option. Hyman Minsky (1919-1996) was the godfather of modern financial crisis theory: his works, while largely ignored by politicians, are revered by both Marxists and hedge-fund managers. The "Minsky Moment" - a systemic financial crisis that crashes the real-world economy - was not only predicted in his work but theorised as a natural and intrinsic feature of capitalism. What we are going through now, Minsky argued, is the normal consequence of achieving growth and full employment through an unfettered private financial system.

But he had a solution - outlined in the chapters the hedge-fund managers skip and the Marxists dismiss: the socialisation of the banking system. This, he conceived, not as an anti-capitalist measure but as the only possible form of a high-consumption, stable capitalism in the future.

Minsky argued: "As socialisation of the towering heights is fully compatible with a large, growing and prosperous private sector, this high-consumption synthesis might well be conducive to greater freedom for entrepreneurial ability and daring than is our present structure."

Minsky never bothered to spell out the details of how it might be done. But there is no need to, now.

Stumbling through the underground passageways of 10 Downing Street on the morning of 8 October, I saw it done. Tetchy and bleary-eyed, fuelled by stale coffee and takeaway Indian food, British civil servants had designed and executed it in the space of 48 hours. Within days, much of the western world's banking system had been stabilised by massive injections of taxpayer credit and capital.

The problem is, though they have now been there, done that, the G20 politicians have no desire to get the T-shirt.

The G20 leaders will meet in the context of a global finance system on life support. Their impulse is to get it off the respirator as quickly as possible; to put things back to normal. But the ecosystem which sustained global finance in its previous form is also in crisis: easy credit and speculative finance were the oxygen, and they have gone.

The policy challenge, in short, is much more fundamental than is being recognised in the run-up to the G20 summit. Gordon Brown speaks of a "new global order" emerging out of Washington. But in reality he is talking about multilateral crisis-resolution mechanisms, not a rethink of the relationship between finance capital, growth and debt.

If the world's leaders seriously intend to "refound capitalism on the basis of ethics and work", there is plenty of source material to start brainstorming from.

I would throw this into the mix, from Franklin Roosevelt's Oval Office in January 1934: "Americans must forswear that conception of the acquisition of wealth which, through excessive profits, creates undue private power over private affairs and, to our misfortune, over public affairs as well. In building toward this end we do not destroy ambition . . . But we do assert that the ambition of the individual to obtain a proper security, a reasonable leisure, and a decent living throughout life is an ambition to be preferred to the appetite for great wealth and great power."

Paul Mason is economics editor of BBC Newsnight; his book "Meltdown: the End of the Age of Greed" is published by Verso in April 2009

Road to the summit

The summit, due to start in Washington DC on 15 November, was first proposed by President Nicolas Sarkozy of France at the United Nations General Assembly debate on 23 September. He urged reform of international institutions, warning that "the 21st-century world cannot be governed with institutions of the 20th century".

On 3 October the US enacted its $700bn bank bailout. EU leaders had initially been confident that their economies would be sufficiently resilient, but the world stock-market collapse now convinced them otherwise. The next day, France, Britain, Germany and Italy agreed to work together to support financial institutions, and issued a joint call for a G8 summit.

On 8 October Gordon Brown announced a rescue package for UK banks. Leaders of the G7 countries (the US, Japan, Germany, Britain, France, Italy and Canada) met in Washington on 11 October, issuing a five-point plan. On 12 October, in Paris, Sarkozy held an emergency meeting of the 15 eurozone leaders, the first such meeting since the launch of the euro. Unusually, Gordon Brown was also invited to attend, and a rescue plan based on the UK model was agreed. The following week, George Bush went to Italy, Germany and the UK in a bid to coordinate response to the turmoil, and on 14 October announced crisis talks to be held between Bush, Sarkozy and President José Manuel Barroso of the European Commission at Camp David four days later. Here, plans for the summit were unveiled.

Attending will be leaders of the G20, which includes the G7 and major developing nations such as China, India and Brazil, along with the head of the IMF and other international institutions. Of the African nations, only South Africa, the continent's biggest economy, will attend; on 27 October the African Union announced it would hold its own summit in response to the crisis.

Alyssa McDonald

This article first appeared in the 10 November 2008 issue of the New Statesman, Change has come


“I felt so frantic I couldn’t see my screen”: why aren’t we taking mental health sick days?

Some employees with mental health problems fake reasons for taking days off, or struggle in regardless. What should companies be doing differently?

“I would go to the loo and just cry my eyes out. And sometimes colleagues could hear me. Then I would just go back to my desk as if nothing had happened. And, of course, no one would say anything because I would hide it as well as I could.”

How many times have you heard sobbing through a work toilet door – or been the person in the cubicle?

Jaabir Ramlugon is a 31-year-old living in north London. He worked in IT for four years, and began having to take time off for depressive episodes after starting at his company in 2012. He was eventually diagnosed with borderline personality disorder last January.

At first, he would not tell his employers or colleagues why he was taking time off.

“I was at the point where I was in tears going to work on the train, and in tears coming back,” he recalls. “Some days, I just felt such a feeling of dread about going into work that I just physically couldn’t get up ... I wouldn’t mention my mental health; I would just say that my asthma was flaring up initially.”

It wasn’t until Ramlugon was signed off for a couple of months after a suicide attempt that he told his company what he was going through. Before that, a “culture of presenteeism” at his work – and his feeling that he was “bunking off” because there was “nothing physically wrong” – made him reluctant to tell the truth about his condition.

“I already felt pretty low in my self-esteem; the way they treated me amplified that”

Eventually, he was dismissed by his company via a letter describing him as a “huge burden” and accusing him of “affecting” its business. He was given a dismissal package, but feels an alternative role or working hours – a plan for a gradual return to work – would have been more supportive.

“I already felt pretty low in my self-esteem. The way they treated me definitely amplified that, especially with the language that they used. The letter was quite nasty because it talked about me being a huge burden to the company.”

Ramlugon is not alone. Over three in ten employees say they have experienced mental health problems while in employment, according to the Chartered Institute of Personnel and Development. Under half (43 per cent) disclose their problem to their employer, and under half (46 per cent) say their organisation supports staff with mental health problems well.

I’ve spoken to a number of employees in different workplaces who have had varying experiences of suffering from mental ill health at work.

***

Taking mental health days off sick hit the headlines after an encouraging message from a CEO to his employee went viral. Madalyn Parker, a web developer, informed her colleagues in an out-of-office message that she would be taking “today and tomorrow to focus on my mental health – hopefully I’ll be back next week refreshed and back to 100 per cent”.

Her boss Ben Congleton’s reply, which was shared tens of thousands of times, personally thanked her – saying it’s “an example to us all” to “cut through the stigma so we can bring our whole selves to work”.

“Thank you for sending emails like this,” he wrote. “Every time you do, I use it as a reminder of the importance of using sick days for mental health – I can’t believe this is not standard practice at all organisations.”

Congleton went on to write an article entitled "It's 2017 and Mental Health is still an issue in the workplace", arguing that organisations need to catch up:

“It’s 2017. We are in a knowledge economy. Our jobs require us to execute at peak mental performance. When an athlete is injured they sit on the bench and recover. Let’s get rid of the idea that somehow the brain is different.”

But not all companies are as understanding.

In an investigation published last week, Channel 5 News found that the number of police officers taking sick days for poor mental health has doubled in six years. “When I did disclose that I was unwell, I had some dreadful experiences,” one retired detective constable said in the report. “On one occasion, I was told, ‘When you’re feeling down, just think of your daughters’. My colleagues were brilliant; the force was not.”

“One day I felt so frantic I couldn’t see my screen”

One twenty-something who works at a newspaper echoes this frustration at the lack of support from the top. “There is absolutely no mental health provision here,” they tell me. “HR are worse than useless. It all depends on your personal relationships with colleagues.”

“I was friends with my boss so I felt I could tell him,” they add. “I took a day off because of anxiety and explained what it was to my boss afterwards. But that wouldn’t be my blanket approach to it – I don’t think I’d tell my new boss [at the same company], for instance. I have definitely been to work feeling awful because if I didn’t, it wouldn’t get done.”

Presenteeism is a rising problem in the UK. Last year, British workers took an average of 4.3 days off work due to illness – the lowest number since records began. I hear from many interviewees that they feel guilty taking a day off for a physical illness, which makes it much harder to take a mental health day off.

“I felt a definite pressure to be always keen as a young high-flyer and there were a lot of big personalities and a lot of bitchiness about colleagues,” one woman in her twenties who works in media tells me. “We were only a small team and my colleague was always being reprimanded for being workshy and late, so I didn’t want to drag the side down.”

Diagnosed with borderline personality disorder, which was then changed to anxiety and depression, she didn’t tell her work about her illness. “Sometimes I struggled to go to work when I was really sick. And my performance was fine. I remember constantly sitting there sort of eyeballing everyone in mild amusement that I was hiding in plain sight. This was, at the time, vaguely funny for me. Not much else was.

“One day I just felt so frantic I couldn’t see my screen so I locked myself in the bathroom for a bit then went home, telling everyone I had a stomach bug so had to miss half the day,” she tells me. “I didn’t go in the next day either and concocted some elaborate story when I came back.”

Although she has had treatment and moved jobs successfully since, she has never told her work the real reason for her time off.

“In a small company you don’t have a confidential person to turn to; everyone knows everyone.”

“We want employers to treat physical and mental health problems as equally valid reasons for time off sick,” says Emma Mamo, head of workplace wellbeing at the mental health charity Mind. “Staff who need to take time off work because of stress and depression should be treated the same as those who take days off for physical health problems, such as back or neck pain.”

She says that categorising a day off as a “mental health sick day” is unhelpful, because it could “undermine the severity and impact a mental health problem can have on someone’s day-to-day activities, and creates an artificial separation between mental and physical health.”

Instead, employers should take advice from charities like Mind on how to make the mental health of their employees an organisational priority. They can offer workplace initiatives like Employee Assistance Programmes (which help staff with personal and work-related problems affecting their wellbeing), flexible working hours, and clear and supportive line management.

“I returned to work gradually, under the guidance of my head of department, doctors and HR,” one journalist from Hertfordshire, who had to take three months off for her second anorexia inpatient admission, tells me. “I was immensely lucky in that my line manager, head of department and HR department were extremely understanding and told me to take as much time as I needed.”

“They didn’t make me feel embarrassed or ashamed – such feelings came from myself”

“They knew that mental health – along with my anorexia I had severe depression – was the real reason I was off work ... I felt that my workplace handled my case in an exemplary manner. It was organised and professional and I wasn’t made to feel embarrassed or ashamed from them – such feelings came from myself.”

But she still at times felt “flaky”, “pathetic” and “inefficient”, despite her organisation’s good attitude. Indeed, many I speak to say general attitudes have to change in order for people to feel comfortable about disclosing conditions to even the closest friends and family, let alone a boss.

“There are levels of pride,” says one man in his thirties who hid his addiction while at work. “You know you’re a mess, but society dictates you should be functioning.” He says this makes it hard to have “the mental courage” to broach this with your employer. “Especially in a small company – you don’t have a confidential person to turn to. Everyone knows everyone.”

“But you can’t expect companies to deal with it properly when it’s dealt with so poorly in society as it is,” he adds. “It’s massively stigmatised, so of course it’s going to be within companies as well. I think there has to be a lot more done generally to make it not seem like it’s such a big personal failing to become mentally ill. Companies need direction; it’s not an easy thing to deal with.”

Until we live in a society where it feels as natural taking a day off for feeling mentally unwell as it does for the flu, companies will have to step up. It is, after all, in their interest to have their staff performing well. When around one in four people in Britain experience mental ill health each year, it’s not a problem they can afford to ignore.

If your manager doesn’t create the space for you to be able to talk about wellbeing, it can be more difficult to start this dialogue. It depends on the relationship you have with your manager, but if you have a good relationship and trust them, then you could meet them one-to-one to discuss what’s going on.

Having someone from HR present will make the meeting more formal, and normally wouldn’t be necessary in the first instance. But if you didn’t get anywhere with the first meeting then it might be a sensible next step.

If you still feel as though you’re not getting the support you need, contact Acas or Mind's legal line on 0300 466 6463.

Anoosh Chakelian is senior writer at the New Statesman.
