
Why the cult of hard work is counter-productive

From footballers’ work rates to the world of Big Data, the cult of “productivity” seems all-pervasive – but doing nothing might be the best thing for your well-being and your brain.

Loafing around can be an act of dissent against
the ceaseless demands of capitalism.
Illustration: Matt Murphy/Handsome Frank

Recently, I saw a man on the Tube wearing a Nike T-shirt with a slogan that read, in its entirety, “I’m doing work”. The idea that playing sport or doing exercise needs to be justified by calling it a species of work illustrates the colonisation of everyday life by the devotion to toil: an ideology that argues cunningly in favour of itself in the phrase “work ethic”.

We are everywhere enjoined to work harder, faster and for longer – not only in our jobs but also in our leisure time. The rationale for this frantic grind is one of the great unquestioned virtues of our age: “productivity”. The cult of productivity seems all-pervasive. Football coaches and commentators praise a player’s “work rate”, which is thought to compensate for a lack of skill. Geeks try to streamline their lives in and out of the office to get more done. People boast of being busy and exhausted and eagerly consume advice from the business-entertainment complex on how to “de-fry your burnt brain”, or engineer a more productive day by assenting to the horror of breakfast meetings.

A corporate guru will even teach you how to become a “master of extreme productivity”. (In these extreme times, extremity is always good; unless, perhaps, you are an extremist.) No one boasts of being unproductive, still less counterproductive. Into the iron gate of modernity have been wrought the words: “Productivity will set you free.”

Strategies to enhance the “productivity” of workers have been formalised since at least Frederick Winslow Taylor’s early-20th-century dream of “scientific management” through methods such as “time studies”. The latest wheeze is the Big Data field of “workforce science”, in which everything – patterns of emails, the length of telephone calls – may be measured and consigned to a comparative database to create a perfect management panopticon. It is tempting to suspect that the ambition thus to increase “worker productivity” is aimed at getting more work out of each employee for the same (or less) money.

To the long-evolving demands of productivity at work we must now add the burden of productivity everywhere else. As the Nike T-shirt’s slogan implies, even when we’re not at work, we must be doing work. There is certainly a great deal of Taylorised labour available on the internet: “sharing”, “liking” and updating profiles constitutes click-farm piecework for which we eagerly volunteer, to the profit of the large “social” media corporations.

Even for those who are not constantly bombarded with work demands outside the office, the ubiquity of information processing presents a temptation to be on call at all times. Our world has become an ambient factory from which there is no visible exit and there exists an industry of self-help technologies devoted to teaching us how to be happy workers. “Is information overload killing your productivity?” asks a representative business story. The answer is to adopt yet more productivity strategies. The labour of work is thus extended to encompass the labour of learning how to keep up with your work (specialised techniques, such as “Inbox Zero”, to manage the email tsunami) as well as the labour of recovering from your work in approved ways. 

“Exercise,” advises one business magazine feature. “It makes you more productive.” In a perfect world, you would be getting exercise while you work – standing desks and even treadmill desks are sold as magical productivity enhancers. In the future, we’ll enjoy the happy possibility of carrying on with our work while out running, thanks to “wearable computing” devices such as Google Glass, which has the potential to become the corporate equivalent of the electronic tags that record the movements of criminals.

In the vanguard of “productivity” literature and apps was David Allen’s “Getting Things Done” (GTD) system, according to which you can become “a wizard of productivity” by organising your life into folders and to-do lists. The GTD movement quickly spread outside the confines of formal work and became a way to navigate the whole of existence: hence the popularity of websites such as Lifehacker that offer nerdy tips on rendering the messy business of everyday life more amenable to algorithmic improvement. If you can discover how best to organise the cables of your electronic equipment or “clean stubborn stains off your hands with shaving cream”, that, too, adds to your “productivity” – assuming that you will spend the time that is notionally saved on a sanctioned “task”, rather than flopping down exhausted on the sofa and waking groggily seven hours later from what you were sternly advised should have been a power nap of exactly 20 minutes. If you need such “downtime”, it must be rigorously scheduled.

The paradox of the autodidactic productivity industry of GTD, Lifehacker and the endless reviews of obscure mind-mapping or task-management apps is that it is all too easy to spend one’s time researching how to acquire the perfect set of productivity tools and strategies without ever actually settling down to do something. In this way, the obsessive dream of productivity becomes a perfectly effective defence against its own realisation. 

As Samuel Johnson once wrote: “Some are always in a state of preparation, occupied in previous measures, forming plans, accumulating materials and providing for the main affair. These are certainly under the secret power of idleness. Nothing is to be expected from the workman whose tools are for ever to be sought.”

Nor is there any downward cut-off point for “our current obsession with busyness”, as one researcher, Andrew Smart, describes it in his intriguing book Autopilot: the Art and Science of Doing Nothing. Smart observes, appalled, a genre of literary aids for inculcating the discipline of “time management” in children. (Time is not amenable to management: it just keeps passing, whatever you do.) Not allowing children to zone out and do nothing, Smart argues, is probably harming their development. But buckling children into the straitjacket of time management from an early age might seem a sensible way to ensure an agreeably docile new generation of workers.

If so, the idea has history. In 1770, an anonymous essay on trade and commerce was published in London. (It is now usually attributed to a “J Cunningham”.) In it, the author proposes that orphans, “bastards and other accidental poor children” ought to be made to labour in workhouses for 12 hours a day from the age of four. (He allows that two of these hours might be devoted to learning to read.) This will have the happy effect, the author argues, of creating a new generation “trained up to constant labour” and thus increasing the general industry of the population, so that future labourers will be happy to earn in six days a week what they currently make in four or five.

Cunningham’s proposed workhouses are also conceived to house (or, rather, imprison) adult vagrants and other so-far-incorrigible poor people. Existing workhouses are too luxurious, he complains: “Such house must be made an house of terror”. Only terror will make the inmates properly productive; the solution is “the placing of the poor in such a situation that loss of liberty, hunger, thirst . . . should be the immediate consequences of idleness and debauchery”.

Fear has not ceased to be a useful spur to productivity. A recent article in the London newspaper Metro reported that research had shown that “dedicated Britons” were “less likely to pull a sickie” than workers in Germany and France. The researcher claimed: “Strong employment protection and generous sick pay was empirically found to contribute to increased staff sickness in Germany and France.” It could indeed be that Europeans are slackers and Brits are peculiarly “dedicated”. Or it could be that Britain’s more “flexible” labour market terrifies citizens into struggling into work even when they are ill.

The reason sickness is undesirable is not that it causes distress or discomfort but that it results in what is often called “lost productivity”. This is a sinister and absurd notion, predicated on the greedy fallacy of counting chickens before they have hatched. “Workplace absence through sickness was reported to cost British business £32bn a year,” the researcher claimed in Metro: a normal way of phrasing things today, but one with curious implications. The idea seems to be that business already has that money even though it hasn’t earned it yet and employees who fail to maintain “productivity” as a result of sickness or other reasons are, in effect, stealing this as yet entirely notional sum from their employers.

It took a long time before the adjective “productive” – which once simply meant “generative”, as applied to land or ideas – acquired its specific economic sense, in the late 18th century, of relating to the production of goods or commodities. (The noun form is first recorded by the Oxford English Dictionary in an essay by Samuel Taylor Coleridge, in which he writes of the “productivity” of a growing plant.) To call a person “productive” only in relation to a measured quantity of physical outputs is another way that business rhetoric has long sought to dehumanise workers.

One way to counter this has been to attempt to recuperate the supposed vice of idleness – to hymn napping, daydreaming and sheer zoning out. Samuel Johnson is sometimes counted among the champions of faffing, perhaps simply because of the name of his essay series The Idler. Yet he looked sternly on occupying oneself with “trifles”, as he describes his dilettante friend Sober doing in one of those columns. The guiding principle of The Idler, as Johnson described it in the farewell essay, was to encourage readers “to view every incident with seriousness, and improve it by meditation”. So meditating seriously is not idleness. 

On the other hand, Johnson noted sagely in an earlier entry, one can be idle while appearing anything but: “There is no kind of idleness, by which we are so easily seduced, as that which dignifies itself by the appearance of business and by making the loiterer imagine that he has something to do which must not be neglected, keeps him in perpetual agitation and hurries him rapidly from place to place . . . To do nothing every man is ashamed and to do much almost every man is unwilling or afraid. Innumerable expedients have therefore been invented to produce motion without labour, and employment without solicitude.” Does this not perfectly describe our modern saturation in fatuous busywork? 

David Graeber, the anthropologist and author of Debt: the First 5,000 Years, would also probably approve of it as a characterisation of what he calls “bullshit jobs”. In a recent essay for Strike! magazine, Graeber remarks on “the creation of whole new industries like financial services or telemarketing, or the unprecedented expansion of sectors like corporate law, academic and health administration, human resources, and public relations”, all of which he describes as “bullshit” and “pointless”. Their activity is to be contrasted with that of what Graeber calls “real, productive workers”. 

It is telling that even in such a bracingly critical analysis, the signal virtue of “productivity” is left standing, though it is not completely clear what it means for the people in the “real” jobs that Graeber admires. It is true that service industries are not “productive” in the sense that their labour results in no great amount of physical objects, but then what exactly is it for the “Tube workers” Graeber rightly defends to be “productive”, unless that is shorthand for saying, weirdly, that they “produce” physical displacements of people? And to use “productive” as a positive epithet for another class of workers he admires, teachers, risks acquiescing rhetorically in the commercialisation of learning. Teaching as production is, etymologically and otherwise, the opposite of teaching as education. 

Idleness in the sense of just not working at all, rather than working at a bullshit activity, was championed by the dissident Marxist Paul Lafargue, writer of the 1883 manifesto The Right to Be Lazy. This amusing denunciation of what Lafargue calls “the furious passion for work” in capitalist civilisation, which is “the cause of all intellectual degeneracy”, rages against its own era of “overproduction” and consequent recurring “industrial crises”. The proletariat, Lafargue cries, “must proclaim the Rights of Laziness, a thousand times more noble and more sacred than the anaemic Rights of Man concocted by the metaphysical lawyers of the bourgeois revolution. It must accustom itself to working but three hours a day, reserving the rest of the day and night for leisure and feasting.”

That sounds nice but why exactly should we do it? It is because: “To force the capitalists to improve their machines of wood and iron, it is necessary to raise wages and diminish the working hours of the machines of flesh and blood.” Workers should refuse to work so that new gadgets get invented that will do the work for them. Similarly, Bertrand Russell, in his 1932 essay “In Praise of Idleness”, argued that technology should make existing work patterns redundant: “Modern methods of production have given us the possibility of ease and security for all,” he wrote. Somewhere, he is still waiting for that possibility to be realised.

One modern anti-work crusader who cleanly abandons any notion of productivity is Federico Campagna, whose recent book The Last Night is an exercise in poetic dissidence. In seeking their existential justification in work, Campagna writes, “Humans elected their very submission to the throne as their new God.” Those who resist the siren promises of labour are therefore the true “radical atheists” and should be glad also to call themselves “squanderers”, “egoists”, “disrespectful opportunists”, “parasites” and most of all “adventurers”. Campagna explains: “Adventurers, like all humans, live within a dream, in which they try to be the lucid dreamers.” Something like dreaming or idling, it turns out, is also now sanctioned by another arena whose popular rhetoric often lays claim to a kind of religious authority: that of neuroscience. 

According to Andrew Smart’s book Autopilot, recent (but still controversial) brain research recommends that we stare vacantly into space more often. “Neuroscientific evidence argues that your brain needs to rest, right now,” Smart declares on the first page. (It took me a long time to finish the book, because I kept putting it down to have a break.)

Smart’s evidence suggests the existence of a “default network”, in which the brain gets busy talking to itself in the absence of an external task to focus on. To allow this “default network” to do its thing by regularly loafing around rather than switching focus all day between futile bits of work, Smart argues, is essential for the brain’s health. “For certain things the brain likes to do (for example, coming up with creative ‘outside of the box’ solutions),” he writes, “you may need to be doing very little.”

The poet Rainer Maria Rilke, Smart observes, was not very “productive” in terms of the quantity of poems he produced in an average year. Yet while pootling away his time, he occasionally experienced a torrent of inspiration, and what he did produce were works of greatness.

This reminds us that it is not necessary to abandon the notion of “productivity” altogether. We all like to feel that we have done something useful, interesting or fun with our day, even (or especially) if it has not been part of our official work, and we might harmlessly express such satisfaction by saying that our day has been productive.

This ordinary usage encodes an ordinary wisdom: that mere quantity of activity – as implied by the get-more-done mania of the productivity cult – has nothing to do with its value. Economics does not know how to value Rainer Maria Rilke over a prolific poetaster in receipt of an official laureateship. (One can be confident that, while mooching around European castles and writing nothing for years on end, Rilke would never have worn a T-shirt that announced: “I’m doing work”.) And his life sounds like more fun than one recent Lifehacker article, which eagerly explained how to organise your baseball cap collection by hanging the headwear on shower-curtain hooks arrayed along a rail.

Perhaps I shouldn’t mock. All that time saved every morning by knowing the exact location of the baseball cap you want to wear will surely add up, earning you hours more freedom to hunt and hoard ever more productivity tips, until you are a purely theoretical master at doing nothing of value in the most efficient way imaginable. 

Steven Poole’s “Who Touched Base in My Thought Shower? A Treasury of Unbearable Office Jargon” is published by Sceptre (£9.99)


We need to talk about the online radicalisation of young, white women

Alt-right women are less visible than their tiki torch-carrying male counterparts – but they still exist.

In November 2016, the writer and TED speaker Siyanda Mohutsiwa tweeted a ground-breaking observation. “When we talk about online radicalisation we always talk about Muslims. But the radicalisation of white men online is at astronomical levels,” she wrote, inspiring a series of mainstream articles on the topic (“We need to talk about the online radicalisation of young, white men,” wrote Abi Wilkinson in The Guardian). It is now commonly accepted that online radicalisation is not limited to the work of Isis, which uses social media to spread propaganda and recruit new members. Young, white men frequently form alt-right and neo-Nazi beliefs online.

But this narrative, too, is missing something. When it comes to online radicalisation into extreme right-wing, white supremacist, or racist views, women are far from immune.

“It’s a really slow process to be brainwashed really,” says Alexandra*, a 22-year-old former racist who adopted extreme views during the United States presidential election of 2016. In particular, she believed white people to be more intelligent than people of colour. “It definitely felt like being indoctrinated into a cult.”

Alexandra was “indoctrinated” on 4Chan, the imageboard site where openly racist views flourish, especially on boards such as /pol/. It is a common misconception that 4Chan is only used by loser, basement-dwelling men. In actuality, 4Chan’s official figures acknowledge that 30 percent of its users are female. More women may frequent 4Chan and /pol/ than it first appears, as many do not announce their gender on the site because of its “Tits or GTFO” culture. Even when women do reveal themselves, they are often believed to be men who are lying for attention.

“There are actually a lot of females on 4chan, they just don't really say. Most of the time it just isn't relevant,” says Alexandra. Her experiences on the site are similar to male users who are radicalised by /pol/’s far-right rhetoric. “They sowed the seeds of doubt with memes,” she laughs apprehensively. “Dumb memes and stuff and jokes…

“[Then] I was shown really bullshit studies that stated that some races were inferior to others like… I know now that that’s bogus science, it was bad statistics, but I never bothered to actually look into the truth myself, I just believed what was told to me.”

To be clear, online alt-right radicalisation still skews majority male (and men make up most of the extreme far-right, though women have always played a role in white supremacist movements). The alt-right frequently recruits from misogynistic forums where they prey on sexually-frustrated males and feed them increasingly extreme beliefs. But Alexandra’s story reveals that more women are part of radical right-wing online spaces than might first be apparent.

“You’d think that it would never happen to you, that you would never hold such horrible views," says Alexandra. "But it just happened really slowly and I didn't even notice it until too late."

***

We are less inclined to talk about radical alt-right and neo-Nazi women because they are less inclined to carry out radical acts. Photographs that emerged from the white nationalist rally in Charlottesville this weekend revealed that it was mostly polo shirt-wearing young, white men picking up tiki torches, shouting racial slurs, and fighting with counter-protestors. The white supremacist and alt-right terror attacks of the last year have also been committed by men, not women. But just because women aren’t as visible doesn’t mean they are not culpable.  

“Even when people are alt-right or sympathisers with Isis, it’s a tiny percentage of people who are willing or eager to die for those reasons and those people typically have significant personal problems and mental health issues, or suicidal motives,” explains Adam Lankford, author of The Myth of Martyrdom: What Really Drives Suicide Bombers, Rampage Shooters, and Other Self-Destructive Killers.

“Both men and women can play a huge role in terms of shaping the radicalised rhetoric that then influences those rare people who commit a crime.”

Prominent alt-right women often publicly admit that their role is more behind-the-scenes. Ayla Stewart runs the blog Wife With a Purpose, where she writes about “white culture” and traditional values. She was scheduled to speak at the Charlottesville “Unite the Right” rally before dropping out due to safety concerns. In a blog post entitled “#Charlottesville May Have Redefined Women’s Roles in the Alt Right”, she writes:

“I’ve decided that the growth of the movement has necessitated that I pick and choose my involvement as a woman more carefully and that I’m more mindful to chose [sic] women’s roles only.”

These roles include public speaking (only when her husband is present), gaining medical skills, and “listening to our men” in order to provide moral support. Stewart declined to be interviewed for this piece.

It is clear, therefore, that alt-right women do not have to carry out violence to be radical or radicalised. In some cases, they are complicit in the violence that does occur. Lankford gives the example of the Camp Chapman attack, committed by a male Jordanian suicide bomber against a CIA base in Afghanistan.

“What the research suggests in that case was the guy who ultimately committed the suicide bombing may have been less radical than his wife,” he explains. “His wife was actually pushing him to be more radical and shaming him for his lack of courage.” 

***

Just because women are less likely to be violent doesn’t mean they are incapable of it.

Angela King is a former neo-Nazi who went to prison for her part in the armed robbery and assault of a Jewish shop owner. She now runs Life After Hate, a non-profit that aims to help former right-wing extremists. While part of a skinhead gang, it was her job to recruit other women to the cause.

“I was well known for the violence I was willing to inflict on others… often times the men would come up to me and say we don’t want to physically hurt a woman so can you take care of this,” King explains. “When I brought other women in I looked for the same qualities in them that I thought I had in myself.”

King's 1999 mugshot


These traits, King explains, were anger and a previous history of violence. She was 15 when she became involved with neo-Nazis, and explains that struggles with her sexuality and bullying had made her into a violent teenager.

“I was bullied verbally for years. I didn't fit in, I was socially awkward,” she says. One incident in particular stands out. Aged 12, King was physically bullied for the first time.

“I was humiliated in a way that even today I still am humiliated by this experience,” she says. One day, King made the mistake of sitting at a desk that “belonged” to a bully. “She started a fight with me in front of the entire class… I’ve always struggled with weight so I was a little bit pudgy, I had my little training bra on, and during the fight she ripped my shirt open in front of the entire class.

“At that age, having absolutely no self-confidence, I made the decision that if I became the bully, and took her place, I could never be humiliated like that again.”

Angela King, aged 18

King’s story is important because when it comes to online radicalisation, the cliché is that bullied, “loser” men are drawn to these alt-right and neo-Nazi communities. The most prominent women in the far-right (such as Stewart, and Lauren Southern, a YouTuber) are traditionally attractive and successful, with long blonde hair and flashing smiles. In actuality, women who are drawn to the movement online might be struggling, like King, to be socially accepted. This in no way justifies or excuses extreme behaviour, but can go some way to explaining how and why certain young women are radicalised.

“At the age of 15 I had been bullied, raped. I had started down a negative path you know, experimenting with drugs, drinking, theft. And I was dealing with what I would call an acute identity crisis and essentially I was a very, very angry young woman who was socially awkward who did not feel like I had a place in the world, that I fit in anywhere. And I had no self-confidence or self-esteem. I hated everything about myself.”

King explains that Life After Hate’s research reveals that there are often non-ideological based precursors that lead people to far right groups. “Individuals don’t go to hate groups because they already hate everyone, they go seeking something. They go to fill some type of void in their lives that they’re not getting.”

None of this, of course, excuses the actions and beliefs of far-right extremists, but it does go some way to explaining how “normal” young people can be radicalised online. I ask Alexandra, the former 4Chan racist, if anything else was going on in her life when she was drawn towards extreme beliefs.

“Yes, I was lonely,” she admits.                                                       

***

That lonely men and women can both be radicalised in the insidious corners of the internet shouldn’t be surprising. For years, Isis has recruited vulnerable young women online, with children as young as 15 becoming "jihadi brides". We have now acknowledged that the cliché of virginal, spotty men being driven to far-right hate excludes the college-educated, clean-cut white men who made up much of the Unite the Right rally last weekend. We now must realise that right-wing women, too, are radicalised online, and they, too, are culpable for radical acts.  

It is often assumed that extremist women are radicalised by their husbands or fathers, an assumption aided by statements from far-right women themselves. The YouTuber Southern, for example, once said:

“Anytime they [the left] talk about the alt-right, they make it sound like it’s just about a bunch of guys in basements. They don’t mention that these guys have wives – supportive wives, who go to these meet-ups and these conferences – who are there – so I think it’s great for right-wing women to show themselves. We are here. You’re wrong.”

Although there is truth in this statement, women don’t have to have far-right husbands, brothers, or fathers in order to be drawn to white supremacist or alt-right movements. Although it doesn’t seem the alt-right are actively preying on young white women the same way they prey on young white men, many women are involved in online spaces that we wrongly assume are male-only. There are other spaces, such as Reddit's r/Hawtschwitz, where neo-Nazi women upload nude selfies, carving a specific space for themselves in the online far-right.

When we speak of women radicalised by husbands and fathers, we misallocate blame. Alexandra deeply regrets her choices, but she accepts they were her own. “I’m not going to deny that what I did was bad because I have to take responsibility for my actions,” she says.

Alexandra, who was “historically left-wing”, was first drawn to 4Chan when she became frustrated with the “self-righteousness” of the website Tumblr, favoured by liberal teens. Although she frequented the site's board for talking about anime, /a/, not /pol/, she found neo-Nazi and white supremacist beliefs were spread there too. 

“I was just like really fed up with the far left,” she says. “There was a lot of stuff I didn't like, like blaming males for everything.” From this, Alexandra became anti-feminist, and it was through anti-feminism that she was incrementally exposed to anti-Semitic and racist beliefs. This parallels the story of many radicalised males on 4Chan, who turn to the site out of hatred of feminists or indeed, all women.

 “What I was doing was racist, like I – deep down I didn't really fully believe it in my heart, but the seeds of doubt were sowed again and it was a way to fit in. Like, if you don't regurgitate their opinions exactly they’ll just bully you and run you off.”

King’s life changed in prison, where Jamaican inmates befriended her and she was forced to reassess her worldview. Alexandra now considers herself “basically” free from prejudices, but says trying to rid herself of extreme beliefs is like “detoxing from drugs”. She began questioning 4Chan when she first realised that they genuinely wanted Donald Trump to become president. “I thought that supporting Trump was just a dumb meme on the internet,” she says.

Nowadays, King dedicates her life to helping young people escape from far-right extremism. "Those of us who were involved a few decades ago we did not have this type of technology, cell phones were not the slim white phones we have today, they were giant boxes," she says. "With the younger individuals who contact us who grew up with this technology, we're definitely seeing people who initially stumbled across the violent far-right online and the same holds for men and women.

"Instead of having to be out in public in a giant rally or Klan meeting, individuals find hate online."

* Name has been changed

Amelia Tait is a technology and digital culture writer at the New Statesman.