7 September 2018 (updated 17 January 2024, 7:05am)

If democracy fails, Artificial Intelligence may help avoid a re-run of the 1930s

In a world where complexity is normal, AI could provide a more convincing narrative about how the world works than the stories that politicians tell us.

By Alex Bigham

There’s a tendency to think the era of fake news and cognitive dissonance was invented by Donald Trump, but the urge to over-simplify and manipulate a confusing and interconnected web of seemingly random events is nothing new.

In his ambitious two-and-a-half-hour documentary HyperNormalisation, the filmmaker Adam Curtis considers how the discourse that informs our political narrative – as he puts it, “the stories that politicians tell us about the world” – increasingly fails to resonate with the public and struggles to match the complexities of modernity. Curtis points to the way in which Libya under Colonel Gaddafi shifted from global pariah in the aftermath of the Pan Am Flight 103 and West Berlin nightclub bombings (despite many European intelligence agencies suggesting Syria was responsible), to responsible global actor after the post-Iraq weapons of mass destruction deal, to rogue state again in the fallout of the Arab Spring.

Former US national security adviser Henry Kissinger talked about “constructive ambiguity” as a negotiating strategy in the Middle East conflict, and in many ways cognitive dissonance is hard-wired into our evolutionary mindset. Filtering out certain evidence isn’t just a way of reinforcing our own belief systems; it also enables us to make decisions more quickly – clearly a necessity in a hunter-gatherer society where survival depends on instinctive reactions.

Social media companies are rightly coming under increasing scrutiny for their tendency to reinforce blinkered belief systems, but the echo chamber is not new. In the early development of AI, “Eliza” was built as an elaborate in-joke by Professor Joseph Weizenbaum at MIT, who attempted to mimic the tendency of some psychotherapists to simply repeat patients’ statements back to them as questions. The response to Eliza shocked Weizenbaum: she proved to be engaging and popular with users. Judgment-free mirroring creates neural reward patterns in much the same way as likes on Facebook. The approach is still used today: online cognitive behavioural therapy (CBT) programmes avoid the unpredictability of the patient-therapist relationship, while virtual reality systems that recreate the battlefield environment can help US veterans recover from PTSD.
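To make that mechanism concrete, here is a minimal sketch of Eliza-style mirroring in Python. It is illustrative only – Weizenbaum’s original used a much richer script of ranked keyword rules – and the patterns and example inputs below are invented for the purpose.

```python
import re

# Minimal Eliza-style mirroring: swap first- and second-person words,
# then hand the statement back as a judgement-free question.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "mine": "yours"}

def reflect(fragment: str) -> str:
    """Flip pronouns so the user's words can be echoed back at them."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(statement: str) -> str:
    """Turn a statement into a question, in the manner of a Rogerian therapist."""
    statement = statement.strip().rstrip(".!?")
    match = re.match(r"(?i)i (?:feel|am) (.+)", statement)
    if match:
        return f"Why do you feel {reflect(match.group(1))}?"
    return f"Why do you say that {reflect(statement)}?"

if __name__ == "__main__":
    print(respond("I am worried about my job"))
    # -> Why do you feel worried about your job?
    print(respond("My boss ignores me"))
    # -> Why do you say that your boss ignores you?
```

Even this toy version shows why the effect unsettled Weizenbaum: the program understands nothing, yet the reflected question feels attentive.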

One of the challenges of virtual systems is the absence of effective values. The idea of cyberspace as an ideological desert was brought to the fore by the Facebook-inspired revolutions in Egypt and across the Middle East. While social media could bring people together behind a social cause to overthrow the system as it stood, the Google revolutionaries couldn’t agree on what came next. Into the void stepped the Muslim Brotherhood, who had a clear vision of how to organise society.


But what happens next? The story of how people’s fears have been exploited by misinformation campaigns is well documented. Some argue, as Tobias Stone did in a recent essay, that we are doomed to repeat the mistakes of the 1930s, and that a misstep in a fragile balance of power – in, say, the Baltics – could trigger all-out war. This might seem implausible, and fall foul of Godwin’s notorious law, but Stone argues that cataclysmic wars and disasters have rarely been predicted, because most people have only a 50-100 year historical perspective. They don’t see that it’s happening again.

Instinctively, comparisons with the 1930s feel insufficient – today’s society is radically different – and in a lecture, Cambridge’s David Runciman argues that our understanding can be limited by historical comparisons. If democracy fails, it will be in ways that are new and surprising. It is not that institutions will collapse or political violence will re-emerge, but rather that democracy will be hollowed out in ways we may not properly appreciate until the form no longer matches the content. Political violence is generally a young person’s game, and our ageing society doesn’t have the same triggers and appetite. He points to the dysfunctional democracy of contemporary Japan as one example of such failure, demonstrating a disturbing divergence between democracy and prosperity.

As Ulrich Beck pointed out ten years ago, “There are a host of problems that are clearly beyond the power of the old order of nation-states to cope with. The answer to global problems that are gathering ominously all around and that refuse to yield to nation-state solutions is for politics to take a quantum leap from the nation-state system to the cosmopolitan state system.”

There are two avenues to explore. The first is technocratic – perhaps we should listen to the experts. Many major economic decisions are now accepted as technocratic ones – the independence of central banks, for instance. Quangos, investigative commissions and expert regulators are our answer to agency failure. It might not be sexy, but credible governance by subject-matter experts who understand the complexities of the modern economy helps keep institutions honest.

The second avenue – technology – currently bears some responsibility for the content it disseminates and its impact on societies. Systems that we struggle to effectively understand or bring under political control threaten to undermine our democratic structures.

It isn’t just about getting Facebook or Twitter to accept that they are providers, not just platforms, but about appreciating the social power of companies with billions of users and more money than the US government. It’s understandable to worry about democracy in the shadow of trillion-dollar monoliths, but there is a public good here – as Runciman argues, having a free support network that can share ideas and provide assistance in crises can help communities struggling with the kind of game-changing levels of unemployment and social unease that countries like Greece have suffered in recent years.

Some are banking on the robots to have the answers, not least the Japanese, who urgently need to address the challenges of an ageing society and are unwilling to increase immigration. The power of technology to help solve public policy challenges is exciting – wearable technology can provide real-time support to someone experiencing a mental health crisis when access to human therapists is rationed by funding pressures. At the same time, AI can build models to test the effectiveness of public policy solutions and foresee unintended consequences without the need for pilots, which can be costly and delay support. It’s cost-effective too: analysis from Deloitte suggested that AI could save the US federal government as much as $41bn from its annual budget.

There is the potential for a radical change in policy development and evaluation too. One of the most important scientific breakthroughs of the 20th century was the randomised controlled trial (RCT). Medicine has gone from the fallacies of bloodletting, black bile and leeches to life-saving pharmaceutical drugs and medical technologies. RCTs have also been used politically: in the 1990s, Clinton’s workfare initiative was rolled out federally after individual states were given licence to test employment policies, trials which found that mandatory work requirements were the most effective tool for tackling long-term joblessness.

Other areas of public policy have been slow to keep pace, not least education, where welcome initiatives such as the testing regime of the Education Endowment Foundation have often been overlooked in false-choice debates like “rigour versus creativity”. Imagine a world in which interconnected policy challenges such as climate change or drugs policy could be broken down into AI-driven experiments that would examine the old dictum of doing “what works”. There is a strong moral argument that RCTs should be the norm, even if by their nature they deny the control group access to a much-needed improvement. After all, if the planned improvement turns out to be counter-productive, an RCT means only half the participants in the experiment lose out, rather than everyone afterwards whose lives would be blighted by public policy designed on gut instinct. And if AI were able to predict human behaviour, it could reduce, if not eliminate, the need for control groups altogether.
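A toy simulation makes that logic concrete. The sketch below, in Python with invented data and an assumed true effect size, first estimates a policy’s effect the classic RCT way, by comparing randomly assigned treatment and control arms, and then illustrates the shortcut gestured at above: fitting a predictive model on the control arm and using its predicted counterfactuals in place of a real control group.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical trial: 2,000 people, half randomly assigned to a new policy.
# "baseline" stands in for a pre-intervention measure (e.g. prior earnings).
n = 2000
baseline = rng.normal(20.0, 5.0, size=n)
treated = np.zeros(n, dtype=bool)
treated[rng.choice(n, n // 2, replace=False)] = True

true_effect = 1.5  # the effect the experiment should recover
outcome = baseline + rng.normal(0.0, 2.0, size=n) + true_effect * treated

# 1. Classic RCT estimate: difference in mean outcomes between the two arms.
rct_estimate = outcome[treated].mean() - outcome[~treated].mean()

# 2. "Model as control group": learn to predict the untreated outcome from the
#    control arm only, then compare treated people with their predicted
#    counterfactuals instead of with a held-out control group.
slope, intercept = np.polyfit(baseline[~treated], outcome[~treated], 1)
predicted_untreated = slope * baseline[treated] + intercept
model_estimate = (outcome[treated] - predicted_untreated).mean()

print(f"difference-in-means estimate: {rct_estimate:.2f}")
print(f"model-based estimate:         {model_estimate:.2f}")
# Both should land near the true effect of 1.5 on this synthetic data.
```

The catch, of course, is that the second estimate is only as good as the model’s ability to predict what would have happened without the policy – which is precisely the behaviour-prediction problem described above.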

In a world where complexity is normal, and most pundits struggle either to explain or to predict behaviours, AI could provide a more convincing narrative about how the world works than the stories that politicians tell us, by predicting how society will respond to policy changes.

Of course unpredictability can’t be eliminated, but AI can also provide important insights into the political process – machine-learning algorithms can predict which bills are most likely to pass in the US Congress, for example. The Cambridge Analytica story is well known, and such risks seem inherent in a system we don’t fully understand and in which many voters lack the critical tools to separate fact from propaganda. But the same technology could be harnessed for good: the social media giants could invest in bots that identify misinformation campaigns and provide red-flag warnings to users.
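As a sketch of what such bill-passage prediction involves, the example below trains a simple logistic-regression classifier on invented features and synthetic labels; it is an assumption-laden illustration, not a reference to any real system, which would draw on bill text, sponsorship records, committee assignments and historical voting data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_bills = 500

# Invented features for each bill: bipartisan sponsorship (0/1), number of
# co-sponsors, and whether the sponsor sits in the majority party (0/1).
features = np.column_stack([
    rng.integers(0, 2, n_bills),
    rng.integers(1, 60, n_bills),
    rng.integers(0, 2, n_bills),
])

# Synthetic "passed" labels loosely tied to the features, purely so the demo runs.
logits = 1.2 * features[:, 0] + 0.05 * features[:, 1] + 0.8 * features[:, 2] - 3.0
passed = rng.random(n_bills) < 1.0 / (1.0 + np.exp(-logits))

# Fit a simple classifier and score a hypothetical new bill.
model = LogisticRegression().fit(features, passed)
new_bill = np.array([[1, 25, 1]])  # bipartisan, 25 co-sponsors, majority sponsor
print(f"estimated passage probability: {model.predict_proba(new_bill)[0, 1]:.2f}")
```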

How will society respond? In the industrial revolution, the abolition of child labour, the poor laws and the growth of trade unions helped families cope with the pressures of mechanised work. Clearly, new ideas and institutions will need to emerge to recognise the ways in which workers contribute to economic growth if their jobs are automated. In an era of smart policies, Universal Basic Income feels like a blunt instrument that wouldn’t address the potential breakdown of the social contract if large swathes of the population are effectively unemployed. Growing inequality could be matched by a creeping authoritarianism, bolstered by technology that is increasingly able to peer into the deepest recesses of our lives.

The 1930s analogy points to an endgame in which a major shock to the world order – one on a similar scale to the Second World War – is needed to rebuild the kind of political consensus that created the welfare state. By its nature that feels unlikely – such memories are only real for the very oldest in our society – but our political myopia, limited to the last 50 or so years, may be the thing that prevents us seeing the gathering storm.

Perhaps, as in the opening scene of Terminator 2, our time has come. James Lovelock, the father of modern climate science, who turns 100 next year, predicts that it may not be too long before new forms of life emerge, based on AI, which will take over the planet and run it instead of humans. Perhaps they’ll do a better job.

Alex Bigham is a communications consultant and generally an optimist about the future.
