The DCMS report on fake news is good and sensible and everyone will ignore it

Even if Labour wasn’t breaking apart, the government just doesn’t have time.

Sometimes you just can’t catch a break. You launch the final report of one of parliament’s biggest and highest-profile select committee inquiries, and pack it with blistering invective against Facebook, its CEO, and some of the key figures involved in the campaigns to leave the European Union. You time it for the Monday of the parliamentary recess, so you’ll have a clear run at the media for your bombshell recommendations.

And then seven Labour MPs – after years of procrastinating – pick that morning to quit, bumping the report down the bulletins and risking consigning months of investigation to the inside pages.

That typifies the rollercoaster run of luck that has faced one of parliament’s most sprawling investigations. The DCMS Committee’s investigation was launched in January 2017 as an inquiry into “fake news”, only to falter as it narrowly set its remit to investigate only entirely fabricated online news, which it then discovered to be primarily a US problem.

Allegations around misinformation, microtargeting, and data misuse by Cambridge Analytica and the various Brexit campaigns, then, gave the committee the opportunity for a sensible pivot to a much more real problem – even if it did put it on the UK’s most controversial political turf.

The final report, though, will probably ultimately satisfy few of the people who read it, and risks making very little real change. This is a pity as much of what it recommends is good and sensible.

The report has many tough words for Facebook, which it dubs “digital gangsters” and accuses of wilfully skirting UK data and competition laws. But the committee itself has no power to sanction the company: criticism in the report is as far as it can go.

Similarly, the report devotes time and space to criticise Cambridge Analytica for harvesting Facebook data – but like other inquiries, finds no firm evidence the firm did any Brexit-related work. And it criticises Arron Banks and Leave.EU, but is left only to hope that other investigations are carried out to see if there is malpractice there.

What is within the committee’s remit, then, is recommending changes to legislation to manage the new world of social media, microtargeting, and Russian electoral interference. Here, there are two ways of looking at things; which you take will largely depend on your world view.

Let’s take the bright side first. The DCMS committee report notes that the UK’s current electoral law is “not fit for purpose” for the digital age, and calls for a major overhaul requiring transparency on all political communications, legal definitions of online campaigning, and more. This is absolutely spot-on and reflects a real and pressing problem that will only worsen with each digital-era election.

The other core recommendation is for a new regulator to govern the social media platforms – treating them as something in between a neutral “platform” with no control over content, and a traditional publisher – which would enforce a statutory “code of ethics”. The regulator would have the power to launch investigations into any alleged breaches of this code and to impose fines. Given almost everyone agrees something needs to be done to rein in the social giants, this seems to have potential.

Where both plans first fall down, though, is in the detail: UK electoral law was already mind-numbingly complex and largely unenforceable even before digital campaigning, with hundreds of arcane tricks to allocate spending to different budgets and stay under limits. By merely recommending change – including vaguely saying there “needs to be an acknowledgement of the role and power of unpaid campaigns” – the report points to the problems but offers little in the way of concrete fixes.

This is even more obvious when it comes to the social media “code of ethics”: it’s easy to say there should be one, but fiendishly difficult to agree what such a legally-binding code should actually say. How should it define the limits of speech? What else should go in it? How does it avoid accidentally handing even more power over what we can and can’t say to giant overseas corporations?

The committee’s bad luck with timing reflects its larger bad luck: it has examined, sensibly, major issues affecting the online world and our democratic process, and come up with some reasonable beginnings – which in saner political times would then lead to more work by more politicians to get onto the more difficult bits.

But these are not sane political times. Theresa May’s government exists in name only, unable to pass any legislation on Brexit, and unable even to think about any other even slightly contentious issue aside from it. That will not end on 29 March: whatever happens, this issue will continue to dominate politics and parliamentary time, and the government will remain too weak to take action.

The uncomfortable truth is that tackling online misinformation and protecting our democracy, after a few minutes of airtime today, will join tackling homelessness, fixing social care, rebuilding the NHS, reforming university funding, and dozens of other important issues on the back burner, with no obvious end in sight.

James Ball is the Global Editor of The Bureau of Investigative Journalism. He tweets @jamesrbuk.