29 January 2022

In trying to tackle fake news, Facebook is cracking down on real science

The platform’s fact checkers found no inaccuracies in a recent BMJ investigation – but limited its reach anyway. Should they be playing moral police?

By Rebecca Coombes and Madlen Davies

On 3 November, Howard Kaplan, a retired dentist from Israel, posted a British Medical Journal (BMJ) investigation to a private Facebook group. The article reported poor practices occurring at Ventavia, a research company contracted to run three trial sites for the Pfizer Covid-19 vaccine.

The article brought record traffic to bmj.com, and was widely shared on Twitter. But a week later, Kaplan woke up to a message from Facebook. “The Facebook Thought Police has issued me a dire warning,” he posted. “Facebook’s ‘independent fact-checker’ doesn’t like the wording of the article by the BMJ. And if I don’t delete my post, they are threatening to make my posts less visible… If it seems like I’ve disappeared for a while, you’ll know why.” Other BMJ readers also reported problems sharing the story.

There are now around 300 fact-checking organisations across the world, many of them fledgling companies with small budgets. Facebook in particular bestows a great deal of authority upon its 80 third-party fact checkers. But our recent experience at the BMJ, an established global publication, is that fact-checking can be incompetent, irresponsible and capable of suppressing already fully sourced and peer-reviewed journalism.

As well as warning Kaplan and others not to share the story, Facebook gave the BMJ investigation a “Missing Context” label. (These warn that “this information could mislead people”.) Readers were directed to a “fact check” article by Lead Stories, one of seven companies contracted by Facebook in the US, whose tagline is “debunking fake news as it happens”. (According to an analysis last year, Lead Stories was responsible for half of all Facebook fact checks.)

Our investigation was based on dozens of documents provided by an experienced clinical trial auditor-turned-whistleblower, Brook Jackson. But the Lead Stories article said that none of the flaws Jackson had identified would disqualify the data collected from the main Pfizer vaccine trial. Quoting a Pfizer spokesperson, it said the drug company had reviewed an anonymous complaint and that “actions were taken to correct and remediate”: “Pfizer’s investigation did not identify any issues or concerns that would invalidate the data or jeopardize the integrity of the study,” it concluded. Lead Stories identified no factual inaccuracies in the BMJ’s article – but, consistent with Facebook’s policy of combating Covid-19 misinformation, the platform reduced the article’s distribution.

When the BMJ asked Lead Stories to remove its article and the “missing context” label, pointing out errors in the fact check (including its description of the BMJ as a “news blog”), Lead Stories declined. Its editor, Alan Duke, told the BMJ that the “missing context” label was created by Facebook specifically “to deal with content that could mislead without additional context but which was otherwise true or real… Sometimes Facebook’s messaging about the fact checking labels can sound overly aggressive and scary. If you have an issue with their messaging you should indeed take it up with them as we are unable to change any of it.” Duke added that the article was being shared by anti-vaccine activists, even though it did not suggest that the overall findings of the Pfizer trial had been skewed.

The BMJ wrote an open letter to the Meta CEO Mark Zuckerberg, appealing the rating. Zuckerberg didn’t respond, but Lead Stories did. (It is an irony not lost on us that Nick Clegg, head of global affairs and communications at Meta, Facebook’s parent company, is the grandson of Hugh Clegg, editor-in-chief of the BMJ from 1946 to 1965.) In a new blog post, the fact checkers cast doubt on our whistleblower’s credibility, saying that Jackson was not a “lab-coated scientist” and that her qualifications amounted to a “30-hour certification in auditing techniques”. (Jackson has more than 15 years’ experience in clinical research coordination and management.) Lead Stories added that Jackson did not “express unreserved support for COVID vaccines”, pointing to two tweets posted after the BMJ’s investigation. One criticised a Sesame Street episode in which Big Bird gets a Covid vaccine; another approved a US court ruling against mandating vaccination for federal employees.

Should Facebook compel anyone to express unreserved support on any one issue? Gary Schwitzer, adjunct associate professor of public health at the University of Minnesota and publisher of HealthNewsReview, which grades US news organisations’ health reporting, told us: “It’s absolutely immaterial to the topic at hand. For it to be in this independent review, I think says more about the reviewer than the reviewee.” Schwitzer also told the BMJ that neither the processes by which Facebook decides what gets fact-checked nor the contractors’ systems for deciding which pieces to review are sufficiently transparent or consistent.

On 20 December, Lead Stories posted a series of inflammatory tweets criticising the BMJ and Paul Thacker, the author of the investigation. One said: “Hey @bmj_latest, when your articles are literally being republished by a website run by someone in the ‘Disinformation Dozen’ perhaps you should be reviewing your editorial policies instead of writing open letters.” (The article had been republished outside the BMJ’s licence terms on an anti-vax site; the journal asked the site to take it down.)

When the BMJ appealed directly to Facebook, it was told: “Fact checkers are responsible for reviewing content and applying ratings, and this process is independent from Meta.” Meta’s Oversight Board, which last year upheld Facebook’s decision to suspend Donald Trump from the platform, refused to consider the BMJ’s appeal, saying that “fact checking labels aren’t something the Board’s decisions currently cover”.

So, in the absence of anything else, it is left to users to debate the fact checkers. Jillian York, director for international freedom of expression at the Electronic Frontier Foundation, told the BMJ: “I worry about the amount of power placed in the hands of these third-party groups… While I do see a role for fact-checking and think it’s far superior to the alternative – which is Facebook just taking down content – I still worry about the effect that it can have on legitimate sources.”

Allowing fact checkers to take an editorial position is a concern for journalism more widely. “Companies like Facebook and some of the traditional media establishments are reasonably concerned about vaccine misinformation,” York said. “But they have swung so far in the opposite direction as to potentially shut down legitimate questions about major corporations like Pfizer.” The medical industry has a history of suppressing certain information, she added; it is important that journalists can question it. The BMJ is making a final appeal to Meta’s Oversight Board, but for now its investigation remains obscured on Facebook.

The unanswered question, according to the BMJ’s editor-in-chief Kamran Abbasi, is: why is Facebook doing this? “What is driving its world view? Is it ideology? Is it commercial interests? Is it incompetence?” And in the future, who can readers trust: journals like the BMJ, a reputable publisher since 1840, staffed by doctors, journalists, technical editors and statisticians – or Facebook and its fact checkers?

Rebecca Coombes is head of journalism at the BMJ. Madlen Davies is the journal’s investigations editor.
