Image: man in suit holding a mobile phone. Image courtesy of Getty Images.

COVID misinformation is a health risk – tech companies need to remove harmful content, not tweak their algorithms

Many worldwide have now caught COVID. But during the pandemic many more are likely to have encountered something else that’s been spreading virally: misinformation. False information has plagued the COVID response, erroneously convincing people that the virus isn’t harmful, of the merits of various ineffective treatments, or of false dangers associated with vaccines.

This article was published by The Conversation.

Often, this misinformation spreads through social media. At its worst, it can kill people. The UK’s Royal Society, noting the scale of the problem, has made online information the subject of its latest report. This puts forward arguments for how to limit misinformation’s harms.

The report is an ambitious statement, covering everything from deepfake videos to conspiracy theories about water fluoridation. But its key coverage is of the COVID pandemic and – rightly – the question of how to tackle misinformation about COVID and vaccines.

Here, it makes some important recommendations. These include the need to better support factcheckers, to devote greater attention to the sharing of misinformation on private messaging platforms such as WhatsApp, and to encourage new approaches to online media literacy.

But the main recommendation – that social media companies shouldn’t be required to remove content that is legal but harmful, but be asked to tweak their algorithms to prevent the viral spread of misinformation – is too limited. It is also ill suited to public health communication about COVID. There’s good evidence that exposure to vaccine misinformation undermines the pandemic response, making people less likely to get jabbed and more likely to discourage others from being vaccinated, costing lives.

The basic – some would say insurmountable – problem with this recommendation is that it will make public health communication dependent on the goodwill and cooperation of profit-seeking companies. These businesses are poorly motivated to open up their data and processes, despite being crucial infrastructures of communication. Google search, YouTube and Meta (now the umbrella for Facebook, Facebook Messenger, Instagram and WhatsApp) have substantial market dominance in the UK. This is real power, despite these companies’ claims that they are merely “platforms”.

These companies’ business models depend heavily on direct control over the design and deployment of their own algorithms (the processes their platforms use to determine what content each user sees). This is because these algorithms are essential for harvesting mass behavioural data from users and selling access to that data to advertisers.

This fact creates problems for any regulator wanting to devise an effective regime for holding these companies to account. Who or what will be responsible for assessing how, or even if, their algorithms are prioritising and deprioritising content in such a way as to mitigate the spread of misinformation? Will this be left to the social media companies themselves? If not, how will this work? The companies’ algorithms are closely guarded commercial secrets. It is unlikely they will want to open them up to scrutiny by regulators.

Recent initiatives, such as Facebook’s hiring of factcheckers to identify and moderate misinformation on its platform, have not involved opening up algorithms. That has been off limits. As leading independent factchecker Full Fact has said: “Most internet companies are trying to use [artificial intelligence] to scale fact checking and none is doing so in a transparent way with independent assessment. This is a growing concern.”

Plus, tweaking algorithms will have no direct impact on misinformation circulating on private social media apps such as WhatsApp. The end-to-end encryption on these wildly popular services means shared news and information is beyond the reach of all automated methods of sorting content.

A better way forward

Requiring social media companies to instead remove harmful scientific misinformation would be a better solution than algorithmic tweaking. The key advantages are clarity and accountability.

Regulators, civil society groups and factcheckers can identify and measure the prevalence of misinformation, as they have done so far during the pandemic, despite constraints on access. They can then ask social media companies to remove harmful misinformation at the source, before it spreads across the platform and drifts out of public view on WhatsApp. They can show the world what the harmful content is and make a case for why it ought to be removed.

Article continues...

For the full article by Professor Andrew Chadwick, an expert in political communication, visit The Conversation webpage here.

Notes for editors

Press release reference number: 22/09

Loughborough is one of the country’s leading universities, with an international reputation for research that matters, excellence in teaching, strong links with industry, and unrivalled achievement in sport and its underpinning academic disciplines.

It has been awarded five stars in the independent QS Stars university rating scheme, named the best university in the world for sports-related subjects in the 2021 QS World University Rankings and University of the Year for Sport by The Times and Sunday Times University Guide 2022.

Loughborough is in the top 10 of every national league table, being ranked 7th in The UK Complete University Guide 2022, and 10th in both the Guardian University League Table 2022 and the Times and Sunday Times Good University Guide 2022.

Loughborough is consistently ranked in the top twenty of UK universities in the Times Higher Education’s ‘table of tables’ and is in the top 10 in England for research intensity. In recognition of its contribution to the sector, Loughborough has been awarded seven Queen's Anniversary Prizes.

The Loughborough University London campus is based on the Queen Elizabeth Olympic Park and offers postgraduate and executive-level education, as well as research and enterprise opportunities. It is home to influential thought leaders, pioneering researchers and creative innovators who provide students with the highest quality of teaching and the very latest in modern thinking.