Disinformation threatens global elections
How to fight back
The World Economic Forum declared misinformation a top societal threat over the next two years. Three academics unpack the issue: Sander van der Linden, professor of social psychology in society at the University of Cambridge (UK); Lee McIntyre, research fellow at the Center for Philosophy and History of Science at Boston University (US); and Stephan Lewandowsky, chair of cognitive psychology at the University of Bristol (UK).
With over half the world’s population heading to the polls in 2024, disinformation season is upon us — and the warnings are dire.
The World Economic Forum (WEF) has declared misinformation a top societal threat over the next two years, and major news organisations caution that disinformation poses an unprecedented threat to democracies worldwide.
Yet, some scholars and pundits have questioned whether disinformation can sway election outcomes.
Others think concern over disinformation is just a moral panic or merely a symptom rather than the cause of societal ills. Pollster Nate Silver even thinks that misinformation “isn’t a coherent concept”.
But we argue the evidence tells a different story.
A 2023 study showed that the vast majority of academic experts agree on how to define misinformation (namely, as false and misleading content) and on what it looks like (for example, lies, conspiracy theories and pseudoscience).
Although the study didn’t cover disinformation, such experts generally agree that it can be defined as intentional misinformation.
Symptom and disease
A recent paper clarified that misinformation can be both a symptom and a disease.
In 2022, nearly 70% of Republicans still endorsed the false conspiracy theory that the 2020 US presidential election was “stolen” from Donald Trump. If Trump had never floated this theory, how would millions of people possibly have acquired this belief?
Moreover, although it is clear that people do not always act on dangerous beliefs, the January 6 US Capitol riots, incited by false claims, serve as an important reminder that a misinformed crowd can disrupt and undermine democracy.
Given that nearly 25% of elections are decided by a margin of under 3%, even a small persuasive effect can tip the result, so mis- and disinformation can have an important influence.
One study found that among previous Barack Obama voters who did not buy into any fake news about Hillary Clinton during the 2016 presidential election, 89% voted for Clinton. By contrast, among prior Obama voters who believed at least two fake headlines about Clinton, only 17% voted for her.
While this doesn’t necessarily prove that the misinformation caused the voting behaviour, we do know that millions of Black voters were targeted with misleading ads discrediting Clinton in key swing states ahead of the election.
Micro-targeting
Research has shown that such micro-targeting of audiences based on variables such as personality influences not only decision-making but also voting intentions.
A recent paper illustrated how large language models can be deployed to craft micro-targeted ads at scale, estimating that for every 100,000 individuals targeted, at least several thousand can be persuaded.
We also know that people are bad at discerning deepfakes (AI-generated images or videos of fabricated events) from genuine content, and studies find that deepfakes do influence political attitudes, at least among small target groups.
There are more indirect consequences of disinformation too, such as eroding public trust and participation in elections.
Other than hiding under our beds and worrying, what can we do to protect ourselves?
Power of prebunking
Many efforts have focused on fact-checking and debunking false beliefs.
In contrast, “prebunking” is a newer approach that aims to prevent false beliefs from forming in the first place. Such “inoculation” involves warning people not to fall for a false narrative or propaganda tactic, together with an explanation of why it is misleading.
Misinforming rhetoric has clear markers, such as scapegoating or use of false dichotomies (there are many others), that people can learn to identify. Like a medical vaccine, the prebunk exposes the recipient to a “weakened dose” of the infectious agent (the disinformation) and refutes it in a way that confers protection.
For example, we created an online game for the Department of Homeland Security to empower Americans to spot foreign influence techniques during the 2020 presidential election. The weakened dose? Pineapple pizza.
How could pineapple pizza possibly be the way to tackle misinformation?
It shows how bad-faith actors can take an innocuous issue, such as whether to put pineapple on pizza, and use it to try to start a culture war.
They might claim it’s offensive to Italians or urge Americans not to let anybody restrict their pizza-topping freedom.
They can then buy bots to amplify the issue on both sides, disrupting debate and sowing chaos. Our results showed that people improved in their ability to recognise these tactics after playing our inoculation game.
Inoculation
In 2020, Twitter identified false election tropes as potential “vectors of misinformation” and sent out prebunks to millions of US users warning them of fraudulent claims, such as that voting by mail is not safe.
These prebunks armed people with a fact — that experts agree that voting by mail is reliable — and it worked insofar as the prebunks inspired confidence in the election process and motivated users to seek out more factual information. Other tech companies, such as Google and Meta, have followed suit across a range of issues.
A new paper tested inoculation against false claims about the election process in the US and Brazil.
Not only did it find that prebunking worked better than traditional debunking, but also that inoculation improved discernment between true and false claims, effectively reduced election-fraud beliefs and improved confidence in the integrity of the upcoming 2024 elections.
In short, inoculation is a free speech-empowering intervention that can work on a global scale.
When Russia was looking for a pretext to invade Ukraine, US president Joe Biden used this approach to “inoculate” the world against Putin’s plan to stage and film a fabricated Ukrainian atrocity, complete with actors, a script and a movie crew. Biden declassified the intelligence and exposed the plot.
In effect, he warned the world not to fall for fake videos with actors pretending to be Ukrainian soldiers on Russian soil. Forewarned, the international community was unlikely to fall for it. Russia found another pretext to invade, of course, but the point remains: forewarned is forearmed.
Spot misinformation
But we need not rely on government or tech firms to build mental immunity. We can all learn how to spot misinformation by studying the markers accompanying misleading rhetoric.
Remember that polio, a highly infectious disease, has been all but eradicated through vaccination and herd immunity. Our challenge now is to build herd immunity to the tricks of disinformers and propagandists.
The future of our democracy may depend on it.
– The Conversation