The Misinformation Effect and the Psychology Behind Fake News

Helen Lee Bouygues

Since the 2016 election, there has been a great deal of talk about fake news, or misinformation, and the impact it continues to have on elections and public discourse around the world. The Reboot Foundation recently released a report on this topic, outlining the nature of the misinformation crisis and offering several suggestions for addressing it.

As non-profit organizations, governments, and citizens move forward in addressing these issues, it’s important that interventions be based on a good understanding of how misinformation works and why it is effective. 

A better grasp of the “misinformation effect” and other related psychological phenomena, such as the “tainted truth effect,” can help policymakers and the public devise and implement better solutions to the fake news problem.

Psychologists and cognitive scientists have been studying for many years what’s called the “misinformation effect”: the way false or misleading information, received by subjects after they’ve received correct information, can distort their understanding.

In this column, I’ll review the psychology behind fake news and the misinformation effect, and discuss what it means for the project of improving critical thinking and our public discourse in general.

What Is the Misinformation Effect?

The misinformation effect was first studied in the 1970s by psychologist and memory expert Elizabeth Loftus. Her research has demonstrated that memory is far easier to influence than might ordinarily be thought.

In one early experiment on the misinformation effect, she and her research team showed participants slides of a car accident, and then later had the participants read inaccurate or misleading information about the accident. The experiment showed that participants easily assimilated this flawed information, making mistakes when later asked what had happened in the accident. 

Even the simple phrasing of a question about the past can influence our memories. In another early experiment, researchers showed participants a video of a minor car accident and later asked them to report on the accident. They found that simply asking subjects a question phrased suggestively — “about how fast were the cars going when they smashed into each other?” — could influence their memories of how serious the accident was.

More recent research by Loftus and others has studied this effect in the context of political news. In a study of subjects about to vote on the 2018 abortion referendum in Ireland, the researchers found it very easy to implant false memories of past events in participants through fake news stories. This was especially true when the fake stories aligned with participants’ political leanings on the topic.

The misinformation effect is especially worrisome as technology improves and bad actors become able to create convincing fake videos and images easily and cheaply.

The Tainted Truth Effect

Other psychological phenomena can make the problem of misremembering worse. Recent research has focused on the “tainted truth effect,” which is closely related to the misinformation effect. The tainted truth effect refers to the way warnings of misinformation — both well-intentioned, like fact-checking, and ill-intentioned — can actually make people less trusting of legitimate news and information.

Some warnings are, of course, useful. The best fact-checkers identify and debunk fake news stories, point out politicians’ lies or misstatements, and inform the public about context and history.

Research suggests that these warnings are most effective when they are specific. The problem is when the warnings become broad. Then they foster a type of general disbelief in all media. One recent study found that even well-intentioned general warnings about misinformation can be damaging, leading to “decreased belief in the accuracy of true headlines.”

In other words, swirling misinformation warnings can contribute to the deterioration of public discourse. Another recent study involved participants watching a CSPAN video on a contemporary issue. The participants were then exposed in groups to a variety of different content, including accurate information about what they’d seen, misinformation, and misinformation warnings.

The research team found, unsurprisingly, that reliable information could boost participants’ later ability to recall the events, and misinformation could significantly damage their memory.

They found that valid misinformation warnings led individuals to reject misinformation, but their recall of the facts was still impaired when compared to a control condition. In other words, fake news has an impact even when it is recognized as fake.

Finally, the authors found that ill-intentioned misinformation warnings largely have their intended effect, muddying the waters and reducing trust even in reliable journalism. They “cause people to reject accurate information that is associated with the tainted source.”

How to Fight Misinformation

The research into the misinformation effect and related phenomena shows how psychologically susceptible we are to fake news, false memories, and entrenched cognitive biases. It also shows how far-reaching the effects of misinformation can be on public discourse, especially when the misinformation effect is compounded by network effects. Social media networks almost seem designed to accelerate the spread of falsehoods and propaganda.

In fighting the misinformation effect, it is not enough to merely correct or remove the misinformation. We need to address the damage done to public discourse more generally. So what exactly is to be done?

Education Matters. The best remedy for misinformation is education. The only long-term, sustainable solution is a general population better attuned to the dangers of misinformation.

But media literacy education is not happening in the nation’s schools. Reboot’s work shows that media literacy still isn’t emphasized nearly enough in primary and secondary education, despite the fact that such courses can help students become more mindful of the ways they can be influenced and deceived. Media literacy education can also equip students with the robust critical thinking and research skills they need to navigate a complex and often deceptive information environment.

Education for the general population is also needed. As the studies discussed above show, specific misinformation warnings can have some positive effect, even if general blanket warnings or invalid warnings can be harmful. Similarly, in Reboot’s own research on fake news, even quick interventions like a short article or video can help people better identify misinformation.

Other quick educational interventions have been shown to be useful as well. Some studies have shown, for example, that the misinformation effect can be reduced by quizzing participants on what they’ve learned prior to their exposure to the misinformation. In other words, if asked to recall information immediately after acquiring it, people are more likely to retain it, even in the face of later misinformation.

Ask Questions. The results of studies into the psychology of misinformation suggest that a large part of the problem is the passivity of news consumers, especially online. The more active and engaged readers and viewers are, the better will be their grasp of the material they’re exposed to.

Of course, while quick tests built into news articles and videos are an interesting idea, they would likely be ignored by most of the public. The real answer, then, is instilling in people these habits of active engagement so that they process information more actively and critically on their own. This requires a commitment to teaching advanced critical thinking and media literacy skills across the population.

Be Transparent. Finally, the media organizations, tech companies, and governments responsible for informing the public and for warning it about misinformation must make sure they don’t become part of the problem.

For social media platforms, like Facebook, this means clear and transparent policies about what information is allowed to appear and what should be removed. It also means making specific and accurate information countering misinformation available, rather than blanket warnings that may reduce trust in all media.

For governments, transparency means making sure efforts to combat misinformation don’t become an excuse for violating or seeming to violate free speech principles. And for news sources, transparency means clear and accurate information that combats misinformation while avoiding unwarranted panic about it.

If we fight misinformation on all these levels, in schools, in online habits, and in public institutions, the end result will be a population less susceptible to falsehoods, more secure in its knowledge, and more confident in making informed decisions.

Helen Lee Bouygues is the president of the Reboot Foundation.