Catalog: The Psychology of Misinformation
How misinformation hacks human brains, why it’s hard to correct, and what seems to work.
This page is one part of the Prism Anti-Misinformation Resources Catalog. See the Table of Contents to navigate to other categories of resources.
Misinformation as Brain Hacking
The psychology of misinformation: Why we’re vulnerable (First Draft)
Overview of major psychological concepts related to misinformation, its correction, and prevention. Intended as a starting point rather than the last word; use the suggested further reading to dive deeper. Concepts covered include cognitive miserliness, dual process theory, heuristics, cognitive dissonance, confirmation bias, motivated reasoning, pluralistic ignorance, the third-person effect, fluency, and bullshit receptivity.
The Science of Misinformation: Why we believe false information (Dr. Lisa Fazio via Vanderbilt University)
This college-level curriculum focuses on questions such as: How do people come to believe false information? How can we best correct false beliefs? And what policies and interventions are more or less effective in stopping the spread of misinformation?
Why People Fall For Conspiracy Theories (FiveThirtyEight)
Explores cognitive fallacies and how different people reach conclusions based on different amounts of evidence.
Why Do People Believe Liars? (Ruth Ben-Ghiat via Substack)
Propaganda gains traction through repetition and saturation. The same message is disseminated through multiple channels and institutions, with small variations, to lead a maximum number of individuals “in the same direction, but differently,” as the sociologist Jacques Ellul wrote.
How Donald Trump, Elon Musk, and Gwyneth Paltrow Short-Circuit Your Ability to Think Rationally (Eric Roston via Bloomberg)
The sketchy rhetorical tricks of politicians, celebs, and con men—and how they work.
Conspiracy Theorists Have a Fundamental Cognitive Problem, Say Scientists (Pocket)
The world’s a scary, unpredictable place, and that makes your brain mad. As a predictive organ, the brain is on the constant lookout for patterns that both explain the world and help you thrive in it. But sometimes people sense danger even when there is no pattern to recognize, and so their brains create their own. This phenomenon, called illusory pattern perception, is what drives people who believe in conspiracy theories, like climate change deniers, 9/11 truthers, and “Pizzagate” believers.
Study: 3 in 4 Americans overestimate their ability to spot fake news (CNN)
CNN's Oliver Darcy discusses a University of Utah and Washington University study that shows 3 in 4 Americans overestimate their ability to discern between real and fake news.
Avoiding the Rabbit Hole: Teaching Concepts in Conspiratorial Thinking (News Literacy Project)
This recorded edWebinar explores the psychological and cognitive factors behind conspiratorial thinking, including the role of fear and anxiety, cognitive dissonance and biases, motivated reasoning, and institutional cynicism. Hear the ways in which conspiracy theories exploit our emotions as well as fill our emotional needs. The presentation also outlines essential learning objectives and concepts and provides instructional resources for integrating these concepts into the curriculum.
Who falls for fake news? The roles of bullshit receptivity, overclaiming, familiarity, and analytic thinking (Gordon Pennycook and David G. Rand)
The tendency to ascribe profundity to randomly generated sentences—pseudo-profound bullshit receptivity—correlates positively with perceptions of fake news accuracy, and negatively with the ability to differentiate between fake and real news (media truth discernment). Relatedly, individuals who overclaim their level of knowledge also judge fake news to be more accurate.
Reliance on emotion promotes belief in fake news (Cameron Martel, Gordon Pennycook, and David Rand)
Across a wide range of specific emotions, heightened emotionality was predictive of increased belief in fake (but not real) news. Self-reported use of emotion was positively associated with belief in fake (but not real) news, and inducing reliance on emotion resulted in greater belief in fake (but not real) news stories compared to a control or to inducing reliance on reason.
_______________
Political Beliefs and Susceptibility to Misinformation
Political Misinformation and Conspiracy Theories (Professor Brendan Nyhan via Dartmouth University)
This college-level curriculum explores the psychological factors that make people vulnerable to misperceptions and conspiracy theories and the reasons that corrections so often fail to reduce the prevalence of these phenomena. We will also analyze how those tendencies are exploited by political elites and consider possible approaches that journalists, civic reformers, government officials, and technology platforms could employ to combat misperceptions.
Does political extremity harm the ability to identify online information validity? Testing the impact of polarisation through online experiments (Cheuk Hang Au, Kevin K.W. Ho, and Dickson K.W. Chiu)
Fact-checking alone may not help readers build the capacity to identify misinformation. Source-independent measures are more effective in helping readers identify misinformation. Anti-patriotism political extremists may be more capable of identifying misinformation.
Ideological Asymmetries and the Determinants of Politically Motivated Reasoning (Brian Guay and Christopher D. Johnston via American Journal of Political Science)
We find little evidence for the hypothesis that conservatives are less open to new information that conflicts with their political identity. We also find no evidence that epistemic needs promote politically motivated reasoning; although conservatives report greater needs for certainty than liberals, these needs are not a major source of political bias.
_______________
Cognitive Barriers to Correcting Misinformation
The psychology of misinformation: Why it’s so hard to correct (First Draft)
The psychological concepts that are relevant to corrections, such as fact checks and debunks. One key theme that will resurface is the central problem of correction: once we’re exposed to misinformation, it’s very hard to get it out of our heads. Concepts covered include the continued influence effect, mental models, the implied truth effect, the tainted truth effect, repetition, the illusory truth effect, the backfire effect, the overkill backfire effect, the worldview backfire effect, and the familiarity backfire effect.
Misinformation and its Correction: Cognitive Mechanisms and Recommendations for Mass Communication (Briony Swire and Ullrich Ecker, University of Western Australia)
An overview of the psychology of correcting misinformation, and a presentation of six recommendations for doing so more effectively: Provide factual alternatives; Boost retrieval of the retraction, not familiarity of the myth; Use refutations of misinformation as an educational tool; Build credibility; Provide worldview- or self-affirming corrections; and Foster skepticism.
The psychological drivers of misinformation belief and its resistance to correction (Ullrich K. H. Ecker, Stephan Lewandowsky, and Michelle A. Amazeen via Nature Reviews Psychology)
We discuss the effectiveness of both pre-emptive (‘prebunking’) and reactive (‘debunking’) interventions to reduce the effects of misinformation, as well as implications for information consumers and practitioners in various areas including journalism, public health, policymaking and education.
Misinformation and Its Correction: Continued Influence and Successful Debiasing (Stephan Lewandowsky, Ullrich K. H. Ecker, Colleen M. Seifert, and others)
Simply retracting a piece of information will not stop its influence. Only three factors have been identified that can increase the effectiveness of retractions: (a) warnings at the time of the initial exposure to misinformation, (b) repetition of the retraction, and (c) corrections that tell an alternative story that fills the coherence gap otherwise left by the retraction.
Explicit warnings reduce but do not eliminate the continued influence of misinformation (Ullrich K. H. Ecker, Stephan Lewandowsky, and David T. W. Tang)
A specific warning—giving detailed information about the continued influence effect—succeeded in reducing the continued reliance on outdated information but did not eliminate it. A more general warning—reminding people that facts are not always properly checked before information is disseminated—was even less effective.
The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Headlines Increases Perceived Accuracy of Headlines Without Warnings (Gordon Pennycook, Adam Bear, Evan T. Collins, and David G. Rand)
When false information warnings are in use on a social media platform and a false headline is not tagged with one, its assertion can be perceived as validated and thus more accurate.
Warning against warnings: Alerted subjects may perform worse. Misinformation, involvement and warning as determinants of witness testimony (Malwina Szpitalak and Romuald Polczyk)
The research aimed to replicate the tainted truth effect (poorer memory performance among warned subjects who were not misled) and to test whether a subject’s involvement in the issue moderates this effect. Highly involved subjects were more resistant to the misinformation effect than those with low involvement.
Prior exposure increases perceived accuracy of fake news (Gordon Pennycook, Tyrone D. Cannon, and David G. Rand)
Even a single exposure increases subsequent perceptions of accuracy. This “illusory truth effect” for fake news headlines occurs despite a low level of overall believability, and even when the stories are labeled as contested by fact checkers or are inconsistent with the reader’s political ideology. These results suggest that social media platforms help to incubate belief in blatantly false news stories, and that tagging such stories as disputed is not an effective solution to this problem. Prior exposure does not impact entirely implausible statements, but only a small degree of potential plausibility is sufficient for repetition to increase perceived accuracy.
The backfire effect: Does it exist? And does it matter for factcheckers? (Full Fact)
From seven major experimental studies examining it, the “backfire effect” (when a “debunking”/factual counterclaim reinforces a person’s ideological beliefs) has been found to be rare rather than the norm. Generally, debunking can make people’s beliefs in specific claims more accurate.
The role of familiarity in correcting inaccurate information (Briony Swire, Ullrich K. H. Ecker, and Stephan Lewandowsky)
Corrected misinformation often continues to influence memory and reasoning (the continued influence effect). More explanatory detail in factual affirmations and myth retractions promotes more sustained belief change. Fact affirmations promoted more sustained belief change than myth retractions over the course of one week (but not over three weeks), particularly for older adults.
_______________
Best Practices
The psychology of misinformation: How to prevent it (First Draft)
The psychological concepts that can help us by building our mental (and therefore social) resilience. What you’ll find is that many of the resources we need to slow down misinformation are right there in our brains, waiting to be used. Concepts covered include skepticism, emotional skepticism, alertness, analytic thinking, friction, inoculation, and nudges.
Fake news, fast and slow: Deliberation reduces belief in false (but not true) news headlines (Bence Bago, David Rand, and Gordon Pennycook)
Deliberation can correct intuitive mistakes: subjects believed false headlines (but not true headlines) less after taking time to deliberate than in their initial, intuitive responses. In the context of fake news, deliberation facilitates accurate belief formation rather than partisan bias.
Pausing to consider why a headline is true or false can help reduce the sharing of false news (HKS Misinformation Review)
In an online experiment, participants who paused to explain why a headline was true or false indicated that they were less likely to share false information compared to control participants. Their intention to share accurate news stories was unchanged. These results indicate that adding “friction” (i.e., pausing to think) before sharing can improve the quality of information shared on social media.
Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence (John Cook, Stephan Lewandowsky, and Ullrich K. H. Ecker)
False-balance media coverage about climate change lowers perceived consensus overall, and misinformation that confuses people about the level of scientific agreement regarding anthropogenic global warming (AGW) has a polarizing effect. However, inoculating messages that (1) explain the flawed argumentation technique used in the misinformation or (2) highlight the scientific consensus on climate change were effective in neutralizing those adverse effects of misinformation.
Fighting COVID-19 misinformation on social media: Experimental evidence for a scalable accuracy nudge intervention (Gordon Pennycook, Jonathon McPhetres, Yunhao Zhang, Jackson G. Lu, and David Rand)
People sometimes share false claims about COVID-19 partly because they simply fail to think sufficiently about whether or not content is accurate when deciding what to share. People are far worse at discerning between true and false content when deciding what they would share on social media relative to when they are asked directly about accuracy. A simple accuracy reminder before being presented with content can significantly improve individuals’ level of truth discernment in sharing intentions.
Lateral reading: College students learn to critically evaluate internet sources in an online course (Harvard Kennedy School Misinformation Review)
In an asynchronous college nutrition course at a large state university, we embedded modules that taught students how to vet websites using fact checkers’ strategies. Chief among these strategies was lateral reading, the act of leaving an unknown website to consult other sources to evaluate the original site. Students improved significantly from pretest to posttest, engaging in lateral reading more often post intervention.
Can “Googling” correct misbelief? Cognitive and affective consequences of online search (Tetsuro Kobayashi, Fumiaki Taka, and Takahisa Suzuki via PLoS ONE)
On average, online search reduces the likelihood of believing misinformation; the effect is larger among those predisposed to believe the misinformation; cognitive correction is observed whether searchers are motivated by a directional goal or an accuracy goal; and online search worsens affective feelings toward the groups targeted by the misinformation.
________________
Misinformation Psychology in Society
Social Exchange of Motivated Beliefs (Ryan Oprea and Sevgi Yuksel via Journal of the European Economic Association)
Subjects respond to one another’s beliefs in a highly asymmetric way, causing a severe amplification of subjects’ initial bias. No such patterns are found in response to objective public signals or in control treatments without social exchange or scope for motivated beliefs. The pattern observed is difficult to reconcile with Bayesianism and standard versions of confirmation bias. Overall, bias amplification is likely driven by “motivated assignment of accuracy” to others’ beliefs: subjects selectively attribute higher informational value to social signals that reinforce their motivation.
Conspiracy Culture in American History (C-SPAN)
Indiana University Bloomington professor Stephen Andrews teaches how conspiracy theories have changed over time while often implicating groups such as the Illuminati, Freemasons, and Skull and Bones. His lecture discusses how the threat of communism was a prominent feature of conspiracy theories in the 1950s, while in later decades a global “New World Order” became a more common theme. This is the first of a two-part seminar hosted by the Gilder Lehrman Institute of American History.
Conspiracy Culture in Modern American Society (C-SPAN)
Indiana University Bloomington professor Stephen Andrews looks at the demographics of what types of people believe in conspiracies, and talks about how the Internet has influenced these groups. He discusses strategies teachers might use when speaking with students or peers about theories related to a fake moon landing, flat earth or 9/11 as a government action. This is the second of a two-part seminar hosted by the Gilder Lehrman Institute of American History.
Explainer: why conspiracy theories are so popular after tragedies (Abbie Richards via Twitter)
A short video that outlines the reasons disasters and other events of mass harm tend to introduce a lot of misinformation into the public discourse.
Conspiracy beliefs in Britain and North Macedonia: A comparative study (Ana Stojanov and Karen Douglas via International Journal of Psychology)
Support for democratic principles and low trust in institutions and the media were significant predictors of conspiracy beliefs.
“You need to do your research”: Vaccines, contestable science, and maternal epistemology (Melissa L. Carrion via Public Understanding of Science)
Participants’ explanations for vaccine refusal relied on paradoxical arguments about science and expertise. On one hand, participants defended the ideal of science but criticized existing research for failing to meet requisite standards. On the other hand, they suggested that maternal experience could supplant the ways of knowing that give rise to such claims.
MASS PSYCHOSIS - How an Entire Population Becomes MENTALLY ILL (After Skool, in collaboration with Academy of Ideas, via YouTube)
This video explores the most dangerous of all psychic epidemics, the mass psychosis. A mass psychosis is an epidemic of madness that occurs when a large portion of a society loses touch with reality and descends into delusions. Such a phenomenon is not a thing of fiction. Two examples of mass psychoses are the American and European witch hunts of the 16th and 17th centuries and the rise of totalitarianism in the 20th century.