Longterm Wiki

Pennycook & Rand (2021)

paper

Credibility Rating

5/5
Gold (5)

Gold standard. Rigorous peer review, high editorial standards, and strong institutional reputation.

Rating inherited from publication venue: Nature

Relevant to AI safety discourse on epistemic ecosystems and platform governance; provides empirical grounding for lightweight behavioral interventions that improve information quality without censorship.

Metadata

Importance: 62/100 · journal article · primary source

Summary

Pennycook & Rand (2021) demonstrate that people share misinformation not due to partisan preferences but due to inattention to accuracy. Simple prompts asking users to evaluate headline accuracy significantly improve the quality of news shared, validated across survey experiments and a Twitter field experiment.

Key Points

  • Sharing intentions are far less discerning than accuracy judgments, suggesting sharing misinformation does not necessarily reflect belief in it.
  • People report valuing accurate sharing, yet fail to act on this preference because attention is directed elsewhere during sharing decisions.
  • Subtle accuracy-nudge prompts (asking users to rate one headline's accuracy) significantly increased quality of subsequently shared content on Twitter.
  • Findings challenge the 'partisanship over accuracy' hypothesis, attributing misinformation spread primarily to inattention rather than motivated reasoning.
  • Results support scalable, low-cost platform interventions to reduce misinformation spread without requiring content removal or heavy-handed moderation.
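The dissociation in the first two points can be made concrete with the paper's "discernment" measure: the gap between response rates for true and false headlines. The sketch below uses invented illustrative numbers, not the paper's data:

```python
# Toy illustration of "discernment": the difference in positive-response
# rate between true and false headlines. All rates below are invented
# for illustration only.

def discernment(rate_true: float, rate_false: float) -> float:
    """True-minus-false gap in positive-response rate."""
    return rate_true - rate_false

# Hypothetical accuracy judgments: true headlines are rated accurate far
# more often than false ones, so discernment is large.
accuracy_d = discernment(rate_true=0.60, rate_false=0.20)

# Hypothetical sharing intentions: the true/false gap nearly vanishes,
# so sharing is far less discerning than accuracy judgment.
sharing_d = discernment(rate_true=0.30, rate_false=0.26)

print(f"accuracy discernment: {accuracy_d:.2f}")  # 0.40
print(f"sharing discernment:  {sharing_d:.2f}")   # 0.04
```

On numbers like these, accuracy judgments discriminate strongly between true and false headlines while sharing intentions barely do, which is the pattern the paper reports and attributes to inattention rather than disbelief.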

Cited by 1 page

Page                            Type    Quality
Epistemic Learned Helplessness  Risk    53.0

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 27 KB
Shifting attention to accuracy can reduce misinformation online | Nature
Subjects: Communication, Decision making, Human behaviour, Technology

Abstract
In recent years, there has been a great deal of concern about the proliferation of false and misleading news on social media[1–4]. Academics and practitioners alike have asked why people share such misinformation, and sought solutions to reduce the sharing of misinformation[5–7]. Here, we attempt to address both of these questions. First, we find that the veracity of headlines has little effect on sharing intentions, despite having a large effect on judgments of accuracy. This dissociation suggests that sharing does not necessarily indicate belief. Nonetheless, most participants say it is important to share only accurate news. To shed light on this apparent contradiction, we carried out four survey experiments and a field experiment on Twitter; the results show that subtly shifting attention to accuracy increases the quality of news that people subsequently share. Together with additional computational analyses, these findings indicate that people often share misinformation because their attention is focused on factors other than accuracy—and therefore they fail to implement a strongly held preference for accurate sharing. Our results challenge the popular claim that people value partisanship over accuracy[8,9], and provide evidence for scalable attention-based interventions that social media platforms could easily implement to counter misinformation online.

... (truncated, 27 KB total)
Resource ID: 2f1ad598aa1b787a | Stable ID: Y2RmMmM5M2