
Goddard et al. (2012)


Credibility Rating

High (4/5)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: ScienceDirect

A foundational empirical study on automation bias relevant to AI safety discussions about human oversight, corrigibility, and the risks of humans deferring excessively to AI recommendations in high-stakes decision environments.

Metadata

Importance: 55/100 · journal article · primary source

Summary

This paper examines automation bias, the tendency for humans to over-rely on automated decision-support systems, leading to errors of omission and commission. It explores how people fail to adequately monitor automated systems and accept their outputs without sufficient critical evaluation. The research has significant implications for the design of human-AI interaction systems and the allocation of decision authority.

Key Points

  • Automation bias occurs when humans over-rely on automated systems, accepting incorrect outputs or failing to act when automation does not prompt action.
  • Two key error types are identified: errors of omission (failing to detect or respond to a problem because the automation did not flag it) and errors of commission (following an incorrect automated recommendation); both are illustrated in the sketch after this list.
  • The degree of automation and operator workload significantly influence the likelihood and severity of automation bias.
  • Design of human-machine interfaces must account for cognitive tendencies toward over-reliance to maintain meaningful human oversight.
  • Findings are relevant to AI safety concerns about maintaining effective human control over increasingly capable automated systems.
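
The omission/commission distinction can be made concrete as a classification of trial outcomes. The sketch below is a minimal illustration only, not Goddard et al.'s coding scheme; the trial structure and all names are hypothetical.

```python
# Illustrative sketch (not from Goddard et al.): classify one operator
# trial into an automation-bias error type, given ground truth, the
# automation's output, and the operator's response.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Trial:
    fault_present: bool        # ground truth: does the case require action?
    automation_flagged: bool   # did the decision aid raise an alert?
    operator_acted: bool       # did the human operator intervene?

def classify_error(t: Trial) -> Optional[str]:
    """Return the automation-bias error type for one trial, if any."""
    if t.fault_present and not t.automation_flagged and not t.operator_acted:
        # Automation stayed silent on a real fault and the human also
        # failed to act: error of omission.
        return "omission"
    if not t.fault_present and t.automation_flagged and t.operator_acted:
        # Automation gave an incorrect recommendation and the human
        # followed it: error of commission.
        return "commission"
    return None  # correct detection, correct rejection, or human override

if __name__ == "__main__":
    trials = [
        Trial(fault_present=True,  automation_flagged=False, operator_acted=False),
        Trial(fault_present=False, automation_flagged=True,  operator_acted=True),
        Trial(fault_present=True,  automation_flagged=True,  operator_acted=True),
    ]
    for t in trials:
        print(classify_error(t))  # omission, commission, None
```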

Cited by 1 page

Page | Type | Quality
AI-Human Hybrid Systems | Approach | 91.0
