Back
The Influence of Risky Conditions on Trust in Autonomous Systems
SAGE Journals (peer-reviewed) · journals.sagepub.com/doi/10.1177/1541931213601562
Credibility Rating
4/5
High (4). High quality: established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: SAGE Journals
A human factors conference paper relevant to AI safety discussions around trust calibration, human-automation teaming, and the risks of over-reliance on autonomous systems in high-stakes scenarios.
Metadata
Importance: 38/100 · journal article · primary source
Summary
This paper investigates how risky or high-stakes conditions affect human trust in autonomous systems, exploring the relationship between perceived risk, system reliability, and operator willingness to rely on automation. It contributes to understanding how trust calibration in autonomous systems varies with environmental and situational risk factors.
Key Points
- Risky conditions can significantly alter how much trust operators place in autonomous systems, sometimes leading to over- or under-reliance.
- Trust calibration—matching trust levels to actual system reliability—is a key challenge in human-automation interaction.
- High-risk environments may cause operators to default to automation even when manual control would be more appropriate.
- The study has implications for designing autonomous systems that communicate reliability and uncertainty to human operators.
- Findings are relevant to safety-critical domains where misplaced trust in automation can lead to accidents or mission failures.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| AI-Induced Expertise Atrophy | Risk | 65.0 |
Cached Content Preview
HTTP 200 · Fetched Mar 15, 2026 · 33 KB
[Proceedings of the Human Factors and Ergonomics Society Annual Meeting](https://journals.sagepub.com/home/PRO)
## Abstract
In order to utilize the full range of benefits of autonomous systems, an understanding of how operators trust an automated system is vital. The level of risk in an environment is an important factor that many have suggested affects trust in automated and autonomous systems, but it has not been studied extensively. This study aims to explore the effect differing levels of risk can have on trust in an autonomous system with individuals that vary in their own risk profile. Using a UAV management task, participants worked with an autonomous teammate to protect an area from incoming enemies. Risk was assessed by how much money a participant stood to lose by not protecting the safe zone. Results partially support the hypothesis that trust decreases with increased risk, but results vary regarding behavioral trust and subjective trust. Overall, this experiment provides evidence that risk is an important situational factor that affects trust in automation.
... (truncated, 33 KB total)
Resource ID: 14ac1982ca58bfa9 | Stable ID: MThkYjU0Mm