Longterm Wiki

PMC Academic - Obligations to assess: Recent trends in AI accountability regulations

paper

Authors

Serena Oduro · Emanuel Moss · Jacob Metcalf

Credibility Rating

4/5
High (4)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: PubMed Central

This peer-reviewed journal article analyzes emerging AI accountability regulations and their shift toward impact assessments and governance documentation, directly addressing regulatory approaches to managing risks from automated decision systems.

Paper Details

Citations: 15
Year: 2022
Methodology: peer-reviewed
Categories: Patterns

Metadata

journal article · analysis

Summary

This paper examines recent trends in AI accountability regulations that require developers to conduct impact assessments of automated decision systems across social, economic, and ethical dimensions. Analyzing four legislative examples from the US and EU, the authors demonstrate how regulations are shifting beyond technical assessments toward accountability documentation as a governance mechanism. The paper identifies three core concerns these regulations address: identifying and documenting harms, ensuring public transparency, and enforcing anti-discrimination rules. The authors provide insights for system designers on preparing for and complying with emerging regulatory requirements.

Cited by 1 page

Page | Type | Quality
US State AI Legislation Landscape | Analysis | 70.0

Cached Content Preview

HTTP 200 · Fetched Mar 15, 2026 · 43 KB
Obligations to assess: Recent trends in AI accountability regulations - PMC
Patterns (N Y). 2022 Nov 11;3(11):100608. doi: 10.1016/j.patter.2022.100608

Obligations to assess: Recent trends in AI accountability regulations

Serena Oduro 1, Emanuel Moss 2, Jacob Metcalf 1,∗

1 Data & Society Research Institute, New York, NY 10011, USA
2 Intel Labs, Hillsboro, OR 97124, USA
∗ Corresponding author: jake.metcalf@datasociety.net

Collection date: 2022 Nov 11.

© 2022 The Authors. This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/).

PMCID: PMC9676559 | PMID: 36419454
 
See editorial "Responsible and accountable data science", 100629.

Summary

Policymakers are increasingly turning toward assessments of social, economic, and ethical impacts as a governance model for automated decision systems in sensitive or regulated domains. In both the United States and the European Union, recently proposed legislation would require developers to assess the impacts of their systems for individuals, communities, and society, a notable step beyond the technical assessments that are familiar to the industry. This paper analyzes four examples of such legislation in order to illustrate how AI regulations are moving toward using accountability documentation to address common AI acco

... (truncated, 43 KB total)
Resource ID: f4ba840569bf2bb5 | Stable ID: ZGI1NDhjOT