EU AI Act Standardisation
webCredibility Rating
High quality. Established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: European Union
Official EU Commission page on AI Act standardisation efforts; essential reference for understanding how the EU's binding AI regulation translates into technical compliance standards via CEN/CENELEC, relevant to anyone tracking AI governance and global regulatory benchmarking.
Metadata
Summary
This European Commission page explains how harmonised technical standards are being developed under the EU AI Act to translate legal requirements into common technical language. CEN and CENELEC, working through Joint Technical Committee JTC 21, are developing standards across ten key areas including risk management, transparency, and cybersecurity. Compliance with published harmonised standards creates a legal presumption of conformity with the AI Act.
Key Points
- CEN/CENELEC's JTC 21 is developing harmonised standards in 10 areas: risk management, governance and quality of datasets, record keeping, transparency, human oversight, accuracy, robustness, cybersecurity, quality management, and conformity assessment.
- Harmonised standards referenced in the EU Official Journal provide legal certainty; companies applying them are presumed compliant with AI Act requirements.
- prEN 18286 (AI Quality Management System) became the first AI harmonised standard to enter public enquiry on 30 October 2025, targeting Article 17 compliance.
- Standards primarily focus on 'high-risk' AI systems affecting safety, health, and fundamental rights in domains like critical infrastructure and law enforcement.
- European harmonised standards are positioned to become de facto global benchmarks, particularly for risk and quality management methodologies.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| AI Standards Development | Concept | 69.0 |
Cached Content Preview
# Standardisation of the AI Act
Harmonised standards will offer legal certainty under the AI Act, support innovation, and position the EU to set global benchmarks for trustworthy AI.
Ensuring an effective and clear implementation of the [AI Act](https://digital-strategy.ec.europa.eu/en/policies/regulatory-framework-ai) is a priority for the Commission. This act regulates ‘high-risk’ AI systems that impact safety, health, and fundamental rights, for example in critical infrastructure and law enforcement (see [article 6 and annex III of the AI Act](https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A32024R1689)). These requirements need to be fulfilled before placement on the market, ensuring high-risk AI systems are monitored throughout their lifecycle.
Standards translate legal requirements into common technical language, simplifying compliance for companies and other stakeholders.
**European harmonised standards serve several crucial functions:**
- **Legal certainty and reduced compliance costs**: European harmonised standards provide a clear pathway to compliance for businesses of all sizes
- **Market benchmarking**: European harmonised standards often become de facto global benchmarks. For example, standards currently under development focused on setting methodologies for risk management and quality management are strong candidates to become market benchmarks in the future.
- **Innovation and competitiveness**: European harmonised standards foster trust and market acceptance, enabling developers who adopt them to compete on a global scale while ensuring their solutions meet the highest safety standards.
## How are standards developed and what are the current standards?
The European Committee for Standardisation (CEN) and the European Committee for Electrotechnical Standardisation (CENELEC) are European Standardisation Organisations. Working groups in these two organisations are actively developing harmonised standards for high-risk AI systems. They work together in a [Joint Technical Committee called JTC 21](https://jtc21.eu/).
The [European Commission has requested that CEN and CENELEC develop standards in ten key areas](https://ec.europa.eu/transparency/documents-register/detail?ref=C(2025)3871&lang=en):
- risk management
- governance and quality of datasets
- record keeping
- transparency
- human oversight
- accuracy
- robustness
- cybersecurity
- quality management
- conformity assessment
Once harmonised standards are published by CEN and CENELEC, the Commission assesses whether they meet the intended objectives and legal requirements of the AI Act. After this final step, the standards are referenced in the [Official Journal of the EU.](https://eur-lex.europa.eu/)
The application of standards remains voluntary. Providers can choose any other framework to demonstrate their compliance with the
... (truncated, 9 KB total)