ISO/IEC 42001 — Artificial Intelligence Management System
ISO/IEC 42001:2023 is the first international standard for artificial intelligence management systems (AIMS), helping organisations establish, implement, maintain, and continually improve a responsible approach to AI systems.
What is this document?
ISO/IEC 42001:2023 (Artificial Intelligence Management System — AIMS) is the first international standard that specifies requirements for establishing, implementing, maintaining, and continuously improving an artificial intelligence management system within organisations. It was published on 18 December 2023 by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).
The standard was developed within ISO/IEC JTC 1/SC 42 — Artificial Intelligence and is applicable to organisations of all sizes and industries that develop, produce, or use AI systems.
Key points
Standard structure
ISO/IEC 42001 follows the management system structure known as Annex SL (Harmonized Structure), making it compatible with other ISO standards such as ISO 9001, ISO 27001, and ISO 14001. It is based on the PDCA (Plan-Do-Check-Act) approach:
- Plan — Understanding the organisational context and interested parties, defining the scope, and establishing the AI policy
- Do — Implementing processes, managing risks, allocating resources
- Check — Monitoring, measuring, internal audits, and management review
- Act — Continuous improvement based on findings
Key requirements
The standard covers the following areas:
- Context of the organisation — Understanding internal and external factors affecting AI systems
- Leadership — Leadership commitment, AI policy, roles and responsibilities
- Planning — Risk and opportunity management, AI system objectives
- Support — Resources, competencies, awareness, communication, and documentation
- Operation — AI system impact assessment, lifecycle management, third parties
- Performance evaluation — Monitoring, measurement, analysis, internal audits
- Improvement — Nonconformities, corrective actions, continuous improvement
AI system impact assessment
One of the key elements of the standard is the requirement for an AI system impact assessment, which covers:
- Impact on individuals and groups (including fundamental rights)
- Ethical considerations (fairness, transparency, accountability)
- Social and environmental impact
- AI-specific risks (bias, explainability, robustness)
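As an illustration only (the standard prescribes requirements, not a data format), the assessment dimensions listed above could be captured in a simple record structure. All class and field names here are hypothetical, not taken from ISO/IEC 42001:

```python
from dataclasses import dataclass

@dataclass
class AIImpactAssessment:
    """Hypothetical record mirroring the assessment dimensions above."""
    system_name: str
    affected_groups: list[str]              # individuals and groups, incl. fundamental rights
    ethical_considerations: list[str]       # fairness, transparency, accountability
    societal_environmental_impact: str      # social and environmental impact
    ai_specific_risks: list[str]            # bias, explainability, robustness

    def is_complete(self) -> bool:
        # Minimal completeness check: every dimension must be documented.
        return all([
            self.affected_groups,
            self.ethical_considerations,
            self.societal_environmental_impact,
            self.ai_specific_risks,
        ])

assessment = AIImpactAssessment(
    system_name="CV screening model",
    affected_groups=["job applicants"],
    ethical_considerations=["fairness", "transparency"],
    societal_environmental_impact="Potential hiring bias affecting protected groups",
    ai_specific_risks=["bias", "limited explainability"],
)
print(assessment.is_complete())  # True
```

In practice an organisation would extend such a record with review dates, owners, and links to the controls chosen from Annex A.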
Standard annexes
The standard includes informative annexes that provide practical guidance:
- Annex A — AI management system controls
- Annex B — Implementation guidance for controls
- Annex C — Potential objectives and sources of risk related to AI
- Annex D — Use of AI management systems across different domains
How does it apply to organisations?
Relevance to the EU AI Act
Although ISO/IEC 42001 is not a harmonised standard under the EU AI Act, it provides a structured framework that helps organisations meet many of the Regulation's requirements:
- Quality management system (Art. 17) — ISO 42001 directly supports establishing a system that satisfies AI Act requirements
- Risk management (Art. 9) — AIMS includes systematic AI system risk management
- Documentation (Art. 11) — The standard requires extensive process and system documentation
- Post-market monitoring (Art. 72) — The PDCA cycle supports continuous monitoring
Certification
Organisations can obtain ISO/IEC 42001 certification from an independent Conformity Assessment Body (CAB). Certification demonstrates that the organisation's AI management system has been verified against internationally recognised standards.
Practical implementation steps
- Gap analysis — Assess the current state of AI system management within the organisation
- Define scope — Determine which AI systems and processes the AIMS covers
- Risk assessment — Identify and assess risks associated with your AI systems
- Implement controls — Apply controls from Annex A tailored to your context
- Documentation — Develop the required documentation (policies, procedures, records)
- Internal audits — Conduct internal audits to verify effectiveness
- Certification — Optionally, seek external certification
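The risk-assessment and control-selection steps above can be sketched as a minimal risk-register pass. The risk names, the 1-5 likelihood/impact scoring, and the threshold are illustrative assumptions, not values from the standard:

```python
# Hypothetical AI risk register illustrating the "risk assessment" and
# "implement controls" steps; scoring scale and threshold are assumptions.
risks = [
    {"risk": "Training-data bias", "likelihood": 4, "impact": 5},
    {"risk": "Model drift in production", "likelihood": 3, "impact": 3},
    {"risk": "Undocumented third-party model", "likelihood": 2, "impact": 4},
]

THRESHOLD = 12  # risks scoring at or above this need a documented control

for r in risks:
    r["score"] = r["likelihood"] * r["impact"]
    r["needs_control"] = r["score"] >= THRESHOLD

# Report, highest score first, flagging where a control must be applied.
for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    flag = "CONTROL REQUIRED" if r["needs_control"] else "monitor"
    print(f'{r["risk"]:35} score={r["score"]:2}  {flag}')
```

The output of such a pass would then feed the documentation and internal-audit steps, with each flagged risk mapped to a control from Annex A.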
Relevant EU AI Act articles
| Article | Connection to ISO 42001 |
|---|---|
| Art. 9 | Risk management -> AIMS risk assessment |
| Art. 10 | Data governance -> Data lifecycle controls |
| Art. 11 | Technical documentation -> Documentation requirements |
| Art. 17 | Quality management system -> AIMS framework |
| Art. 72 | Post-market monitoring -> PDCA continuous improvement |
Source document
Official standard page: ISO/IEC 42001:2023 — AI management systems
Standard explanation by ISO: ISO 42001 explained