ComplianceForge AI

ISO/IEC 42001 — Artificial Intelligence Management System

ISO/IEC 42001:2023 is the first international standard for artificial intelligence management systems (AIMS), helping organisations establish, implement, and continuously improve a responsible approach to AI systems.

What is this document?

ISO/IEC 42001:2023 (Artificial Intelligence Management System — AIMS) is the first international standard that specifies requirements for establishing, implementing, maintaining, and continuously improving an artificial intelligence management system within organisations. It was published on 18 December 2023 by the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC).

The standard was developed within ISO/IEC JTC 1/SC 42 — Artificial Intelligence and is applicable to organisations of all sizes and industries that develop, produce, or use AI systems.

Key points

Standard structure

ISO/IEC 42001 follows the management system structure known as Annex SL (Harmonized Structure), making it compatible with other ISO standards such as ISO 9001, ISO 27001, and ISO 14001. It is based on the PDCA (Plan-Do-Check-Act) approach:

  • Plan — Understanding the organisational context and interested parties, defining the scope and AI policy
  • Do — Implementing processes, managing risks, allocating resources
  • Check — Monitoring, measuring, internal audits, and management review
  • Act — Continuous improvement based on findings

Key requirements

The standard covers the following areas:

  1. Context of the organisation — Understanding internal and external factors affecting AI systems
  2. Leadership — Leadership commitment, AI policy, roles and responsibilities
  3. Planning — Risk and opportunity management, AI system objectives
  4. Support — Resources, competencies, awareness, communication, and documentation
  5. Operation — AI system impact assessment, lifecycle management, third parties
  6. Performance evaluation — Monitoring, measurement, analysis, internal audits
  7. Improvement — Nonconformities, corrective actions, continuous improvement

AI system impact assessment

One of the key elements of the standard is the requirement for an AI system impact assessment, which covers:

  • Impact on individuals and groups (including fundamental rights)
  • Ethical considerations (fairness, transparency, accountability)
  • Social and environmental impact
  • AI-specific risks (bias, explainability, robustness)
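An impact assessment along the dimensions above can be captured as a structured record. Below is a minimal Python sketch; the standard does not prescribe any schema, so the class and field names here are illustrative assumptions:

```python
from dataclasses import dataclass

@dataclass
class AIImpactAssessment:
    """Hypothetical impact-assessment record covering the four dimensions
    listed above. Field names are assumptions, not ISO/IEC 42001 terms."""
    system_name: str
    affected_groups: list[str]         # individuals and groups, incl. fundamental rights
    ethical_findings: dict[str, str]   # fairness, transparency, accountability
    societal_environmental: list[str]  # social and environmental impact
    ai_specific_risks: list[str]       # bias, explainability, robustness

    def is_complete(self) -> bool:
        # A simple completeness check: every dimension has at least one entry.
        return all([self.affected_groups, self.ethical_findings,
                    self.societal_environmental, self.ai_specific_risks])

assessment = AIImpactAssessment(
    system_name="cv-screening-model",
    affected_groups=["job applicants"],
    ethical_findings={"fairness": "disparate impact tested on protected attributes"},
    societal_environmental=["labour-market effects"],
    ai_specific_risks=["selection bias in training data"],
)
print(assessment.is_complete())  # True
```

A record like this can feed directly into the documentation requirements discussed below, since the standard expects impact assessments to be documented and kept up to date.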

Standard annexes

The standard includes informative annexes that provide practical guidance:

  • Annex A — AI management system controls
  • Annex B — Implementation guidance for controls
  • Annex C — Potential objectives and sources of risk related to AI
  • Annex D — Use of AI management systems across different domains

How does it apply to organisations?

Relevance to the EU AI Act

Although ISO/IEC 42001 is not a harmonised standard under the EU AI Act, it provides a structured framework that helps organisations meet many of the Regulation's requirements:

  • Quality management system (Art. 17) — ISO 42001 supports establishing a management system that addresses many of the AI Act's QMS requirements
  • Risk management (Art. 9) — AIMS includes systematic AI system risk management
  • Documentation (Art. 11) — The standard requires extensive process and system documentation
  • Post-market monitoring (Art. 72) — The PDCA cycle supports continuous monitoring and improvement

Certification

Organisations can obtain ISO/IEC 42001 certification from an independent Conformity Assessment Body (CAB). Certification demonstrates that the organisation's AI management system has been verified against internationally recognised standards.

Practical implementation steps

  1. Gap analysis — Assess the current state of AI system management within the organisation
  2. Define scope — Determine which AI systems and processes the AIMS covers
  3. Risk assessment — Identify and assess risks associated with your AI systems
  4. Implement controls — Apply controls from Annex A tailored to your context
  5. Documentation — Develop the required documentation (policies, procedures, records)
  6. Internal audits — Conduct internal audits to verify effectiveness
  7. Certification — Optionally, seek external certification
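The first step, the gap analysis, amounts to checking coverage of the seven requirement areas from clauses 4-10 listed earlier. A minimal Python sketch of such a tracker; the area names come from the standard, while the boolean status map and scoring are assumptions:

```python
# The seven AIMS requirement areas (clauses 4-10 of ISO/IEC 42001).
AIMS_AREAS = [
    "Context of the organisation",
    "Leadership",
    "Planning",
    "Support",
    "Operation",
    "Performance evaluation",
    "Improvement",
]

def gap_report(status: dict[str, bool]) -> list[str]:
    """Return the areas not yet addressed; areas missing from the
    status map are conservatively treated as gaps."""
    return [area for area in AIMS_AREAS if not status.get(area, False)]

# Example: an organisation that has so far tackled leadership and support.
current = {"Leadership": True, "Support": True}
print(gap_report(current))
```

In practice a gap analysis records evidence and maturity per area rather than a simple yes/no, but the structure is the same: enumerate the clauses, assess each, and prioritise the gaps.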

Relevant EU AI Act articles

Article    Connection to ISO 42001
Art. 9     Risk management -> AIMS risk assessment
Art. 10    Data governance -> Data lifecycle controls
Art. 11    Technical documentation -> Documentation requirements
Art. 17    Quality management system -> AIMS framework
Art. 72    Post-market monitoring -> PDCA continuous improvement
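For tooling that cross-references the two frameworks, the mapping above can be expressed as a small lookup table. The data mirrors the table; the dict name and fallback string are assumptions for this sketch:

```python
# Article-to-AIMS mapping, taken from the table above.
AI_ACT_TO_ISO42001 = {
    "Art. 9":  "Risk management -> AIMS risk assessment",
    "Art. 10": "Data governance -> Data lifecycle controls",
    "Art. 11": "Technical documentation -> Documentation requirements",
    "Art. 17": "Quality management system -> AIMS framework",
    "Art. 72": "Post-market monitoring -> PDCA continuous improvement",
}

def iso_support_for(article: str) -> str:
    """Return how ISO 42001 supports a given AI Act article, if listed."""
    return AI_ACT_TO_ISO42001.get(article, "No direct mapping in this overview")

print(iso_support_for("Art. 17"))  # Quality management system -> AIMS framework
```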

Source document

Official standard page: ISO/IEC 42001:2023 — AI management systems

Standard explanation by ISO: ISO 42001 explained
