GUIDELINE

AI Literacy — Article 4 of the EU AI Act in Detail

Detailed analysis of Article 4 of the EU AI Act on AI literacy — who must have a programme, what it means in practice, best practices for implementation, and programme examples.

What is the AI Literacy obligation?

Article 4 of the EU AI Act introduces an AI literacy obligation for all organisations that develop or use AI systems. This obligation has been in force since 2 February 2025 — it was one of the first provisions to come into effect.

Unlike other EU AI Act obligations that apply only to certain risk levels, the AI literacy obligation is universal — it applies to everyone, regardless of whether your AI systems are classified as high-risk or not.

Text of Article 4

Providers and deployers of AI systems shall take measures to ensure, to their best extent, a sufficient level of AI literacy of their staff and other persons dealing with the operation and use of AI systems on their behalf, taking into account their technical knowledge, experience, education and training and the context in which the AI systems are to be used, and considering the persons or groups of persons on whom the AI systems are to be used.

Who must have a programme

Obligated parties

Category | Example | Obligation
Providers | Company developing an AI chatbot | Yes — for the development team and everyone working with AI
Deployers | Bank using AI for credit scoring | Yes — for everyone using or overseeing the AI system
Importers and distributors | Company reselling an AI tool | Yes — for staff involved in the supply chain
SMEs and micro-enterprises | Startup with 5 employees | Yes — but with proportionate resources
Public bodies | Ministry, agency | Yes — especially for AI-assisted decision-making

Who to include

The AI literacy programme must cover:

  1. Staff developing AI systems — engineers, data scientists
  2. Staff using AI systems — operators, tool users
  3. Staff overseeing AI systems — managers, compliance officers
  4. Staff making decisions based on AI outputs — HR, judges, physicians

What "AI literacy" means in practice

The EU AI Act does not prescribe a formal curriculum or certification. Instead, it requires a "sufficient level of AI literacy" adapted to the context. This means understanding:

Basic (all employees)

  • What AI is and how it works at a high level
  • Which AI tools your organisation uses
  • Limitations of AI systems (bias, hallucinations, reliability)
  • When and how to escalate an AI-related issue

Advanced (operators and supervisors)

  • How to interpret AI outputs
  • How to identify incorrect or biased results
  • Human oversight procedures
  • Documentation obligations under the EU AI Act

Specialised (developers and data scientists)

  • Technical requirements of the EU AI Act for the relevant risk level
  • Data governance principles (Art. 10)
  • Testing and validation of AI systems
  • Model transparency and explainability

Best practices for implementation

1. Needs Assessment (Gap Analysis)

Before creating a programme, assess:

  • Which AI systems does your organisation use?
  • Who uses those systems and in what context?
  • What is the current level of AI knowledge among your employees?
  • What are the specific risks in your industry?
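The questions above can be captured as a simple inventory, one entry per AI system, so gaps are visible at a glance. This is a minimal sketch; the structure, field names, and example values are our own illustration, not anything prescribed by the EU AI Act.

```python
from dataclasses import dataclass, field

# Illustrative gap-analysis entry for one AI system.
# Field names are our own convention, not a regulatory requirement.
@dataclass
class AISystemEntry:
    name: str
    users: list                      # roles that use or oversee the system
    context: str                     # where and how the system is used
    current_knowledge: str           # e.g. "none", "basic", "advanced"
    industry_risks: list = field(default_factory=list)

inventory = [
    AISystemEntry(
        name="AI credit scoring",
        users=["loan officers", "compliance"],
        context="consumer lending decisions",
        current_knowledge="basic",
        industry_risks=["bias in scoring", "explainability"],
    ),
]

# Flag systems whose users have not yet reached the target knowledge level.
gaps = [e.name for e in inventory if e.current_knowledge != "advanced"]
print(gaps)  # -> ['AI credit scoring']
```

Even a spreadsheet with these columns serves the same purpose; the point is to know, per system, who uses it and what training they still need.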

2. Layered approach

Create a programme in layers:

Layer 1: General AI literacy          -> All employees (1-2 hours)
Layer 2: AI in your organisation      -> AI tool users (half day)
Layer 3: Compliance and oversight     -> Managers, COs (1 day)
Layer 4: Technical AI literacy        -> IT/Dev team (2-3 days)
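The layered model above amounts to a role-to-layer mapping: everyone completes Layer 1, and additional layers depend on the role. A minimal sketch, with the role names chosen for illustration:

```python
# Layer names and durations mirror the table above; role sets are examples.
LAYERS = {
    1: ("General AI literacy", "1-2 hours", {"all employees"}),
    2: ("AI in your organisation", "half day", {"AI tool users"}),
    3: ("Compliance and oversight", "1 day", {"managers", "compliance officers"}),
    4: ("Technical AI literacy", "2-3 days", {"IT/Dev team"}),
}

def layers_for(role: str) -> list:
    """Return the training layers a given role should complete.
    Layer 1 applies to everyone; further layers where the role matches."""
    result = [1]
    for layer, (_title, _duration, roles) in LAYERS.items():
        if layer != 1 and role in roles:
            result.append(layer)
    return result

print(layers_for("managers"))  # -> [1, 3]
```

In practice this mapping usually lives in the HR or LMS system, but keeping it explicit makes the proportionality of the programme easy to demonstrate.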

3. Practical format

Effective programmes include:

  • Hands-on workshops with the tools employees actually use
  • Case studies from your industry
  • Simulations of decision-making with AI outputs
  • Regular refreshers (quarterly or after significant changes)

4. Documentation

Document your programme for compliance purposes:

  • List of participants and training dates
  • Programme content and materials
  • Evaluation results (tests, feedback)
  • Regular refresh schedule
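A training log covering the points above can be as simple as a CSV export. The column names below are a suggestion only; the Act does not prescribe a record format.

```python
import csv
import io
from datetime import date

# Illustrative training record for compliance documentation.
FIELDS = ["participant", "session", "date", "result", "next_refresh"]

records = [
    {
        "participant": "J. Smith",
        "session": "Layer 1: General AI literacy",
        "date": date(2025, 3, 10).isoformat(),
        "result": "passed",
        # Quarterly refresher, per the refresh schedule above.
        "next_refresh": date(2025, 6, 10).isoformat(),
    },
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(records)
print(buf.getvalue())
```

Whatever the format, the record should let you show, per participant, what was delivered, when, and when it will be refreshed.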

Programme examples

Example 1: Small marketing agency (10 employees)

  • Tools: ChatGPT, Midjourney, AI copywriting
  • Programme: 2-hour workshop for everyone + policy document
  • Content: When to use AI and when not to; verifying AI output; IP issues; transparency towards clients
  • Cost: Minimal (internal resources)

Example 2: Mid-sized financial institution (200 employees)

  • Tools: AI credit scoring, fraud detection, chatbot
  • Programme: E-learning platform + quarterly workshops
  • Content: General module (all) + specialised modules per department + compliance module for management
  • Cost: EUR 5,000-15,000 annually

Example 3: Large telecom operator (2,000+ employees)

  • Tools: Numerous AI systems across the entire organisation
  • Programme: Formal AI academy with certification
  • Content: 4 certification levels, mentoring programme, red team exercises
  • Cost: EUR 50,000-100,000 annually

How ComplianceForge helps

ComplianceForge automatically generates:

  1. AI Literacy Programme (DOC-07) — Tailored to your specific AI systems and industry
  2. Training recommendations — In your action plan, with concrete steps
  3. Score metric — AI Literacy dimension in your compliance score

Use the free AI Literacy Check on our homepage for a quick assessment of your organisation's current status.

Omnibus VII clarifications

Omnibus VII (March 2026) clarifies Art. 4:

  • "Sufficient measures" refer to the context of use, not to a formal programme
  • SMEs may use free AI Office resources instead of expensive proprietary programmes
  • Emphasis on proportionality — the larger the organisation, the more robust the programme expected

Related articles

  • Omnibus VII — What Changed — Context for Art. 4 clarifications
  • Guidelines on High-Risk AI Systems — Additional obligations for high-risk

Sources

  • AI Act Art. 4 — AI Literacy
  • AI Office — AI Literacy Resources
  • ISO/IEC 42001 — AI Management System — Annex B covers competence and awareness
