What Is the EU AI Act and Why Does It Matter for Your Business?

A business guide to the EU AI Act — what it regulates, who it affects, and what the deadlines are.

3 min read
ComplianceForge AI

The EU AI Act (Regulation (EU) 2024/1689) is the world's first comprehensive law regulating artificial intelligence. It entered into force in August 2024, with compliance deadlines phased in over time: some obligations have been active since February 2025, while the deadlines for high-risk systems were pushed back to December 2027 (Omnibus VII, March 2026).

Who Does It Affect?

Every company in the EU that uses AI tools is in scope, not just those that develop them. If your employees use ChatGPT, Copilot, AI features in your CRM, or any other AI tool, the EU AI Act applies to you.

The regulation distinguishes two key roles:

  • Provider — the company that develops the AI system
  • Deployer — the company that uses the AI system in its operations

Most companies are deployers and have specific obligations depending on the risk category of the AI systems they use.

Four Risk Categories

The EU AI Act classifies AI systems into four categories:

1. Prohibited Practices (Article 5)

Social scoring, manipulative AI, mass collection of biometric data. These practices have been banned since February 2025.

2. High Risk (Annex III)

AI systems affecting employment, credit decisions, education, biometrics, critical infrastructure. They require a risk management system, technical documentation, human oversight, and transparency.

3. Limited Risk (Article 50)

Chatbots, content generators, deepfake tools. They must inform users that they are interacting with AI.

4. Minimal Risk

Writing marketing content, translation, code generation. No specific obligations, but a voluntary code of conduct is recommended.
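For teams building an internal AI inventory, the four tiers can be captured as a simple lookup table. This is a hypothetical sketch: the tier names and one-line obligation summaries below paraphrase the Act for illustration and are not official identifiers.

```python
# Hypothetical sketch: the EU AI Act's four risk tiers as a lookup table.
# Tier names and obligation summaries paraphrase the Act; they are not
# official identifiers.
RISK_TIERS = {
    "prohibited": {
        "basis": "Article 5",
        "examples": ["social scoring", "manipulative AI"],
        "obligation": "Do not use; banned since February 2025",
    },
    "high": {
        "basis": "Annex III",
        "examples": ["employment screening", "credit decisions"],
        "obligation": "Risk management, documentation, human oversight",
    },
    "limited": {
        "basis": "Article 50",
        "examples": ["chatbots", "deepfake tools"],
        "obligation": "Inform users they are interacting with AI",
    },
    "minimal": {
        "basis": "no specific article",
        "examples": ["marketing copy", "translation", "code generation"],
        "obligation": "None required; voluntary code of conduct recommended",
    },
}

def obligation_for(tier: str) -> str:
    """Return the summarized obligation for a given risk tier."""
    return RISK_TIERS[tier]["obligation"]

print(obligation_for("limited"))
```

A structure like this makes the later classification step a matter of assigning each tool a tier and reading off its obligations.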

Key Deadlines

Date            What Takes Effect
February 2025   Ban on prohibited AI practices + AI literacy obligation (ALREADY ACTIVE)
August 2025     Obligations for GPAI models (ALREADY ACTIVE)
November 2026   Transparency obligations (Art. 50): watermarking, content labeling
December 2027   High-risk AI systems, standalone (Annex III)
August 2028     High-risk AI in regulated products (Annex I, embedded)

What Do You Need to Do?

  1. Identify all AI tools you use
  2. Classify each tool according to EU AI Act risk categories
  3. Document the risk assessment for high-risk systems
  4. Implement the necessary controls (human oversight, transparency)
  5. Prepare compliance documentation for auditors
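Steps 1-3 above amount to building an inventory, tagging each entry with a risk tier, and flagging high-risk entries for a documented assessment. A minimal sketch, with hypothetical tool names and tier assignments chosen purely for illustration:

```python
# Hypothetical sketch of steps 1-3: inventory AI tools, classify each
# by risk tier, and flag high-risk entries for a documented risk
# assessment. Tool names and tiers are illustrative, not real data.
from dataclasses import dataclass

@dataclass
class AITool:
    name: str
    use_case: str
    risk_tier: str  # "prohibited" | "high" | "limited" | "minimal"

# Step 1: identify all AI tools in use.
inventory = [
    AITool("ChatGPT", "drafting marketing copy", "minimal"),
    AITool("CV-screening model", "ranking job applicants", "high"),
    AITool("Website chatbot", "customer support", "limited"),
]

# Steps 2-3: high-risk systems need a documented risk assessment.
needs_assessment = [t.name for t in inventory if t.risk_tier == "high"]
print(needs_assessment)  # -> ['CV-screening model']
```

In practice the inventory would live in a spreadsheet or compliance tool rather than code, but the logic is the same: every tool gets a tier, and the tier drives the follow-up work.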

ComplianceForge AI automates steps 1-5 through an intelligent wizard that guides you through the entire process.

Want to know your compliance status?

Free questionnaire, AI classification, and compliance score in 30 minutes.