In March 2026, at least 6 new open-source tools related to the EU AI Act appeared on Hacker News and Reddit. Article 12 Logging, EU AI Act Scanner, Vectimus, G0, OpenObscure, Claude Code Scanner — each solves a piece of the compliance puzzle.
That's great news. It means the ecosystem is growing and there are options for every budget and technical need. But it also means it's harder than ever to understand which tool you actually need.
This is an honest review — no "our tool is better." Different tools solve different problems, for different people.
Open-source tools: overview
1. Article 12 Logging Infrastructure
Logging infrastructure for fulfilling Article 12 requirements of the EU AI Act — automatic recording of AI system operations. It received 42 upvotes on Hacker News with comments like "This is a genuinely useful piece of infrastructure."
For: Dev teams building AI systems who need compliance logging from day one.
Strengths: Integrates into existing code, open-source, focused on one specific legal requirement.
Limitations: Covers only logging, doesn't generate documentation or assess risk. One HN commenter aptly noted: "Anyone can generate an alternative chain of sha256 hashes" — logging alone doesn't prove integrity without additional mechanisms.
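To see why that comment matters, here is a minimal sketch (in Python, and not the project's actual code) of hash-chained logging: each entry commits to the hash of the previous one, so verification catches in-place tampering. But nothing stops someone with write access from regenerating the entire chain, which is exactly the commenter's point.

```python
import hashlib
import json
import time

GENESIS = "0" * 64  # placeholder hash for the first entry

def append_event(log: list[dict], event: dict) -> list[dict]:
    """Append an event, chaining it to the previous entry's hash."""
    prev_hash = log[-1]["hash"] if log else GENESIS
    entry = {"ts": time.time(), "event": event, "prev": prev_hash}
    payload = json.dumps(
        {"ts": entry["ts"], "event": entry["event"], "prev": entry["prev"]},
        sort_keys=True,
    ).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    log.append(entry)
    return log

def verify(log: list[dict]) -> bool:
    """Recompute the chain; any tampered entry breaks every hash after it."""
    prev = GENESIS
    for entry in log:
        payload = json.dumps(
            {"ts": entry["ts"], "event": entry["event"], "prev": prev},
            sort_keys=True,
        ).encode()
        if entry["prev"] != prev or entry["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        prev = entry["hash"]
    return True
```

Chaining alone proves internal consistency, not authenticity: to make the log trustworthy you would also anchor the latest hash somewhere external (a signed timestamp, a third-party service), which is the "additional mechanisms" part.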
2. EU AI Act Scanner
A Python tool that scans projects for compliance signals — looks for AI model usage, training data, risk indicators in code.
For: Developers who want a quick audit of an existing project.
Strengths: Easy to run, good for initial assessment.
Limitations: One HN commenter warned: "Mind you, that EU AI Act Scanner is quite naive — it doesn't adhere to the law by pushing code into a cloud scanner." Scanning code doesn't replace legal risk analysis.
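For a sense of what "scanning for compliance signals" means in practice, here is a deliberately naive sketch (signal names and regex patterns are invented for illustration, not taken from the actual tool) that greps Python files for keyword hits. It shows both the appeal (fast, fully local) and the limitation: pattern matches are signals, not legal conclusions.

```python
import re
from pathlib import Path

# Hypothetical signal patterns; a real scanner would use far more,
# and likely parse ASTs rather than grep raw text.
SIGNALS = {
    "llm_api": re.compile(r"\b(openai|anthropic|transformers)\b"),
    "training": re.compile(r"\b(model\.fit|Trainer|train_loop)\b"),
    "biometrics": re.compile(r"\b(face_recognition|fingerprint)\b"),
}

def scan(root: str) -> dict[str, list[str]]:
    """Return {signal_name: [matching files]} for Python files under root."""
    hits: dict[str, list[str]] = {name: [] for name in SIGNALS}
    for path in Path(root).rglob("*.py"):
        text = path.read_text(errors="ignore")
        for name, pattern in SIGNALS.items():
            if pattern.search(text):
                hits[name].append(str(path))
    return hits
```

Note that everything stays on your machine, which addresses the cloud-scanner concern, but a hit on `openai` tells you nothing about risk category or legal obligations by itself.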
3. Vectimus
Cedar policy enforcement for AI agents — defines rules for what an agent is allowed to do using the Cedar policy language.
For: Teams working with AI agents who need granular access controls.
Strengths: Cedar is proven (Amazon uses it), policy-as-code approach.
Limitations: Specific to agent-based systems, doesn't cover broader compliance documentation.
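Cedar policies are written in Cedar's own language, but the underlying policy-as-code idea — default-deny, with explicit allow rules evaluated against (principal, action, resource) — can be sketched in plain Python (all names here are illustrative, not Vectimus's API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rule:
    principal: str  # e.g. "agent:support-bot"
    action: str     # e.g. "read"
    resource: str   # e.g. "db:tickets"

# Default-deny: an agent may act only if an explicit rule permits it.
POLICIES = {
    Rule("agent:support-bot", "read", "db:tickets"),
    Rule("agent:support-bot", "write", "db:ticket-replies"),
}

def is_allowed(principal: str, action: str, resource: str) -> bool:
    return Rule(principal, action, resource) in POLICIES

# is_allowed("agent:support-bot", "read", "db:tickets")   -> True
# is_allowed("agent:support-bot", "delete", "db:tickets") -> False
```

The value of the real thing over a sketch like this is that Cedar policies are declarative, auditable files that can be analyzed and reviewed independently of the application code.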
4. G0
A "control layer" for AI agents — scan, test, monitor, comply. Promises a comprehensive approach to AI system control.
For: Teams that need runtime monitoring and testing of AI agents.
Strengths: Covers multiple aspects (scan + test + monitor), not just one.
Limitations: Relatively new project, limited community feedback so far.
5. OpenObscure
A privacy firewall for AI agents — uses format-preserving encryption (FPE) and a "cognitive firewall" to protect private data processed by AI agents. Focused on Article 5 (prohibited practices).
For: Developers integrating AI agents with sensitive user data.
Strengths: On-device, open-source, specifically designed for privacy protection.
Limitations: Addresses privacy, not broader compliance requirements.
6. Claude Code Scanner
Scans repositories for GDPR and EU AI Act signals — looks for patterns in code that indicate potential compliance issues.
For: Developers using Claude Code who want automated repository auditing.
Strengths: Integrated into existing dev workflow, free.
Limitations: A scanning tool — finds potential issues, doesn't resolve them.
SaaS tools: the other half of the puzzle
The open-source tools above generally do one thing well: scan, log, or control AI systems at the technical level. That's useful and important for developers.
But a compliance officer who needs to prepare audit documentation has different needs. They don't scan code — they write AI Inventory Registers, Risk Assessments, Human Oversight Plans, Transparency Reports. They need a system that:
- Classifies risk for each AI system according to Article 6 and Annex III
- Generates compliance documents based on legal requirements
- Tracks progress through an action plan with deadlines and verification
- Packages evidence for audit — evidence package with hash integrity
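The last item can be made concrete. A minimal sketch of hash-integrity evidence packaging (illustrative only, not ComplianceForge AI's actual format): hash every file in the package, then hash the manifest itself, so an auditor can later recompute both and confirm nothing changed.

```python
import hashlib
import json
from pathlib import Path

def build_manifest(package_dir: str) -> dict:
    """Hash every file in an evidence package and the resulting manifest."""
    files = {}
    for path in sorted(Path(package_dir).rglob("*")):
        if path.is_file():
            rel = str(path.relative_to(package_dir))
            files[rel] = hashlib.sha256(path.read_bytes()).hexdigest()
    manifest = {"files": files}
    # Hash the manifest itself so the file list can't be silently edited.
    manifest["manifest_sha256"] = hashlib.sha256(
        json.dumps(files, sort_keys=True).encode()
    ).hexdigest()
    return manifest
```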
That's what SaaS tools like ComplianceForge AI do. Not because they're "better" than open-source scanners — but because they solve a different part of the problem.
Comparison: Scanning vs Generation vs Tracking
| Capability | Open-source tools | SaaS (ComplianceForge AI) |
|---|---|---|
| Code scanning for AI signals | ✅ (Scanner, Claude Code Scanner) | ❌ |
| Runtime logging / monitoring | ✅ (Article 12 Logging, G0) | ❌ |
| Policy enforcement | ✅ (Vectimus, Cedar) | ❌ |
| Privacy protection | ✅ (OpenObscure) | ❌ |
| AI risk classification | ❌ | ✅ |
| Compliance documents (11 types) | ❌ | ✅ |
| Gap analysis | ❌ | ✅ |
| Action plan with tracking | ❌ | ✅ |
| Evidence package for audit | ❌ | ✅ |
| Four-eyes verification | ❌ | ✅ |
The pattern is clear: these tools complement each other, they don't compete.
Who needs what?
If you're a developer building an AI system — open-source tools are an excellent starting point. Integrate logging, run a scanner, set up policy enforcement. These are the foundations of technical compliance.
If you're a compliance officer who needs to prepare audit documentation — you need a tool that generates documents, tracks progress, and packages evidence. A scanner won't write your Risk Assessment.
If you're both (CTO of a small company, for example) — combine them. Use open-source tools for the technical side, SaaS for documentation. There's no reason to choose just one.
Conclusion: the ecosystem is growing, and that's good
The emergence of 6 new open-source tools in a single month shows that EU AI Act compliance is moving from an abstract problem to a concrete engineering challenge. That's healthy.
Our advice: don't look for one tool that solves everything. Compliance is a puzzle — choose tools that cover your specific pieces. If you need logging, grab Article 12 Logging. If you need policy enforcement for agents, check out Vectimus. If you need documents, an action plan, and an evidence package — we're here.
Most importantly: start now. Transparency obligations (Art. 50) apply from 2 August 2026, only five months away. Regardless of which tool you choose, "we'll start preparing" is no longer an option.
- Quick risk check → find out your AI system's risk category in 2 minutes
- Compliance questionnaire → detailed assessment with gap analysis and action plan
- AI literacy check → free AI literacy assessment
Sources: Hacker News — Article 12 Logging, Hacker News — EU AI Act Scanner, Hacker News — Vectimus, Hacker News — G0, r/foss — OpenObscure, r/ClaudeAI — Claude Code Scanner