EU AI Act Articles 26 + 27 — Deployer Obligations + FRIA
Most EU AI Act content focuses on AI providers. Article 26 is the parallel set of obligations for deployers — the companies actually using the AI in production. Banks, insurers, public bodies, and HR-tech buyers should read this most carefully.
What Article 26 requires
- 26(1) — Take appropriate technical + organisational measures to ensure use is in accordance with the provider's instructions for use.
- 26(2) — Assign human oversight to natural persons with the necessary competence, training, authority, and support.
- 26(4) — Where the deployer exercises control over the input data, ensure it is relevant + sufficiently representative for the intended purpose.
- 26(5) — Monitor operation on the basis of the instructions for use; where there is reason to consider that use may present a risk, inform the provider/distributor + market surveillance authority and suspend use.
- 26(6) — Retain logs automatically generated by the high-risk AI system for at least 6 months, unless otherwise provided by EU/national law.
- 26(7) — Before putting into service / using a high-risk AI system at the workplace, inform workers' representatives + affected workers.
- 26(8) — Public-authority deployers must verify registration of the system in the EU public database for high-risk AI (Article 49); if it is not registered, they must not use it.
- 26(9) — Use the information supplied by the provider to carry out the deployer's own DPIA under GDPR Article 35.
- 26(11) — Inform natural persons that they are subject to the use of a high-risk AI system where it makes or assists decisions concerning them.
- Article 27 — FRIA: public-sector deployers + private deployers assessing creditworthiness or pricing life/health insurance must perform a Fundamental Rights Impact Assessment prior to first use (a separate article, commonly read together with Article 26).
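As a hedged illustration only (not legal advice, and not MEOK tooling — all names below are hypothetical), the duties above can be modelled as a simple deployer self-check, e.g. flagging a log-retention window shorter than the six-month minimum:

```python
from dataclasses import dataclass

# Illustrative sketch of a deployer self-check for the duties listed above.
# Field names are assumptions for this example, not regulatory terminology.
MIN_LOG_RETENTION_DAYS = 183  # "at least six months"

@dataclass
class DeployerChecklist:
    follows_instructions_for_use: bool  # technical + organisational measures
    oversight_assigned: bool            # competent, trained natural persons
    input_data_controlled: bool         # relevant + representative input data
    monitoring_in_place: bool           # operation monitored per instructions
    log_retention_days: int             # automatically generated system logs
    workers_informed: bool              # workplace-use notification

    def gaps(self) -> list[str]:
        """Return a human-readable list of unmet obligations."""
        out = []
        if not self.follows_instructions_for_use:
            out.append("use not aligned with provider's instructions")
        if not self.oversight_assigned:
            out.append("no competent human oversight assigned")
        if not self.input_data_controlled:
            out.append("input data relevance/representativeness unchecked")
        if not self.monitoring_in_place:
            out.append("no operational monitoring")
        if self.log_retention_days < MIN_LOG_RETENTION_DAYS:
            out.append("log retention below six-month minimum")
        if not self.workers_informed:
            out.append("workers/representatives not informed")
        return out
```

A deployer keeping logs for only 90 days, for instance, would see exactly one gap reported.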
How MEOK covers Article 26
- meok-dpia-edpb-template-mcp — auto-fills the EDPB harmonised DPIA template (14 April 2026), which doubles as your FRIA under Article 27.
- meok-governance-engine-mcp — Article 26 deployer-obligation crosswalk to ISO 42001 Annex A.7 + NIST AI RMF GOVERN.
- /audit-prep-bundle (£4,950) — Article 26 obligations + the Article 27 FRIA covered in a 14-day signed engagement.
Frequently asked
Who is a 'deployer' under Article 26?
A deployer is any natural or legal person, public authority, agency or other body using an AI system under their authority — except where the AI system is used in the course of a personal non-professional activity. If you put a high-risk AI into production, you're a deployer.
What's a Fundamental Rights Impact Assessment (FRIA)?
Per Article 27, public-sector deployers + private deployers using AI to assess creditworthiness or to price life/health insurance must perform a FRIA prior to first use. The FRIA assesses the risks to the fundamental rights of natural persons that may arise from use of the AI system. The EDPB published a harmonised DPIA template on 14 April 2026 that explicitly maps to FRIA inputs.
Is FRIA the same as a GDPR DPIA?
Related but not identical. A GDPR Article 35 DPIA assesses risks to data-protection rights. A FRIA under Article 27 is broader: it covers all fundamental rights (freedom, equality, dignity, etc.) and is specific to AI systems. The EDPB harmonised template lets you fulfil both with one document.
When does the FRIA obligation kick in?
Article 113 timeline: Article 26 obligations apply 24 months after entry into force, i.e. from 2 August 2026. The Digital Omnibus delayed Annex III high-risk obligations to 2 December 2027 for some categories, but the FRIA obligation on listed deployers (banks, insurers, public bodies) applies from 2 August 2026.
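A minimal sketch of that timeline as code, using the dates stated above (the helper name is an assumption for this example, not an official tool, and none of this is legal advice):

```python
from datetime import date

# Key application dates described above (illustrative only).
DEPLOYER_OBLIGATIONS_APPLY = date(2026, 8, 2)  # Article 26 + FRIA duties
ANNEX_III_DELAYED_UNTIL = date(2027, 12, 2)    # some Annex III categories

def fria_due_before_first_use(planned_first_use: date) -> bool:
    """True if a listed deployer's planned first use falls on or after
    the date the FRIA obligation applies, so a FRIA must come first."""
    return planned_first_use >= DEPLOYER_OBLIGATIONS_APPLY
```

So a bank planning first production use in September 2026 would need its FRIA completed beforehand, while a pilot wrapped up in early 2026 would predate the obligation.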
How does MEOK cover Article 26 / FRIA?
meok-dpia-edpb-template-mcp auto-fills the EDPB harmonised DPIA template, which serves as the de facto FRIA template since the EDPB explicitly designed it for AI Act use. /audit-prep-bundle (£4,950) wraps Article 26 obligations + the Article 27 FRIA in a 14-day engagement.
Source: EU AI Act Regulation 2024/1689 Arts. 26–27 · EDPB harmonised DPIA template (14 Apr 2026) · MEOK AI Labs · CSOAI LTD · UK Companies House 16939677