
EU AI Act Article 9 — Risk Management System

Article 9 sits upstream of every other high-risk AI obligation — Articles 10 (data governance), 14 (human oversight), 72 (post-market monitoring), and 73 (serious-incident reporting) all assume an Article 9 RMS exists and is live. Get this one wrong and the whole compliance chain fails.

What Article 9 requires

  • 9(2)(a) — Identify + analyse known and reasonably foreseeable risks to health, safety, or fundamental rights when the system is used in accordance with its intended purpose.
  • 9(2)(b) — Estimate + evaluate risks that may emerge when the system is used in accordance with its intended purpose AND under conditions of reasonably foreseeable misuse.
  • 9(2)(c) — Evaluate other possibly arising risks based on data analysis from the post-market monitoring system referred to in Article 72.
  • 9(2)(d) — Adopt appropriate + targeted risk management measures designed to address the risks identified.
  • 9(3) — Risk management measures must give due consideration to the combined application of the requirements set out in Section 2 of Chapter III.
  • 9(5) — Eliminate or reduce risks through design + development; where elimination is not possible, implement adequate mitigation + control measures; provide information to deployers.
  • 9(8) — When implementing the RMS, take into account whether the AI system is likely to be accessed by, or have an impact on, persons under the age of 18 and other vulnerable groups.

Common audit failures

  • Static "RMS document" written once, never updated — fails 9(1) "continuous iterative process."
  • No risk register linking each identified risk to a specific mitigation measure + ownership.
  • No measurement of mitigation effectiveness over time.
  • RMS doesn't integrate with Article 72 post-market monitoring data — fails 9(2)(c).
  • "Reasonably foreseeable misuse" not analysed — fails 9(2)(b).
  • No consideration of children + vulnerable groups — fails 9(8).
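
The register structure auditors look for can be sketched as a small data model: each identified risk tied to a specific mitigation, a named owner, and effectiveness measurements over time. This is a minimal illustration in Python — all class and field names are hypothetical, not a MEOK API:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Mitigation:
    measure: str
    owner: str  # named role or individual — missing ownership is a common audit fail
    # (date, score) pairs: effectiveness must be measured over time, not asserted once
    effectiveness: list[tuple[date, float]] = field(default_factory=list)

@dataclass
class RiskEntry:
    risk_id: str
    description: str
    clause: str        # which Article 9(2) limb surfaced this risk, e.g. "9(2)(a)"
    source: str        # e.g. "design-review" or "post-market" (Article 72 data)
    mitigations: list[Mitigation] = field(default_factory=list)

    def is_audit_ready(self) -> bool:
        """An entry passes only if every mitigation has an owner and has
        had its effectiveness measured at least once."""
        return bool(self.mitigations) and all(
            m.owner and m.effectiveness for m in self.mitigations
        )
```

An entry with an unowned or never-measured mitigation is exactly the gap flagged in the list above.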

How MEOK covers Article 9

  • meok-governance-engine-mcp — generates and maintains a living risk register cross-mapped to Article 9 + ISO 42001 Annex A.6 + NIST AI RMF MEASURE.
  • meok-omnibus-tracker-mcp — tracks deadlines + delays so your RMS reflects current obligations (e.g., Annex III delayed to 2 Dec 2027).
  • meok-attestation-verify — every RMS update emits a signed cert with a public verify URL.
  • /audit-prep-bundle (£4,950) — full Article 9 RMS implementation in 14 days, signed evidence pack delivered.
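
The attestation idea is simple to picture: hash + sign each RMS update so any later tampering is detectable. The sketch below is an assumption for illustration, not MEOK's actual implementation — it uses a stdlib HMAC for brevity, whereas a production attestation service would use asymmetric signatures and a public verify URL:

```python
import hashlib
import hmac
import json

def attest_rms_update(update: dict, key: bytes) -> dict:
    """Emit a tamper-evident cert for one RMS update (HMAC sketch)."""
    payload = json.dumps(update, sort_keys=True).encode()
    return {
        "sha256": hashlib.sha256(payload).hexdigest(),
        "sig": hmac.new(key, payload, hashlib.sha256).hexdigest(),
    }

def verify_cert(update: dict, cert: dict, key: bytes) -> bool:
    """Recompute the signature over the claimed update and compare."""
    payload = json.dumps(update, sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(cert["sig"], expected)
```

Any edit to the update after signing makes verification fail, which is what gives an auditor confidence in the register's history.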

Frequently asked

What does Article 9 actually require?

Article 9 requires providers of high-risk AI systems to establish, implement, document and maintain a Risk Management System (RMS) as a continuous, iterative process throughout the AI system's entire lifecycle. The RMS must identify + analyse + evaluate + mitigate foreseeable risks to health, safety, and fundamental rights.

Who has to comply with Article 9?

Providers of high-risk AI systems per Article 6 + Annex III (employment, education, law enforcement, migration, public services, biometric identification, critical infrastructure, etc.). High-risk Annex III enforcement was delayed by the Digital Omnibus to 2 December 2027 — but you should be implementing now to be ready.

Is Article 9 a one-time audit or continuous?

Continuous. The RMS must run throughout the AI system's lifecycle: pre-deployment risk identification, post-market monitoring (Article 72), incident reporting (Article 73), regular review + update. A static document is not Article 9-compliant.

How does Article 9 relate to ISO/IEC 42001?

ISO/IEC 42001:2023 Annex A.6 (risk management) closely mirrors Article 9. Many controls satisfy both. MEOK's governance-engine MCP cross-walks every Article 9 sub-clause to ISO 42001 Annex A and NIST AI RMF MEASURE controls.
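
A cross-walk can be as simple as a lookup table keyed on Article 9 sub-clauses. The mapping below is illustrative only — the control labels are placeholders, not the published numbering, so verify them against the ISO/IEC 42001:2023 Annex A and NIST AI RMF texts before relying on any entry:

```python
# Illustrative cross-walk: Article 9 sub-clause -> framework controls.
CROSSWALK: dict[str, dict[str, str]] = {
    "9(2)(a)": {"iso42001": "Annex A.6 (risk identification)", "nist": "MEASURE (risk tracking)"},
    "9(2)(b)": {"iso42001": "Annex A.6 (misuse analysis)",     "nist": "MEASURE (risk tracking)"},
    "9(2)(c)": {"iso42001": "Annex A.6 (monitoring feedback)", "nist": "MEASURE (monitoring)"},
    "9(2)(d)": {"iso42001": "Annex A.6 (risk treatment)",      "nist": "MANAGE (risk response)"},
}

def controls_for(clause: str) -> dict[str, str]:
    """Return the mapped framework controls for an Article 9 sub-clause."""
    return CROSSWALK[clause]

def coverage_gaps(clauses: list[str]) -> list[str]:
    """List sub-clauses with no mapped control — the audit question
    'show me where each Article 9 limb is covered' in code form."""
    return [c for c in clauses if c not in CROSSWALK]
```

Running `coverage_gaps` over every sub-clause your RMS claims to implement surfaces unmapped obligations before an auditor does.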

What's the typical Article 9 audit failure mode?

Three common failure modes: (1) an RMS document exists but there is no living ledger of risks identified + tracked over time. (2) Mitigation measures are listed but their effectiveness is never measured. (3) No integration with Article 72 post-market monitoring + Article 73 incident reporting — Article 9 is meant to sit upstream of both.

How does MEOK help?

meok-governance-engine-mcp emits a continuous risk register cross-mapped to Article 9 + ISO 42001 + NIST AI RMF. Every entry signed via meok-attestation-api. /audit-prep-bundle (£4,950) wraps Article 9 + Article 10 + Article 14 + Article 26 + Article 50 + Article 72 in a 14-day engagement.


Source: EU AI Act Regulation 2024/1689 Art. 9 · MEOK AI Labs · CSOAI LTD · UK Companies House 16939677