
EU AI Act Article 13 — Transparency to Deployers

Article 13 is the B2B-documentation backbone of the EU AI Act. Every high-risk provider must give deployers enough information to operate the system safely. Shipping a high-risk AI system without this documentation exposes the provider to administrative fines under Article 99.

What Article 13 requires

  • 13(1) — High-risk AI systems must be designed + developed so deployers can interpret the system's output and use it appropriately.
  • 13(2) — High-risk AI systems must be accompanied by instructions for use containing concise, complete, correct, clear information that is relevant, accessible, and comprehensible to deployers.
  • 13(3) — Instructions for use must contain at least: provider identity + contact details; system characteristics + capabilities + limitations; performance metrics + intended purpose; foreseeable circumstances leading to risk; computational + hardware resources; lifecycle expectations + maintenance + cybersecurity measures; training set characteristics where appropriate; human oversight provisions per Article 14.
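The 13(3) checklist above can be sketched as a minimal completeness check for a draft instructions-for-use document. This is a hypothetical sketch: the field names are our own paraphrase of the Article 13(3) items, not an official schema or a MEOK API.

```python
# Hypothetical field list paraphrasing Article 13(3) — not an official schema.
REQUIRED_FIELDS = {
    "provider_identity",      # provider identity + contact details
    "characteristics",        # system characteristics, capabilities, limitations
    "performance_metrics",    # performance metrics + intended purpose
    "foreseeable_risks",      # foreseeable circumstances leading to risk
    "compute_requirements",   # computational + hardware resources
    "lifecycle_maintenance",  # lifecycle expectations, maintenance, cybersecurity
    "human_oversight",        # human oversight provisions per Article 14
}

def missing_fields(instructions: dict) -> set:
    """Return the Article 13(3) items not yet covered by a draft document."""
    return REQUIRED_FIELDS - set(instructions)

# Example draft with only three items filled in.
draft = {
    "provider_identity": "ACME AI GmbH, compliance@example.com",
    "characteristics": "...",
    "human_oversight": "...",
}
print(sorted(missing_fields(draft)))
# → ['compute_requirements', 'foreseeable_risks', 'lifecycle_maintenance', 'performance_metrics']
```

A check like this is the shape of what a documentation generator must guarantee: every 13(3) item is present before the instructions for use ship with the system.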

How MEOK covers Article 13

  • meok-governance-engine-mcp — generates Article 13 instructions-for-use template mapped to your high-risk system + crosswalked to Annex IV technical documentation requirements.
  • /transparency £399-£1,499/mo — continuous decision-trace logging + signed transparency attestations (the runtime continuation of Article 13 documentation obligations).
  • /audit-prep-bundle £4,950 — full Article 13 + Annex IV in 14 days with signed evidence pack.

Frequently asked

What does Article 13 actually require?

Providers of high-risk AI systems must design and develop them so their operation is sufficiently transparent — specifically so deployers can interpret the system's output and use it appropriately. The system must come with instructions for use including: (a) provider identity, (b) characteristics, capabilities, limitations, (c) circumstances that may lead to risks to health/safety/fundamental rights, (d) human oversight measures, (e) computational and hardware resources needed, (f) lifecycle + maintenance + cybersecurity.

How does Article 13 differ from Article 50?

Article 13 is provider-to-deployer transparency (B2B documentation). Article 50 is deployer-to-end-user disclosure (consumer-facing notices, watermarks, deepfake disclosure). Different audiences, different timing, different evidence.

Is Article 13 the same as Annex IV technical documentation?

Closely linked. Annex IV is the format of the technical documentation a provider must keep. Article 13 specifies what must be communicated to the deployer (often a subset of Annex IV plus operational instructions). The /docs page on a B2B AI vendor's website is essentially their Article 13 documentation surface.

When does Article 13 take effect?

Article 13 binds providers of high-risk AI systems. The Digital Omnibus delayed the Annex III high-risk obligations to 2 December 2027 and the Annex I high-risk obligations to 2 August 2028. So providers building Annex III high-risk systems have until late 2027 to ship Article 13-compliant documentation.

How does MEOK help?

meok-governance-engine-mcp generates a templated Article 13 instructions-for-use document mapped to your specific high-risk AI system. /audit-prep-bundle £4,950 wraps Article 13 + Annex IV + Articles 9/10/14/26 in a 14-day signed evidence pack.

£399/mo Transparency →
£4,950 Audit-Prep Bundle →

Source: EU AI Act Regulation 2024/1689 Art. 13 · MEOK AI Labs · CSOAI LTD · UK Companies House 16939677