EU AI Act for EdTech
Annex III(3) + GDPR children's data + FRIA. We ship the evidence pack.
Admissions AI, automated grading, AI proctoring and adaptive learning are all Annex III(3) high-risk. With children's data the stack thickens: GDPR Article 8, an Article 35 DPIA, the EU AI Act Article 5(1)(b) prohibition and the Article 27 FRIA. National supervisory authorities pay close attention to this sector.
The EdTech AI compliance stack
- EU AI Act Annex III(3) — high-risk classification for education AI.
- EU AI Act Article 5(1)(b) — ban on exploiting vulnerabilities due to age, including children's (in force now).
- EU AI Act Article 27 FRIA — mandatory for deployers that are public bodies or provide public services (schools, universities, training providers).
- GDPR Article 8 — children's consent for information-society services; 16 by default, member states may lower it to 13.
- GDPR Article 35 DPIA — mandatory for systematic monitoring of children.
- GDPR Article 22 — automated-decision protection for grades / admissions.
- National sector laws — e.g. France's Code de l'éducation; national rules on automated grading vary.
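Treated as data, the stack above is essentially a lookup from use case to obligations. A minimal sketch of that mapping — all names, groupings and strings are illustrative, not legal advice:

```python
# Illustrative mapping of EdTech AI use cases to their Annex III(3)
# category and the main obligations from the stack above. Hypothetical
# structure for clarity only; it is not a compliance tool.

OBLIGATIONS_COMMON = [
    "EU AI Act Annex III(3) high-risk obligations",
    "FRIA for deployers (schools, universities, training providers)",
    "GDPR Article 35 DPIA where children's data is processed at scale",
]

USE_CASES = {
    "admissions": {
        "category": "(a) access, admission or assignment to institutions",
        "extra": ["GDPR Article 22 (automated decisions)"],
    },
    "grading": {
        "category": "(b) evaluation of learning outcomes",
        "extra": ["GDPR Article 22 (automated decisions)"],
    },
    "proctoring": {
        "category": "(d) monitoring prohibited behaviour during tests",
        "extra": ["human review of flags before academic consequences"],
    },
    "adaptive_learning": {
        "category": "(b) outcomes used to steer the learning process",
        "extra": ["Article 5 check: no exploitation of age-based vulnerabilities"],
    },
}

def obligations_for(use_case: str) -> list[str]:
    """Return the illustrative obligation list for one EdTech use case."""
    entry = USE_CASES[use_case]
    return [f"Annex III(3) {entry['category']}", *OBLIGATIONS_COMMON, *entry["extra"]]
```

For example, `obligations_for("grading")` yields the Annex III(3)(b) category, the common high-risk/FRIA/DPIA items, and the Article 22 automated-decision safeguard.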
Frequently asked
Is EdTech AI high-risk?
Annex III(3) lists education AI as high-risk: '(a) AI systems intended to be used to determine access or admission or to assign natural persons to educational and vocational training institutions, (b) to evaluate learning outcomes, including when those outcomes are used to steer the learning process, (c) for the purpose of assessing the appropriate level of education that a person will receive, (d) for monitoring and detecting prohibited behaviour during tests in the context of training and education.' Admissions AI, automated grading, AI proctoring, adaptive-learning AI — all in scope.
What about children's data?
GDPR Article 8 protects children: the minimum age for valid consent to information-society services is 16 by default, and member states may lower it to 13. An Article 35 DPIA is required where processing is likely to result in high risk — large-scale processing of children's data and systematic monitoring of students routinely meet that threshold. EU AI Act Article 5(1)(b) bans AI that exploits vulnerabilities of persons due to their age. Stack: GDPR Art. 8 + Art. 35 DPIA + AI Act Article 5(1)(b) ban + Annex III(3) high-risk + Article 27 FRIA.
What's required for AI proctoring?
AI proctoring is especially scrutinised. Annex III(3)(d) explicitly captures AI used to detect prohibited behaviour during tests. Bias mitigation under Article 10 — e.g. facial-analysis performance across lighting conditions and skin tones — is mandatory. Human oversight (Article 14) must be effective: a false-positive flag must be reviewed by a competent natural person before any academic consequence. Test-takers must be told that AI is in use (Articles 13 and 50). Expect close supervisory scrutiny here.
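As a sketch of what effective human oversight can mean in practice, the gate below lets an AI flag open a review but never trigger a sanction on its own. All class and field names are hypothetical:

```python
# Minimal sketch of an oversight gate for proctoring flags: an academic
# consequence is only possible after a human reviewer confirms the flag.
# Illustrative only; names and fields are invented for this example.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ProctoringFlag:
    student_id: str
    reason: str                      # e.g. "second face detected"
    model_confidence: float
    reviewer: Optional[str] = None   # set only by a human review
    confirmed: Optional[bool] = None
    reviewed_at: Optional[datetime] = None

    def review(self, reviewer: str, confirmed: bool) -> None:
        """Record the human decision; nothing downstream runs without it."""
        self.reviewer = reviewer
        self.confirmed = confirmed
        self.reviewed_at = datetime.now(timezone.utc)

    def may_apply_sanction(self) -> bool:
        """Sanctions require a named reviewer who confirmed the flag."""
        return self.confirmed is True and self.reviewer is not None
```

Usage: a freshly created flag returns `False` from `may_apply_sanction()` regardless of model confidence; only after `review("examiner-a", confirmed=True)` does the gate open, and a rejected flag stays closed.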
When does this take effect?
Annex III high-risk obligations are now expected to apply from 2 December 2027 following the Digital Omnibus delay (previously 2 August 2026). Article 4 (AI literacy) has been binding since 2 February 2025 — for EdTech that covers student-facing literacy as well as staff. The Article 5 prohibitions, including exploitation of children's vulnerabilities, are fully in force. GDPR DPIA requirements apply immediately.
What does MEOK ship for EdTech?
FRIA generator (free at /scorecard) seeded with EdTech Annex III categories. Article 10 bias-detection (£299/mo) for grading + admissions + proctoring AI. Article 13 transparency logs (£399/mo) for student/parent disclosure. /audit-prep-bundle £4,950 for full Annex IV + FRIA in 14 days.
Win EU university procurement, don't lose it
Universities and schools now require pre-built Annex III evidence in vendor RFPs. Ship the FRIA, DPIA and bias dashboard and you make the shortlist; without them you won't.
MEOK AI Labs · CSOAI LTD · UK Companies House 16939677