CHAPTER I
GENERAL PROVISIONS (Articles 1 — 4)
CHAPTER II
PROHIBITED AI PRACTICES (Article 5)
CHAPTER III
HIGH-RISK AI SYSTEMS (Articles 6 — 49)
CHAPTER IV
TRANSPARENCY OBLIGATIONS FOR PROVIDERS AND DEPLOYERS OF CERTAIN AI SYSTEMS (Article 50)
CHAPTER V
GENERAL-PURPOSE AI MODELS (Articles 51 — 56)
CHAPTER VI
MEASURES IN SUPPORT OF INNOVATION (Articles 57 — 63)
CHAPTER VII
GOVERNANCE (Articles 64 — 70)
CHAPTER VIII
EU DATABASE FOR HIGH-RISK AI SYSTEMS (Article 71)
CHAPTER IX
POST-MARKET MONITORING, INFORMATION SHARING AND MARKET SURVEILLANCE (Articles 72 — 94)
CHAPTER X
CODES OF CONDUCT AND GUIDELINES (Articles 95 — 96)
CHAPTER XI
DELEGATION OF POWER AND COMMITTEE PROCEDURE (Articles 97 — 98)
CHAPTER XII
PENALTIES (Articles 99 — 101)
CHAPTER XIII
FINAL PROVISIONS (Articles 102 — 113)
ANNEXES
ANNEX VI
CONFORMITY ASSESSMENT PROCEDURE BASED ON INTERNAL CONTROL
1. The conformity assessment procedure based on internal control is the conformity assessment procedure based on points 2, 3 and 4.
2. The provider verifies that the established quality management system is in compliance with the requirements of Article 17.
3. The provider examines the information contained in the technical documentation in order to assess the compliance of the AI system with the relevant requirements set out in Chapter III, Section 2.
4. The provider also verifies that the design and development process of the AI system and its post-market monitoring as referred to in Article 72 are consistent with the technical documentation.