CHAPTER I
GENERAL PROVISIONS (Articles 1 — 3)
CHAPTER II
LIABILITY OF PROVIDERS OF INTERMEDIARY SERVICES (Articles 4 — 10)
CHAPTER III
DUE DILIGENCE OBLIGATIONS FOR A TRANSPARENT AND SAFE ONLINE ENVIRONMENT (Articles 11 — 48)
CHAPTER IV
IMPLEMENTATION, COOPERATION, PENALTIES AND ENFORCEMENT (Articles 49 — 88)
CHAPTER V
FINAL PROVISIONS (Articles 89 — 93)
Those reports shall include an explanation of the procedures in place to ensure that the trusted flagger retains its independence. Trusted flaggers shall send those reports to the awarding Digital Services Coordinator, and shall make them publicly available. The information in those reports shall not contain personal data.
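As a rough illustration of how a trusted flagger might organise the data behind such a report, the sketch below uses Python dataclasses to hold the categorised notice counts and the description of independence procedures. All class and field names are assumptions made for this example: the Regulation prescribes the content of the report, not any particular format.

```python
from __future__ import annotations

from dataclasses import dataclass, field


@dataclass
class NoticeCategoryCount:
    # One row of the report: notices categorised by the identity of the
    # provider of hosting services, the type of allegedly illegal content
    # notified, and the action taken by the provider.
    hosting_provider: str
    content_type: str
    action_taken: str
    notice_count: int


@dataclass
class TrustedFlaggerReport:
    # Aggregate, publicly publishable report; holds no personal data,
    # only counts and organisation-level descriptions.
    reporting_period: str
    independence_procedures: str  # explanation of how independence is retained
    categorised_notices: list[NoticeCategoryCount] = field(default_factory=list)


# Hypothetical usage; the provider name and figures are invented for illustration.
report = TrustedFlaggerReport(
    reporting_period="2024",
    independence_procedures="Description of funding and governance safeguards.",
    categorised_notices=[
        NoticeCategoryCount("example-host.eu", "illegal hate speech", "removal", 120),
    ],
)
```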
Action against illegal content can be taken more quickly and reliably where providers of online platforms take the necessary measures to ensure that notices submitted by trusted flaggers, acting within their designated area of expertise, through the notice and action mechanisms required by this Regulation are treated with priority, without prejudice to the requirement to process and decide upon all notices submitted under those mechanisms in a timely, diligent and non-arbitrary manner. Such trusted flagger status should be awarded by the Digital Services Coordinator of the Member State in which the applicant is established and should be recognised by all providers of online platforms within the scope of this Regulation. Such trusted flagger status should only be awarded to entities, and not individuals, that have demonstrated, among other things, that they have particular expertise and competence in tackling illegal content and that they work in a diligent, accurate and objective manner. Such entities can be public in nature, such as, for terrorist content, internet referral units of national law enforcement authorities or of the European Union Agency for Law Enforcement Cooperation (‘Europol’) or they can be non-governmental organisations and private or semi-public bodies such as the organisations part of the INHOPE network of hotlines for reporting child sexual abuse material and organisations committed to notifying illegal racist and xenophobic expressions online. To avoid diminishing the added value of such mechanism, the overall number of trusted flaggers awarded in accordance with this Regulation should be limited. In particular, industry associations representing their members' interests are encouraged to apply for the status of trusted flaggers, without prejudice to the right of private entities or individuals to enter into bilateral agreements with the providers of online platforms.
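As a rough illustration of the priority treatment described above, the sketch below models a notice intake queue in which notices from trusted flaggers are taken up first, while all other notices keep a deterministic place in the queue. The class names, fields and queueing approach are assumptions for this example and are not prescribed by the Regulation, which only requires priority handling alongside timely, diligent and non-arbitrary processing of all notices.

```python
from __future__ import annotations

import heapq
import itertools
from dataclasses import dataclass


@dataclass
class Notice:
    notice_id: str
    content_location: str
    from_trusted_flagger: bool


class NoticeQueue:
    """Minimal intake queue in which trusted flagger notices are reviewed first."""

    def __init__(self) -> None:
        self._heap: list[tuple[int, int, Notice]] = []
        self._order = itertools.count()  # arrival order, used as a tie-breaker

    def submit(self, notice: Notice) -> None:
        # Priority 0 for trusted flaggers, 1 for all other notifiers; within the
        # same priority level, notices are reviewed in order of arrival.
        priority = 0 if notice.from_trusted_flagger else 1
        heapq.heappush(self._heap, (priority, next(self._order), notice))

    def next_for_review(self) -> Notice | None:
        # Pops the highest-priority, oldest notice. A real system would also need
        # safeguards (for example deadlines or ageing) so that notices from other
        # notifiers are still handled in a timely manner; this sketch omits them.
        return heapq.heappop(self._heap)[2] if self._heap else None
```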
Trusted flaggers should publish easily comprehensible and detailed reports on notices submitted in accordance with this Regulation. Those reports should indicate information such as the number of notices categorised by the provider of hosting services, the type of content, and the action taken by the provider. Given that trusted flaggers have demonstrated expertise and competence, the processing of notices submitted by trusted flaggers can be expected to be less burdensome and therefore faster compared to notices submitted by other recipients of the service. However, the average time taken to process may still vary depending on factors including the type of illegal content, the quality of notices, and the actual technical procedures put in place for the submission of such notices.