CHAPTER I
GENERAL PROVISIONS (Articles 1 — 2)
CHAPTER II
RIGHTS AND DUTIES OF MEDIA SERVICE PROVIDERS AND RECIPIENTS OF MEDIA SERVICES (Articles 3 — 6)
CHAPTER III
FRAMEWORK FOR REGULATORY COOPERATION AND A WELL-FUNCTIONING INTERNAL MARKET FOR MEDIA SERVICES (Articles 7 — 25)
CHAPTER IV
FINAL PROVISIONS (Articles 26 — 29)
Where there is reasonable doubt concerning the media service provider’s compliance with point (d) of the first subparagraph, the provider of a very large online platform shall seek confirmation on the matter from the relevant national regulatory authority or body or the relevant co-regulatory or self-regulatory mechanism.
Where, following a reply as referred to in point (b) of the first subparagraph, or in the absence of such a reply, the provider of a very large online platform takes a decision to suspend or restrict visibility, it shall inform the media service provider concerned without undue delay. This paragraph shall not apply where providers of very large online platforms suspend the provision of their services in relation to content provided by a media service provider or restrict the visibility of such content in compliance with their obligations pursuant to Articles 28, 34 and 35 of Regulation (EU) 2022/2065 and Article 28b of Directive 2010/13/EU or with their obligations relating to illegal content pursuant to Union law.
Very large online platforms act for many users as a gateway for providing access to media content and media services. Media service providers that exercise editorial responsibility over their content play a key role in the distribution of information and in the exercise of the right to receive and impart information online. When exercising such editorial responsibility, media service providers are expected to act diligently and provide information that is trustworthy and respectful of fundamental rights, in line with the regulatory requirements or co-regulatory or self-regulatory mechanisms to which they are subject in the Member States. Therefore, also in view of users’ right to receive and impart information, where a provider of a very large online platform considers that content provided by such media service providers is incompatible with its terms and conditions, it should duly consider media freedom and media pluralism, in accordance with Regulation (EU) 2022/2065, and provide, as early as possible, the necessary explanations to media service providers in a statement of reasons as referred to in Article 4(1) of Regulation (EU) 2019/1150 of the European Parliament and of the Council and Article 17 of Regulation (EU) 2022/2065. To minimise the impact of any restriction to that content on users’ right to receive and impart information, very large online platforms should submit their statement of reasons prior to the suspension or restriction of visibility taking effect. In addition, they should provide the media service provider concerned with an opportunity to reply to the statement of reasons within 24 hours of receiving it, prior to the suspension or restriction of visibility taking effect. A shorter timeframe could apply in the event of a crisis as referred to in Article 36(2) of Regulation (EU) 2022/2065 in order to take into account, in particular, an urgent need to moderate the relevant content in such exceptional circumstances.
The use of labelling or age verification tools by providers of very large online platforms in accordance with their terms of service and with Union law should not be understood as a restriction of visibility. Following the reply of a media service provider to the statement of reasons by a provider of a very large online platform, or in the absence of such a reply within the given period of time, that provider of a very large online platform should inform the media service provider if it intends to proceed with the suspension of the provision of its online intermediation services in relation to the content provided by the media service provider or the restriction of the visibility of that content. This Regulation should not affect the obligations of providers of very large online platforms to take measures against illegal content disseminated through their services, to take measures in order to assess and mitigate systemic risks posed by their services, for example through disinformation, or to take measures in order to protect minors. In that context, nothing in this Regulation should be construed as deviating from the obligations of providers of very large online platforms pursuant to Articles 28, 34 and 35 of Regulation (EU) 2022/2065 and Article 28b of Directive 2010/13/EU.
It is justified, in view of an expected positive impact on the freedom to provide services and the freedom of expression, that where media service providers comply with certain regulatory, co-regulatory or self-regulatory standards, their complaints against decisions of providers of very large online platforms be treated with priority and without undue delay.
To that end, providers of very large online platforms providing access to media content should provide a functionality on their online interface to enable media service providers to declare that they meet certain requirements, while at the same time retaining the possibility to reject such self-declarations where they consider that those conditions are not met. When a media service provider declares itself compliant with regulatory requirements or a co-regulatory or self-regulatory mechanism, it should be able to provide the contact details of the relevant national regulatory authority or body or of the representatives of the co-regulatory or self-regulatory mechanism, including those provided by widely recognised professional associations representing a given sector and operating at Union or national level. Where there is a reasonable doubt, that information would enable the provider of a very large online platform to confirm with those authorities or bodies whether the media service provider is subject to such requirements or mechanisms. Where relevant, providers of very large online platforms should rely on information regarding adherence to those requirements, such as the machine-readable standard of the Journalism Trust Initiative, developed under the aegis of the European Committee for Standardisation, or other relevant codes of conduct. Recognised civil society organisations, fact-checking organisations and other relevant professional organisations recognising the integrity of media sources on the basis of standards agreed with the media industry should also have the possibility to flag to the providers of very large online platforms any potential issue regarding compliance by media service providers with the relevant requirements for the self-declaration. Guidelines issued by the Commission would be key to facilitating the effective implementation of such a functionality.
Those guidelines should contribute to minimising the risk of potential abuse of the functionality, in particular by media service providers that engage systematically in disinformation, information manipulation and interference, including those controlled by certain third countries, taking into account the criteria to be developed by the Board regarding media service providers from outside the Union. For that purpose, those guidelines could cover arrangements related to the involvement of recognised civil society organisations, including fact-checking organisations, in the review of the declarations or to the consultation of national regulatory authorities or bodies or co-regulatory or self-regulatory bodies.
This Regulation recognises the importance of co-regulatory and self-regulatory mechanisms in the context of the provision of media services on very large online platforms. Such mechanisms represent a type of voluntary initiative, for instance in the form of codes of conduct, which enables media service providers or their representatives to adopt common guidelines, including on ethical standards, the correction of errors or complaint handling, amongst themselves and for themselves. Robust, inclusive and widely accepted media self-regulation represents an effective guarantee of the quality and professionalism of media services and is key to safeguarding editorial integrity.
Providers of very large online platforms should engage in a dialogue with media service providers that respect standards of credibility and transparency and that consider that restrictions on or suspensions of their content are repeatedly imposed by providers of very large online platforms without sufficient grounds, in order to find an amicable solution for terminating any unjustified restrictions or suspensions and avoiding them in the future. Providers of very large online platforms should engage in such dialogues in good faith, paying particular attention to safeguarding media freedom and the freedom of information. The Board should inform the Commission of its opinions on the outcome of such dialogues. The Commission could take such opinions into account in the context of the enforcement of Regulation (EU) 2022/2065.