DSA Article 34. Risk assessment

1. Providers of very large online platforms and of very large online search engines shall diligently identify, analyse and assess any systemic risks in the Union stemming from the design or functioning of their service and its related systems, including algorithmic systems, or from the use made of their services. They shall carry out the risk assessments by the date of application referred to in Article 33(6), second subparagraph, and at least once every year thereafter, and in any event prior to deploying functionalities that are likely to have a critical impact on the risks identified pursuant to this Article. This risk assessment shall be specific to their services and proportionate to the systemic risks, taking into consideration their severity and probability, and shall include the following systemic risks:

   (a) the dissemination of illegal content through their services;

   (b) any actual or foreseeable negative effects for the exercise of fundamental rights, in particular the fundamental rights to human dignity enshrined in Article 1 of the Charter, to respect for private and family life enshrined in Article 7 of the Charter, to the protection of personal data enshrined in Article 8 of the Charter, to freedom of expression and information, including the freedom and pluralism of the media, enshrined in Article 11 of the Charter, to non-discrimination enshrined in Article 21 of the Charter, to respect for the rights of the child enshrined in Article 24 of the Charter and to a high level of consumer protection enshrined in Article 38 of the Charter;

   (c) any actual or foreseeable negative effects on civic discourse and electoral processes, and public security;

   (d) any actual or foreseeable negative effects in relation to gender-based violence, the protection of public health and minors and serious negative consequences to the person’s physical and mental well-being.
2. When conducting risk assessments, providers of very large online platforms and of very large online search engines shall take into account, in particular, whether and how the following factors influence any of the systemic risks referred to in paragraph 1:

   (a) the design of their recommender systems and any other relevant algorithmic system;

   (b) their content moderation systems;

   (c) the applicable terms and conditions and their enforcement;

   (d) systems for selecting and presenting advertisements;

   (e) data related practices of the provider.

   The assessments shall also analyse whether and how the risks pursuant to paragraph 1 are influenced by intentional manipulation of their service, including by inauthentic use or automated exploitation of the service, as well as the amplification and potentially rapid and wide dissemination of illegal content and of information that is incompatible with their terms and conditions. The assessment shall take into account specific regional or linguistic aspects, including when specific to a Member State.

3. Providers of very large online platforms and of very large online search engines shall preserve the supporting documents of the risk assessments for at least three years after the performance of risk assessments, and shall, upon request, communicate them to the Commission and to the Digital Services Coordinator of establishment.
