by Streamlex 2 August 2025
The EU Artificial Intelligence Act (AI Act) sets out strict requirements for General-Purpose AI (GPAI) models, with obligations for providers coming into force on 2 August 2025. From transparency rules to copyright compliance, GPAI providers must prepare now to meet the Act’s standards and avoid enforcement action starting 2 August 2026. This guide compiles Streamlex’s top five official resources released by the European Commission, designed to help GPAI providers understand and comply with their AI Act obligations.
Under the EU Artificial Intelligence Act (AI Act), General-Purpose AI (GPAI) is defined as:
“An AI model, including where such an AI model is trained with a large amount of data using self-supervision at scale, that displays significant generality and is capable of competently performing a wide range of distinct tasks.”
Because of their broad use, GPAI models carry unique compliance burdens. Providers must meet strict transparency, safety, and copyright obligations when placing GPAI models on the EU market. The timeline for compliance with these obligations depends on when the GPAI model was placed on the market: before or after 2 August 2025.
For GPAI models placed on the market AFTER 2 August 2025:
For GPAI models placed on the market BEFORE 2 August 2025:
The European Commission has released a number of official guidance materials to help GPAI providers navigate compliance. Here’s what you need to know about each:
Adopted on 18 July 2025
The non-binding guidelines under Article 96(1) of the AI Act provide the Commission's authoritative interpretation of GPAI obligations, establishing technical criteria for compliance.
The guidelines introduce a compute threshold approach, defining GPAI models as those trained using more than 10²³ FLOP (floating-point operations, measured cumulatively over training rather than per second) and capable of generating language (text or audio), text-to-image, or text-to-video outputs. This threshold corresponds approximately to models with one billion parameters trained on substantial datasets.
The guidelines clarify that models exceeding 10²⁵ FLOP are presumed to pose systemic risk, triggering additional obligations including mandatory notification to the AI Office within two weeks. The framework addresses the full model lifecycle, extending obligations from the start of pre-training through all subsequent development phases, including post-market modifications.
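To make the thresholds concrete, the classification above can be sketched in a few lines of Python. The 6 × parameters × training-tokens estimate for total training compute is a widely used rule of thumb assumed here for illustration; the Act itself only fixes the FLOP thresholds, not an estimation method, and the function and constant names below are hypothetical.

```python
# Sketch of the AI Act's compute-threshold classification.
# Assumption: training FLOP ~= 6 * N_parameters * N_tokens (a common
# rule of thumb, not part of the Act's text).

GPAI_THRESHOLD_FLOP = 1e23           # presumption: model is a GPAI model
SYSTEMIC_RISK_THRESHOLD_FLOP = 1e25  # presumption: GPAI model with systemic risk

def approx_training_flop(n_parameters: float, n_training_tokens: float) -> float:
    """Estimate total training compute at ~6 FLOP per parameter per token."""
    return 6 * n_parameters * n_training_tokens

def classify(n_parameters: float, n_training_tokens: float) -> str:
    flop = approx_training_flop(n_parameters, n_training_tokens)
    if flop >= SYSTEMIC_RISK_THRESHOLD_FLOP:
        return "GPAI model with systemic risk"
    if flop >= GPAI_THRESHOLD_FLOP:
        return "GPAI model"
    return "below the GPAI compute threshold"

# A ~1-billion-parameter model trained on ~2e13 tokens lands at
# 6 * 1e9 * 2e13 = 1.2e23 FLOP, just above the GPAI presumption.
print(classify(1e9, 2e13))    # GPAI model
print(classify(1e11, 2e13))   # GPAI model with systemic risk
```

Under this rough arithmetic, a one-billion-parameter model needs on the order of 10¹³ training tokens before it crosses the 10²³ FLOP line, which matches the guidelines’ "substantial datasets" framing.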
Key Takeaways:
Access the European Commission's full documents: Guidelines, Guidelines FAQ, GPAI Models FAQ.
Adopted on 24 July 2025
Unlike the voluntary guidance documents above, this template (document available here), developed under Article 53(1)(d) of the AI Act, establishes mandatory disclosure requirements for all GPAI providers. The template structures training data summaries into three comprehensive sections: general information (provider details, model characteristics, data modalities), list of data sources (including major datasets and detailed web-scraping disclosures), and relevant data processing aspects (copyright considerations and illegal content removal measures).
The template requires differentiated disclosure levels based on data source types. For web-scraped content, providers must disclose the top 10% of domain names by content volume scraped (5% for SMEs), while licensed datasets require only confirmation of licensing agreements and data modalities. The framework balances transparency with trade secret protection, explicitly allowing providers to withhold commercially sensitive information where appropriate.
Key Takeaways:
Access the European Commission's full documents: Template, Template FAQ.
The chapter was published as part of the GPAI Code of Practice. The full Code of Practice was published on 10 July 2025, and the Commission confirmed its adequacy on 1 August 2025.
The transparency chapter applies to all GPAI providers who sign the Code, with specific exceptions for free and open-source models without systemic risk.
This chapter establishes a standardized documentation framework through the Model Documentation Form, which consolidates all required transparency information into a single comprehensive document. The form covers licensing details, technical specifications, intended use cases, dataset information, compute and energy usage, and more. Documentation must be maintained for at least 10 years and made available to the AI Office and downstream providers within 14 days of request.
Key Takeaways:
Access the European Commission's full documents: Transparency Chapter, Model Documentation Form, GPAI Code of Practice FAQ, Signing GPAI Code of Practice FAQ
The chapter was published as part of the GPAI Code of Practice. The full Code of Practice was published on 10 July 2025, and the Commission confirmed its adequacy on 1 August 2025.
This chapter addresses Article 53(1)(c) requirements for GPAI providers to adopt copyright compliance policies. The chapter mandates comprehensive copyright policy frameworks that go beyond basic compliance to establish proactive protection mechanisms. Providers must develop and maintain up-to-date policies that clearly define internal accountability structures and demonstrate adherence to EU copyright law, including Article 4(3) of Directive (EU) 2019/790.
Key Takeaways:
Access the European Commission's full documents: Copyright Chapter, GPAI Code of Practice FAQ, Signing GPAI Code of Practice FAQ
The chapter was published as part of the GPAI Code of Practice. The full Code of Practice was published on 10 July 2025, and the Commission confirmed its adequacy on 1 August 2025.
This chapter applies exclusively to GPAI models with systemic risk, implementing Article 55 obligations for state-of-the-art safety and security measures.
The chapter establishes a comprehensive AI safety framework, requiring providers to develop cutting-edge Safety & Security Frameworks before model release. These frameworks must outline evaluation triggers, risk categories, mitigation strategies, forecasting methods, and organizational responsibilities, with regular updates responding to new risks or incidents.
Key Takeaways:
Access the European Commission's full documents: Safety & Security Chapter, GPAI Code of Practice FAQ, Signing GPAI Code of Practice FAQ
The EU AI Act is reshaping how GPAI models are developed and deployed. These obligations are relevant not only to current GPAI providers. If you are a so-called ‘downstream actor’ and make significant changes to an AI model you’re using, you may legally be considered a provider. Not every modification will trigger this designation — the change must be significant in terms of the model’s generality, capabilities, or systemic risk. It is thus important to understand how roles across the GPAI lifecycle are assigned and what legal obligations those roles carry.
👉 Follow Streamlex.eu for real-time updates, compliance checklists, and full access to official EU AI Act documents.
When do GPAI obligations under the EU AI Act begin?
GPAI obligations start on 2 August 2025. For models placed on the market after that date, enforcement and fines begin a year later, on 2 August 2026.
What qualifies as a GPAI model under the AI Act?
Any AI model trained at scale that can perform a wide range of distinct tasks across different domains and exhibits a significant degree of generality. A model qualifies as a general-purpose AI model if the computational resources used for its training (training compute) exceed 10^23 FLOP and it can generate language (text or audio), text-to-image, or text-to-video outputs.
What is a GPAI model with systemic risk under the AI Act?
A GPAI model is classified as having systemic risk if it meets one of two conditions: it has high-impact capabilities, presumed when its training compute exceeds 10^25 FLOP, or the Commission designates it as such.
Who needs to comply with the Code of Practice?
Any GPAI provider that signs the Code (with some exceptions for free and open-source projects) must comply with the Transparency, Copyright, and Safety & Security chapters. Signing the Code is voluntary. The list of signatories is available here.
I am not a GPAI provider, but I use AI models. Does this apply to me?
Yes, in some cases. If you are a so-called ‘downstream actor’ and make significant changes to an AI model you’re using, you may legally be considered a provider. Not every modification will trigger this designation — the change must be significant in terms of the model’s generality, capabilities, or systemic risk. More details are available in the European Commission’s latest guidelines, included in our resources above.