The European Parliament's recent approval of the AI Act marks a significant step in the regulation of artificial intelligence systems in the European Union. Among its pivotal provisions, providers of high-risk artificial intelligence (AI) systems must prepare comprehensive technical documentation before placing those systems on the market. This documentation is crucial for demonstrating compliance with the AI Act's requirements, for facilitating assessment by regulatory bodies, and for providing comfort to deployers. This post provides an overview of its main content and offers practical tips to help providers prepare.
In a landmark move, the European Parliament recently approved the AI Act, a comprehensive regulatory framework aimed at governing the development, deployment, and use of AI systems within the European Union. Final approval of the AI Act is expected in the coming weeks. For a general overview of the principles under the AI Act, see our impact analysis of the AI Act (part 1 and part 2).
Among its many provisions, Article 11 stands out as a key requirement for providers of high-risk AI systems: the obligation to draw up technical documentation.
General-purpose AI models are also subject to technical documentation obligations, but we will address those in a separate publication.
Article 11 of the AI Act requires the technical documentation of a high-risk AI system to be drawn up before the system is placed on the market or put into service. This documentation must not only be comprehensive but also be kept up to date throughout the lifecycle of the AI system. Its primary purpose is to demonstrate compliance with the requirements set out in the applicable sections of the AI Act and to provide clear and comprehensive information to national competent authorities and to deployers. The technical documentation is therefore a key component to be examined as part of the conformity assessment procedures required for high-risk AI systems. In addition, it is one of the factors to be considered when assessing the "intended purpose" of the AI system (a relevant aspect in scoping the obligations of high-risk systems falling under Annex III of the AI Act).
The technical documentation, as outlined in Annex IV of the AI Act, is structured to cover the various aspects needed to assess the AI system's compliance and to provide deployers with the information necessary to properly understand and manage the AI system. The documentation must include, among other elements:
- a general description of the AI system, including its intended purpose, versions, the hardware on which it is intended to run and the instructions for use;
- a detailed description of the elements of the AI system and of the process for its development, including design specifications, system architecture, data requirements, human oversight measures and the validation and testing procedures applied;
- detailed information about the monitoring, functioning and control of the AI system, and a description of the appropriateness of the chosen performance metrics;
- a detailed description of the risk management system;
- a description of relevant changes made to the system through its lifecycle;
- a list of the harmonised standards applied (or a description of the other solutions adopted to meet the requirements);
- a copy of the EU declaration of conformity; and
- a detailed description of the post-market monitoring plan.
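For providers that want to track internally whether each of these elements is covered, a lightweight checklist can help. The sketch below is a hypothetical illustration in Python, not a format required by the AI Act: the section labels paraphrase Annex IV, and the file name and helper function are invented for the example.

```python
# Illustrative sketch only: an internal checklist a provider might keep to track
# which Annex IV documentation items are already covered by an existing artefact.
# Section summaries paraphrase Annex IV; the structure is a hypothetical internal
# convention, not the authoritative legal text or a legally required format.
from dataclasses import dataclass


@dataclass
class DocItem:
    annex_iv_section: str          # e.g. "1" for the general description
    description: str               # what the item must cover
    artefact: str | None = None    # internal reference to the document covering it


ANNEX_IV_CHECKLIST: list[DocItem] = [
    DocItem("1", "General description of the AI system (intended purpose, versions, hardware, instructions for use)"),
    DocItem("2", "Detailed description of the system's elements and of its development process"),
    DocItem("3", "Information on monitoring, functioning and control of the system"),
    DocItem("4", "Appropriateness of the performance metrics"),
    DocItem("5", "Description of the risk management system"),
    DocItem("6", "Relevant changes made through the lifecycle"),
    DocItem("7", "Harmonised standards applied, or other solutions adopted"),
    DocItem("8", "Copy of the EU declaration of conformity"),
    DocItem("9", "Post-market monitoring plan"),
]


def missing_items(checklist: list[DocItem]) -> list[DocItem]:
    """Return the Annex IV items that no internal artefact has been assigned to yet."""
    return [item for item in checklist if item.artefact is None]


if __name__ == "__main__":
    # Example: mark one item as covered and list what is still outstanding.
    ANNEX_IV_CHECKLIST[0].artefact = "docs/system-overview-v1.2.pdf"  # invented file name
    for item in missing_items(ANNEX_IV_CHECKLIST):
        print(f"Annex IV section {item.annex_iv_section} still uncovered: {item.description}")
```

A tracker of this kind is also a natural place to record when each artefact was last reviewed, which supports the obligation to keep the documentation up to date throughout the system's lifecycle.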
The technical documentation shall be kept at the disposal of the national competent authorities for a period of ten years after the high-risk AI system has been placed on the market or put into service.
The technical documentation requirements set out in Article 11 of the AI Act echo the principles underlying the EU declaration of conformity and CE marking for products, emphasizing the need for comprehensive documentation to demonstrate compliance with regulatory standards and to enable efficient monitoring of operations and post-market monitoring. In the same way, the technical documentation serves as a means for AI providers to show adherence to the AI Act's rigorous requirements. Furthermore, the focus on describing the development process, the risk management system and post-market monitoring also aligns with broader trends in AI governance and with the documentation and accountability obligations under the GDPR.
Preparation of the technical documentation should therefore be carried out in conjunction with (or at least in a manner consistent with) the provider's AI governance policy and the documentation relating to the CE marking. For high-risk AI systems related to a product covered by the EU harmonisation legislation listed in Section A of Annex I, such as medical devices, toys or machinery, a single set of technical documentation may be drawn up combining the information required under the AI Act and under the respective EU legislation. This reflects the AI Act's aim of ensuring consistency, avoiding duplication and minimising additional burdens for providers that must undergo conformity assessment and comply both with the AI Act and with the existing EU legislation setting out the requirements for their products.
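Where a single set of documentation is used, it can be useful to record which framework each document addresses, so that coverage can be checked per framework. The short sketch below is a hypothetical illustration only; the framework labels and document titles are invented examples, not prescribed categories.

```python
# Hypothetical illustration: checking one combined documentation set against more
# than one framework (e.g. AI Act Annex IV and sectoral product legislation).
# All names below are invented examples.
from collections import defaultdict

# Each entry: (internal document title, frameworks whose requirements it addresses)
combined_index = [
    ("System overview and intended purpose", {"AI Act Annex IV", "MDR technical documentation"}),
    ("Risk management file", {"AI Act Annex IV", "MDR technical documentation"}),
    ("Post-market monitoring plan", {"AI Act Annex IV"}),
]

coverage: dict[str, list[str]] = defaultdict(list)
for title, frameworks in combined_index:
    for framework in frameworks:
        coverage[framework].append(title)

for framework, titles in coverage.items():
    print(f"{framework}: {len(titles)} document(s) -> {', '.join(titles)}")
```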
Complying with these requirements involves careful planning and execution by AI providers. Key steps that should be considered include:
Authored by Martin Pflueger, Juan Ramón Robles, and Cristina Baron.