[Image: Conceptual technology illustrated with artificial intelligence. Credit: kras99 / stock.adobe.com]
2025-05-28 expert contribution

Challenges for AI-based medical devices due to the EU Artificial Intelligence Act (AIA)


More than three years after the draft was presented, the European Artificial Intelligence Act (AIA) was published in the Official Journal of the European Union on July 12, 2024 as Regulation (EU) 2024/1689. The AIA entered into force on August 1, 2024, and from August 2, 2026, most of its provisions will apply to AI systems in the EU (Art. 113 AIA). Individual parts of the AIA have different dates of application:

Contact: Dr. Thorsten Prinz
  • Chapter I (General Provisions) and Chapter II (Prohibited AI Practices) from February 2, 2025,
  • Section 4 (Notifying Authorities and Notified Bodies) in Chapter III (High-Risk AI Systems), Chapter V (General-Purpose AI Models), Chapter VII (Governance) and Chapter XII (Penalties), with the exception of Art. 101 AIA (Fines for Providers of General-Purpose AI Models) and Art. 78 AIA (Confidentiality), from August 2, 2025, and
  • Art. 6 (1) AIA (Classification Rules for High-Risk AI Systems) and the corresponding obligations from August 2, 2027.

The AIA creates a uniform and horizontally effective legal framework, particularly for the development, marketing and use of artificial intelligence. The 144-page regulation is divided into 113 articles and 13 annexes. Implementing the AIA will require numerous activities in the member states and on the part of the EU Commission in the coming years.

According to Art. 6 (1) AIA, AI systems are classified as high-risk if they are stand-alone products or safety components of products that are covered by Regulation (EU) 2017/745 on medical devices (Medical Device Regulation, MDR) or Regulation (EU) 2017/746 on in vitro diagnostics (In Vitro Diagnostics Regulation, IVDR) and are subject to conformity assessment by third parties. Put simply, these are all AI-based medical devices in risk classes IIa to III and in vitro diagnostics in risk classes B to D.
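As an illustration, this classification rule can be sketched as a simple check. The function name and the risk-class encodings below are assumptions chosen for illustration only; they are not terms from the AIA, MDR or IVDR:

```python
# Illustrative sketch of the Art. 6 (1) AIA classification rule.
# MDR classes IIa-III and IVDR classes B-D require third-party conformity
# assessment, which makes the embedded AI system high-risk under the AIA.
MDR_THIRD_PARTY_CLASSES = {"IIa", "IIb", "III"}
IVDR_THIRD_PARTY_CLASSES = {"B", "C", "D"}

def is_high_risk_ai_medical_device(regulation: str, risk_class: str) -> bool:
    """Rough first-pass check: does the device's risk class trigger
    third-party conformity assessment (and hence high-risk status)?"""
    if regulation == "MDR":
        return risk_class in MDR_THIRD_PARTY_CLASSES
    if regulation == "IVDR":
        return risk_class in IVDR_THIRD_PARTY_CLASSES
    raise ValueError(f"unknown regulation: {regulation}")

# Example: a class IIa AI-based diagnostic-support app is high-risk,
# while a class I device is not.
print(is_high_risk_ai_medical_device("MDR", "IIa"))  # True
print(is_high_risk_ai_medical_device("MDR", "I"))    # False
```

A real classification decision of course requires a full legal assessment; this sketch only captures the simplified class-based rule of thumb stated above.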

In the following, we provide an overview of the AIA's content and explain its impact on medical device manufacturers.

Definitions and legal scope of application

According to Art. 3 (1) AIA, an AI system is defined as "a machine-based system that is designed to operate with varying levels of autonomy and that may exhibit adaptiveness after deployment, and that, for explicit or implicit objectives, infers, from the input it receives, how to generate outputs such as predictions, content, recommendations, or decisions that can influence physical or virtual environments". The definition was recently explained by the EU Commission in the document "Commission Guidelines on the definition of an artificial intelligence system established by Regulation (EU) 2024/1689 (AI Act)".

The AIA applies to the following groups and persons (Art. 2 AIA):

  • providers placing on the market or putting into service AI systems, or placing on the market general-purpose AI models, in the Union, irrespective of whether those providers are established or located within the Union or in a third country;
  • deployers of AI systems that have their place of establishment or are located within the Union;
  • providers and deployers of AI systems that have their place of establishment or are located in a third country, where the output produced by the AI system is used in the Union;
  • importers and distributors of AI systems;
  • product manufacturers placing on the market or putting into service an AI system together with their product and under their own name or trademark;
  • authorised representatives of providers which are not established in the Union;
  • affected persons that are located in the Union.

The other paragraphs of Art. 2 AIA contain restrictions on the scope of application, e.g. for military purposes.

In the context of the scope of application, the term "provider" is used instead of "manufacturer", and this is defined in Art. 3 (3) AIA as follows: "a natural or legal person, public authority, agency or other body that develops an AI system or a general-purpose AI model or that has an AI system or a general-purpose AI model developed and places it on the market or puts the AI system into service under its own name or trademark, whether for payment or free of charge". Furthermore, the "deployer" in the sense of a user is defined as a "natural or legal person, public authority, agency or other body using an AI system under its authority except where the AI system is used in the course of a personal non-professional activity" (Art. 3 (4) AIA). Consequently, both medical device manufacturers and clinics, doctors or other users of AI systems are affected by the AIA.

Distributors, importers, deployers or other third parties can also become providers if, for example, they make a substantial modification to a high-risk AI system that has already been placed on the market or put into service. Further details are set out in Art. 25 AIA.

Requirements in relation to medical devices

In contrast to the MDR and IVDR, the AIA does not contain the product requirements in an annex but describes them in detail in the legal text. Providers of high-risk AI systems are generally subject to the provisions of Art. 16 AIA. The following table lists the requirements that apply to medical devices as high-risk AI systems within the meaning of the AIA and, where available, the respective equivalents in the MDR:

| Requirements | Reference AIA | Reference MDR | Degree of consistency between AIA and MDR |
| --- | --- | --- | --- |
| Risk management system | Art. 9 | Art. 10 (2) | Mainly |
| Data and data governance | Art. 10 | -- | No |
| Technical documentation | Art. 11 | Art. 10 (4), Annex II/III | Partially |
| Mandatory automatic recording during operation | Art. 12 | -- | No |
| Transparency and provision of information to deployers | Art. 13 | Art. 10 (11) | Partially, in relation to instructions for use |
| Mandatory human oversight (human-machine interface) | Art. 14 | -- | Partially |
| Accuracy, robustness and cybersecurity | Art. 15 | Annex II/III | Partially |
| Labeling provisions | -- | Annex I | No |
| Quality management system | Art. 17 | Art. 10 (9) | Partially |
| Documentation keeping | -- | Art. 10 (8) | Partially |
| Automatic generation of logs | Art. 19 | -- | No |
| Conformity assessment | Art. 43 | Art. 10 (6) | Partially |
| EU declaration of conformity | Art. 47 | Art. 10 (6) | Partially |
| CE marking | Art. 48 | Art. 10 (6) | Mainly |
| Registration obligations | Art. 49 (1) | Art. 10 (7) | Partially |
| Necessary corrective actions and corresponding information | Art. 20 | Art. 10 (12) | Mainly |
| Demonstration of conformity to the national competent authority | -- | Art. 10 (14) | Partially |
| Additional obligations for deployers | Art. 26 | -- | No |
| Additional obligations for surveillance authorities | Art. 74 | Art. 93 | Partially, with the exception of the extensive powers under the AIA |
| Post-market surveillance and vigilance (providers) | Art. 72, 73 | Art. 10 (10, 12, 13) | Mainly |


Providers of AI-based medical devices have the option of integrating the implementation of the provisions of the AIA into their respective MDR processes and documentation (Art. 8 (2) AIA).

In principle, providers of high-risk AI systems must establish and maintain a risk management system and a quality management system (QMS) (Art. 9 and 17 AIA). For medical devices and IVDs, the respective requirements can be implemented in existing systems.

Art. 10 AIA specifies data and data governance requirements, and Art. 15 AIA essentially relates to the development and technical evaluation of AI systems. The contents of both articles largely overlap with the requirements of the joint questionnaire "Artificial Intelligence (AI) in Medical Devices" of IG-NB and Team-NB. It is therefore all the more important for providers (manufacturers) to consistently implement the requirements of the questionnaire and thus also be prepared for the AIA ("AIA Ready"). However, some of the requirements in the questionnaire go beyond the legal requirements, which are ultimately decisive.

For AI-based medical devices, the technical documentation demonstrating compliance with the legal requirements must comply with both Annex IV AIA and Annexes II and III MDR and must be kept available by the provider for a certain period of time (Art. 11, 18 AIA). For SMEs and start-ups, a simplified provision of the technical documentation in accordance with Annex IV AIA is provided for (Art. 11 (1)). However, a form must be used for this, which has yet to be provided by the European Commission. For medical devices and IVDs, a single set of technical documentation is to be created that simultaneously meets the requirements of the sector-specific legal acts and the AIA (Art. 11 (2)).

The safety of AI-based medical devices is the joint responsibility of providers and users. The AIA takes this into account with the following requirements:

  • Providers of high-risk AI systems must implement an "automatic recording of events [...] over the lifetime of the system" (logging) to ensure the traceability of system functions (processes and events) and post-market surveillance (Art. 12 AIA).
  • In Art. 13 AIA, the legislator provides for extensive transparency obligations. For "AI systems intended to interact directly with natural persons", it is also required that the "natural persons concerned are informed that they are interacting with an AI system" (Art. 50 (1) AIA). Mentioning AI technology as an operating principle in the intended purpose of a medical device should fulfill this requirement. A procedure for fulfilling transparency obligations was recently published (Prinz, 2024 and poster in the download area of this blog post).
  • Art. 26 AIA formulates separate requirements for deployers, and Art. 14 AIA requires mandatory human oversight.
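To make the logging obligation of Art. 12 AIA concrete, a minimal sketch of automatic event recording might look as follows. The class name, record fields and event types are assumptions for illustration, not a normative schema from the AIA:

```python
import json
import time

class EventLogger:
    """Minimal sketch of automatic event recording over a system's
    lifetime (Art. 12 AIA). Fields are illustrative, not normative."""

    def __init__(self):
        self.records = []

    def log(self, event_type: str, detail: dict) -> None:
        # Each event is recorded automatically with a timestamp so that
        # system behavior remains traceable after the fact.
        self.records.append({
            "timestamp": time.time(),   # when the event occurred
            "event_type": event_type,   # e.g. "inference", "human_override"
            "detail": detail,
        })

    def export(self) -> str:
        # Logs must be retained and retrievable, e.g. for post-market
        # surveillance and authority requests (Art. 18, 19 AIA).
        return json.dumps(self.records)

logger = EventLogger()
logger.log("inference", {"model_version": "1.2.0", "output": "finding_suspected"})
logger.log("human_override", {"user": "clinician_42", "accepted": False})
print(len(logger.records))  # 2
```

In practice, such logs would be written to tamper-evident persistent storage rather than kept in memory; the sketch only shows the recording principle.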

The legislator provides for certain retention periods for all documentation held by the provider (Art. 18 AIA). This also applies to the automatically generated logs (Art. 19 (1) AIA).

The application of harmonized standards (Art. 40 (1) AIA) by the provider of AI systems triggers a presumption of conformity with the respective legal requirements. A corresponding provision can also be found in Art. 8 MDR. In addition, the EU Commission reserves the right to issue common specifications (Art. 41 AIA), as already provided for in Art. 9 of the MDR. The published "Standardisation request to the European Committee for Standardisation and the European Committee for Electrotechnical Standardisation in support of Union policy on artificial intelligence" will ensure the creation of the corresponding standards.

In principle, providers of high-risk AI systems must undergo a conformity assessment procedure based either on internal control (Annex VI AIA) or on the assessment of the quality management system and the technical documentation by a Notified Body (Annex VII AIA) (Art. 43 (1)). The provider of a high-risk AI system is only permitted to use the less complex first procedure if it complies with harmonized standards and, where applicable, common specifications. If the high-risk AI system is a medical device, the manufacturer must undergo the relevant conformity assessment procedures in accordance with the MDR and include the specified requirements for high-risk AI systems in the assessment (Art. 43 (3)). Points 4.3, 4.4 and 4.5 as well as point 4.6 paragraph 5 of Annex VII must also be applied. The handling of changes to high-risk AI systems that have already undergone a conformity assessment procedure is regulated in Art. 43 (4) AIA. In the event of a substantial modification, these "shall be subject to a new conformity assessment procedure, irrespective of whether the modified system is to be placed on the market or continued to be used by the current operator". It also states: "For high-risk AI systems that continue to learn after being placed on the market or put into service, changes to the high-risk AI system and its performance that were pre-determined by the provider at the time of the initial conformity assessment and are included in the information in the technical documentation referred to in point 2(f) of Annex IV shall not be considered a substantial change". Providers of these AI systems must pay particular attention to eliminating or minimizing the risk of "possibly biased outputs" and ensuring that such feedback loops are adequately addressed with appropriate risk mitigation measures (Art. 15 (4) AIA).
The determination of the changes to the high-risk AI system and its performance at the time of the initial conformity assessment, and the identification of the associated risks, could be carried out according to the procedure recently presented in a VDE-DGBMT recommendation.
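The "pre-determined changes" logic of Art. 43 (4) AIA can be illustrated with a minimal sketch: a change falls outside the scope of a new conformity assessment only if it was declared in advance and stays within its declared limits. The change types, field names and thresholds below are hypothetical examples, not regulatory terms:

```python
# Hypothetical change envelope declared at initial conformity assessment
# (cf. point 2(f) of Annex IV AIA). Keys and limits are illustrative only.
PREDETERMINED_CHANGES = {
    "retraining_on_new_site_data": {"max_accuracy_drop": 0.02},
}

def is_substantial_change(change_type: str, accuracy_drop: float) -> bool:
    """A change is substantial (triggering a new conformity assessment)
    if it was not pre-determined, or exceeds its declared limits."""
    envelope = PREDETERMINED_CHANGES.get(change_type)
    if envelope is None:
        return True  # not pre-determined -> substantial change
    return accuracy_drop > envelope["max_accuracy_drop"]

# A declared retraining within its performance envelope is not substantial;
# an undeclared change of any kind is.
print(is_substantial_change("retraining_on_new_site_data", 0.01))  # False
print(is_substantial_change("new_intended_purpose", 0.0))          # True
```

A real change envelope would cover far more than a single accuracy metric (data distribution, intended purpose, risk controls); the sketch only shows the decision structure.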

For high-risk AI systems that are medical devices, a single EU declaration of conformity "shall be drawn up in respect of all Union law applicable to the high-risk AI system" (for example, in the case of software as a medical device, the MDR and the AIA). The declaration contains "all the information required to identify the Union harmonisation legislation to which the declaration relates" (Art. 47 (3) AIA).

Similar to the regulatory life cycle of medical devices, the provider obligations for AI systems do not end with the placing on the market but continue in the context of post-market surveillance and vigilance. Accordingly, Art. 72 AIA requires the creation of a post-market surveillance plan, and serious incidents must be reported by the provider to the competent authorities (Art. 73 AIA).

Applicability to legacy high-risk AI systems

High-risk AI systems that were placed on the market or put into service before the date of application of the AIA only have to comply with the provisions of the AIA if "those systems are subject to significant changes in their designs" (Art. 111 (2)).

General-purpose AI models

A general-purpose AI system is defined as "an AI system which is based on a general-purpose AI model and which has the capability to serve a variety of purposes, both for direct use as well as for integration in other AI systems" (Art. 3 (66) AIA). Section 2 of Chapter V AIA contains general obligations for providers of general-purpose AI (GPAI) models, namely the creation of technical documentation, the provision of information to other manufacturers who wish to integrate a GPAI model into their own AI system, the creation of a copyright policy based on the relevant legislation, and the publication of a summary of the training data used. In addition, for GPAI models with systemic risks, model evaluations including adversarial testing, cybersecurity measures, the assessment and mitigation of systemic risks, and the reporting of serious incidents are required.

AI regulatory sandboxes

The establishment of AI regulatory sandboxes is intended to enable manufacturers to develop, train, test and validate AI systems for a certain period of time before they are placed on the market (Art. 57 AIA). This is intended to promote innovation and competitiveness and to facilitate market access for start-ups and small and medium-sized enterprises (SMEs). In addition, under certain conditions, providers can test their systems under real-world conditions outside the sandboxes. These conditions include authorization by the competent national authority on the basis of a test plan submitted in advance by the manufacturer. Future implementing acts of the European Commission are expected to define the detailed arrangements for the establishment and operation of AI regulatory sandboxes in the individual member states (Art. 58 AIA).

Summary and recommendations

Manufacturers of AI-based medical devices must quickly take into account the increased effort in the technical documentation and the extended conformity assessment in terms of both organizational and financial resources. In this regard, we recommend participating in our hands-on training course "Artificial Intelligence (AI) in Medical Devices", which we are also happy to conduct as an in-house event at your premises. Furthermore, any relevant guidelines for the AIA and MDR that are published in the future should be observed.
