
MDCG published new guidance on the interplay between the MDR & IVDR and the AI Act


On 19 June 2025, the Medical Device Coordination Group (“MDCG”) published new guidance on the interplay between the MDR & IVDR and the AI Act (MDCG 2025-6). The document provides a first set of answers to the most frequently asked questions related to the joint application of the AI Act and the MDR and IVDR for manufacturers of AI systems used for medical purposes (“MDAI”).

We have summarised below the key takeaways from this Q&A.

The MDCG guidance clarifies a few points:

  • Manufacturer vs provider and user vs deployer: The MDCG clarifies that the references to “manufacturer” within the meaning of the MDR/IVDR should be interpreted as referring to “provider” in accordance with the AI Act. It also explains that the concept of “deployer” under the AI Act should not be understood as referring to “user” under the MDR/IVDR (healthcare professionals and lay persons who use the device).
  • Annex XVI devices (products without an intended medical purpose): The MDCG confirms that the AI Act could also apply to Annex XVI products and accessories.
  • Distinction between a high-risk AI system under the AI Act and a high-risk device under the MDR/IVDR: The MDCG confirms that the classification of an AI system as high-risk under the AI Act does not imply that the medical device or IVD falls into a higher risk class under the MDR/IVDR.

The MDCG document also provides key guidance on:

  • Scope of application of the AI Act and its interplay with the MDR/IVDR. An MDAI is considered a high-risk AI system under Article 6(1) of the AI Act if it meets both of the following conditions:
    • the MDAI is a safety component of a medical device, or the AI system is itself a medical device; and
    • the MDAI is subject to a third-party conformity assessment by a notified body in accordance with the MDR/IVDR.

    The MDCG provides this table to illustrate which devices fall under the scope of Article 6(1) of the AI Act.

    | Classification | Notified Body Involved? | AI Act High-Risk (Art. 6(1)) conditions fulfilled? |
    |---|---|---|
    | MDR Class I (non-sterile, non-measuring, non-reusable surgical) | No | No |
    | MDR Class I (sterile, measuring, reusable surgical) | Yes | Yes |
    | MDR Class IIa, IIb, III | Yes | Yes |
    | MDR Annex XVI | Yes | Yes |
    | IVDR Class A (non-sterile) | No | No |
    | IVDR Class A (sterile) | Yes | Yes |
    | IVDR Class B, C, D | Yes | Yes |
    | In-house device according to Art. 5(5) MDR* | No | No |
*The MDCG document clarifies that in-house devices are not high-risk AI systems under Article 6(1) of the AI Act unless assessed by a notified body.

  • Management systems:
    • Life cycle management. The MDR/IVDR requires manufacturers to ensure that their MDAIs remain safe and performant throughout their use. The AI Act reinforces this by requiring continuous oversight and post-market monitoring of high-risk AI systems. For example, this includes design choices allowing natural persons to oversee the functioning of high-risk MDAI, to ensure that they are used as intended, and to ensure that their impacts are addressed over the system’s lifecycle.
    • QMS. The MDCG encourages manufacturers to include the elements of the quality management system provided by the AI Act as part of the existing quality management system provided by the MDR/IVDR. The AI Act requirements are complementary to the quality management system required under the MDR/IVDR and specifically target the AI system. Additional requirements, such as data governance, record-keeping, transparency and human oversight (non-exhaustive list), must be integrated as appropriate.
    • Risk management. Both the MDR/IVDR and the AI Act require a risk management system that is a continuous, iterative process throughout the entire lifecycle of the device, covering both the pre-market and post-market phases. The MDCG encourages manufacturers of high-risk MDAI to integrate the additional MDAI-specific risk management requirements, testing and reporting processes, and information and documentation required under the AI Act into their existing documentation and procedures under the MDR/IVDR.
  • Data Governance:
    • Data sets. The MDCG provides that high-quality data sets for training, validation and testing of MDAI require the implementation of appropriate data governance and management practices. According to the MDCG guidance, the requirements related to data governance can be complied with by having recourse to third parties that offer certified compliance services, including verification of data governance. When training, validating and testing high-risk MDAI, manufacturers must ensure that the datasets are sufficiently representative of the target population and that, in relation to the performance study population, specifications on selection criteria and decisions on the size of the performance study population are defined.
    • Unwanted biases. Manufacturers should implement procedures ensuring data transparency and integrity and examine the data in view of biases. This includes addressing potential biases that could compromise health and safety, infringe on fundamental rights, or lead to unlawful discrimination, especially in systems where output data may influence future operations. Manufacturers must put in place measures to detect, prevent, and mitigate such biases. Additionally, the AI Act requires technical capabilities for automatic logging of events throughout the high-risk MDAI’s lifecycle. These logging and record-keeping requirements support traceability and help identify risks, such as bias arising from training, validation, or testing datasets, either during initial development or due to significant modifications post-deployment.
    • Training, validation and testing data used for high-risk MDAI. The MDCG document outlines that the validation of training data used for MDAI is critical and should be demonstrated as part of the studies to ensure the accuracy, reliability, and effectiveness of the high-risk MDAI.
  • Technical Documentation: Both the MDR/IVDR and the AI Act require the provision of comprehensive technical documentation for high-risk MDAI. The MDCG recommends a single set of technical documentation for high-risk MDAI. Moreover, the MDCG document also provides that the sampling rules of the governing conformity assessment procedure (laid down in MDCG 2019-13 (Guidance on sampling of MDR Class IIa / Class IIb and IVDR Class B / Class C devices for the assessment of the technical documentation)) remain applicable.
  • Transparency and human oversight:
    • Framework. The MDCG provides that the MDR/IVDR and the AI Act collectively support a coherent regulatory framework that ensures MDAIs are designed, documented, and deployed in a transparent and explainable manner. According to the MDCG, transparency contributes to explainability which in turn facilitates accountability.
    • User-centric design. The MDCG recommends that AI systems, especially those in healthcare, be designed with user-centric principles. Documentation of usability engineering processes and outcomes is required under both the MDR/IVDR and the AI Act.
    • Human oversight. The MDCG also notes that appropriate human oversight should be viewed as a risk management measure. Manufacturers are expected to follow this approach in order of priority: (a) eliminate or minimise risks as far as possible through safe design and manufacturing, (b) implement protective measures, such as alarms, where risks cannot be fully eliminated, and (c) provide relevant safety information. Additionally, the MDCG provides that manufacturers of MDAI need to carefully consider how much human oversight is needed, based on the level of risk of the device. For example, in robotic-assisted surgery, a healthcare professional might supervise the procedure and be able to step in or override the AI system if needed. However, the software should not allow human intervention in very critical moments if doing so could put the patient at greater risk. These oversight measures must be clearly thought through and included in the device’s risk assessment and safety planning.
  • Accuracy, Robustness and Cybersecurity: The MDCG clarifies how to implement cybersecurity by design and how AI-specific threats (e.g., model poisoning, adversarial attacks) must be addressed.
  • Clinical/performance Evaluation and Testing: The MDCG clarifies that although the AI Act does not specifically use the terms “clinical evaluation” or “performance evaluation”, it provides criteria such as accuracy, robustness, and cybersecurity for high-risk MDAI. These criteria are integral components of performance evaluation under the MDR/IVDR. The MDCG guidance also explains that where high-risk MDAI undergoes a clinical investigation or performance study under the MDR/IVDR, this also qualifies as real-world testing under the AI Act (Article 60). Even though the AI Act does not apply to AI systems that are not yet placed on the market or put into service, it introduces a specific provision for Annex III high-risk AI systems which provides that, under certain conditions, these systems may undergo testing in real-world conditions prior to their placing on the market or putting into service. The AI Act explicitly provides that such real-world testing is permitted “without prejudice to Union or national law on the testing in real-world conditions of high-risk AI systems that are medical devices or in vitro diagnostic medical devices”.
  • Conformity assessment: The MDCG clarifies when to apply which conformity assessment procedures and explains interactions between AI Act and MDR/IVDR assessments, especially in cases of overlapping conformity obligations. For example, if an AI system used for healthcare, such as an emergency triage system, qualifies as MDAI, then the conformity assessment procedure pursuant to Article 6(1) of the AI Act is applicable (see above). AI systems that qualify as high-risk AI systems under Article 6(1) of the AI Act must comply with both the MDR/IVDR and the AI Act.
  • Substantial modifications/significant change:
    • According to the MDCG, changes to high-risk MDAI that have been pre-determined by the manufacturer, assessed at the moment of the initial conformity assessment, and included in the technical documentation referred to in AI Act Annex IV point 2(f) do not constitute a substantial modification under the AI Act. Such pre-determined changes should also not be understood as a change to the certified medical device or IVD under MDR Annex IX Section 4.10 and IVDR Annex IX Section 4.11. It is essential that, at the time of conformity assessment, the pre-determined changes are clearly specified and adaptable to the device's evolving nature. The technical documentation should include a detailed description of the initial performance expectations alongside mechanisms for validating and managing changes that occur post-market.
    • The MDCG also provides guidance on scenarios where high-risk MDAI that are already placed on the market will undergo significant design changes before the end of the transitional period (2 August 2027). According to the MDCG,
      • If the medical device has been placed on the market/put into service before 2 August 2027:
        • If the AI system is subject to any significant changes in its design on or after 2 August 2027, the AI Act requirements apply, including obligations for “Annex I high-risk AI systems”.
        • If the AI system is subject to any significant changes in its design before 2 August 2027, the obligations for “Annex I high-risk AI systems” do not yet apply.
      • If the medical device is placed on the market/put into service on or after 2 August 2027, then the AI Act requirements apply, including obligations for “Annex I high-risk AI systems”.
  • Post-market monitoring: Both the MDR/IVDR and the AI Act oblige the manufacturer to establish and document a post-market monitoring system that is risk-based, clearly established in the quality management system, and verifies that the device continues to operate as intended. The new requirements introduced by the AI Act include, where applicable, the need to detect interactions with other AI systems, such as devices and software, and the obligation for deployers to monitor the system’s operation and, where relevant, inform the manufacturer.

Please contact our team if you have any questions about this guidance and how it applies to your AI system.


Authored by Hélène Boland and Fabien Roy. 
