Navigating the Dual Regulatory Framework: MDR/IVDR
and the EU Artificial Intelligence Act
For manufacturers of AI-enabled medical devices in the European market, understanding the interplay between the Medical Devices Regulation (MDR), In
Vitro Diagnostic Regulation (IVDR), and the new Artificial Intelligence Act (AIA) is critical for successful market access. MDCG 2025-6 provides essential
guidance on meeting these dual compliance obligations.
Understanding MDCG 2025-6: A Joint Regulatory Framework
In June 2025, the Medical Device Coordination Group (MDCG) and the Artificial Intelligence Board (AIB) jointly endorsed MDCG 2025-6, a landmark
guidance document that clarifies the complex intersection between existing medical device regulations and the new AI-specific requirements.
This FAQ-style guidance addresses a critical regulatory gap: while the MDR (2017/745) and IVDR (2017/746) already regulate software with a medical
purpose, they don't explicitly address AI-specific risks such as continuous learning, algorithmic bias, or impacts on fundamental rights. The Artificial
Intelligence Act (Regulation (EU) 2024/1689) complements these regulations by imposing targeted obligations specifically for high-risk AI systems.
MDR/IVDR Focus
Safety, performance, and clinical evidence standards for medical
devices, including software as a medical device
AIA Focus
AI-specific risk management, transparency, data governance, and
human oversight requirements
The dual framework creates the world's most advanced regulatory ecosystem for AI-enabled medical technologies, prioritizing patient safety while
fostering innovation.
Scope and High-Risk Classification
Understanding when and how the Artificial Intelligence Act applies to
medical devices is the first critical step for manufacturers. According to
MDCG 2025-6, the AIA applies when software qualifies as a medical
device AI system under MDR/IVDR.
Under Article 6(1) of the AIA, a medical device AI system is classified as
high-risk if both of the following conditions are met:
It is a safety component of a medical device, or is itself a medical device
The device is subject to third-party conformity assessment under MDR/IVDR
Important Note: The high-risk designation under the AIA does
not change the MDR/IVDR risk classification of the device.
Instead, the MDR/IVDR classification determines whether the
device falls within the high-risk scope of the AI Act.
This means manufacturers must first determine their device's MDR/IVDR
classification, which then informs whether additional AIA requirements
apply. This approach ensures regulatory coherence while adding AI-
specific safeguards.
Integrated Management Systems and Risk Assessment
One of the most significant challenges for manufacturers is implementing the dual regulatory requirements efficiently. MDCG 2025-6 provides a
pragmatic approach that allows for integration rather than duplication.
MDR/IVDR QMS
Lifecycle safety management, quality system
requirements, and vigilance obligations
AIA Requirements
Continuous risk management covering health,
safety, and fundamental rights impacts
Integrated Approach
Manufacturers may incorporate AIA
requirements into existing MDR/IVDR quality
management systems
The guidance emphasizes that manufacturers should expand their existing risk management processes to include AI-specific considerations such as
algorithmic bias, transparency, and impacts on fundamental rights. This integrated approach reduces redundancy while ensuring comprehensive coverage
of all regulatory requirements.
Companies should document how their management systems address both sets of requirements, with particular attention to the points of intersection
and any AI-specific controls implemented.
Data Governance and Bias Mitigation
Data quality and governance are cornerstones of both regulatory
frameworks, but the AIA introduces more stringent and specific
requirements focused on bias prevention and dataset management.
While MDR/IVDR already require robust and representative clinical data,
the Artificial Intelligence Act adds explicit obligations related to:
Quality of training, validation, and testing datasets
Freedom from discriminatory bias
Transparent data governance practices
Continuous monitoring for data drift
Manufacturers must establish comprehensive data governance protocols that track data provenance, document preprocessing techniques, and validate
dataset representativeness. MDCG 2025-6 emphasizes that these processes should be integrated into the broader clinical or performance evaluation
strategies required by MDR/IVDR.
Post-market, continuous monitoring and logging are mandated to detect any data drift or emerging bias that could affect device performance or safety.
This ongoing vigilance represents a significant advancement in medical device regulation.
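Neither MDCG 2025-6 nor the AIA prescribes a particular drift metric, but the continuous monitoring obligation is often operationalized with distribution-comparison statistics. As a minimal sketch, the snippet below computes the population stability index (PSI) for one input feature; the bin count, the 0.25 alert threshold, and the sample values are illustrative assumptions, not regulatory requirements.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a reference (training-era)
    sample and a production sample of the same feature."""
    lo, hi = min(expected), max(expected)
    edges = [lo + (hi - lo) * i / bins for i in range(bins + 1)]
    edges[-1] = float("inf")  # catch production values above the training range

    def frac(values, i):
        n = sum(1 for v in values if edges[i] <= v < edges[i + 1])
        return max(n / len(values), 1e-6)  # floor avoids log(0) on empty bins

    return sum(
        (frac(actual, i) - frac(expected, i))
        * math.log(frac(actual, i) / frac(expected, i))
        for i in range(bins)
    )

# Illustrative check: a clearly shifted production sample should trip the alert.
reference = [0.1 * i for i in range(100)]         # training-era feature values
production = [0.1 * i + 3.0 for i in range(100)]  # same feature, drifted upward

drift_alert = psi(reference, production) > 0.25   # 0.25 is a commonly used alert level
```

In practice such a check would run per feature on a schedule, with alerts feeding the vigilance and corrective-action processes already required under MDR/IVDR.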
Technical Documentation Requirements
Creating comprehensive technical documentation that satisfies both regulatory frameworks represents a significant challenge for manufacturers. MDCG
2025-6 clarifies that a single, integrated technical file should be maintained.
MDR/IVDR Documentation
Software design and architecture
Risk management file
Performance validation
Clinical evidence
Post-market surveillance plan
Additional AIA Documentation (Article 11)
AI-specific risk assessments
Data governance documentation
Transparency measures
Human oversight mechanisms
Bias monitoring protocols
The guidance recommends organizing documentation to clearly demonstrate compliance with both sets of requirements, with cross-references where
appropriate to avoid duplication. Notified Bodies will assess this comprehensive documentation as part of a single conformity assessment procedure.
Manufacturers should develop standardized templates that incorporate all required elements from both regulations to streamline documentation
processes and ensure nothing is overlooked.
Transparency, Human Oversight, and Usability
The Artificial Intelligence Act introduces heightened requirements for transparency and human oversight that complement the usability engineering
requirements already present in MDR/IVDR.
According to MDCG 2025-6, manufacturers must:
Design systems that allow healthcare professionals to correctly interpret AI outputs
Inform users when they are interacting with an AI system (unless obvious)
Provide clear instructions regarding AI limitations and potential risks
Implement human oversight mechanisms appropriate to the clinical context
These transparency requirements must be incorporated into the device design and user interface, not just documented in the instructions for use.
Manufacturers should conduct specific usability studies to verify that healthcare professionals can appropriately understand and, when necessary,
override AI-generated recommendations.
Together, MDR/IVDR and the AIA establish a comprehensive framework that ensures AI systems remain transparent tools that augment, rather than
replace, clinical judgment.
Cybersecurity and Robustness Requirements
Cybersecurity has become increasingly critical for medical devices, and the dual regulatory framework significantly strengthens these requirements,
particularly for AI-enabled devices.
MDR/IVDR Cybersecurity
Protection against unauthorized access and conventional
cybersecurity risks as part of essential safety requirements
AIA-Specific Cybersecurity
Additional obligations to address AI-specific vulnerabilities such as:
Adversarial attacks targeting the AI model
Model poisoning during training
Data manipulation that could skew outputs
Resilience to unexpected inputs
MDCG 2025-6 emphasizes that cybersecurity considerations must be integrated into both quality management systems and risk management processes.
Manufacturers should conduct specific testing for AI robustness, including adversarial testing to verify that the system remains safe even when faced
with unexpected or manipulated inputs.
The guidance recommends ongoing monitoring for emerging AI-specific vulnerabilities and establishing mechanisms for rapid security updates when
necessary, while maintaining compliance with change management requirements.
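Adversarial testing of a real model typically relies on gradient-based attack libraries; as a minimal, library-free sketch of the underlying idea, the check below perturbs each test input with small random noise and flags any sample whose prediction flips. The `predict` stand-in, the `epsilon` tolerance, and the trial count are illustrative assumptions, not values from MDCG 2025-6.

```python
import random

def predict(features):
    """Stand-in for a device's inference function (illustrative only):
    a fixed linear score thresholded into a binary finding."""
    weights = [0.4, -0.2, 0.7]
    score = sum(w * x for w, x in zip(weights, features))
    return 1 if score > 0.5 else 0

def robustness_test(inputs, epsilon=0.01, trials=20, seed=0):
    """Flag any sample whose prediction flips under small random
    perturbations (a crude proxy for adversarial sensitivity)."""
    rng = random.Random(seed)
    unstable = []
    for sample in inputs:
        baseline = predict(sample)
        for _ in range(trials):
            noisy = [x + rng.uniform(-epsilon, epsilon) for x in sample]
            if predict(noisy) != baseline:
                unstable.append(sample)
                break
    return unstable

# Samples far from the decision boundary should be stable at this epsilon.
test_set = [[1.0, 0.5, 0.9], [0.0, 0.0, 0.0]]
flagged = robustness_test(test_set)
```

A finding of instability near clinically relevant inputs would feed back into the risk management file rather than being treated as a pass/fail test in isolation.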
Clinical/Performance Evaluation and Conformity Assessment
Comprehensive Evaluation Approach
The MDR/IVDR framework requires thorough clinical evaluation (MDR) or
performance evaluation (IVDR) to demonstrate device safety and
effectiveness. The AIA enhances these requirements with obligations to
validate:
AI pipeline integrity and reproducibility
Algorithm robustness across varied inputs
Transparency of decision-making processes
Impacts on fundamental rights and potential biases
Adaptive AI and Substantial Modifications
A critical consideration for AI-enabled medical devices is how to handle
updates and learning algorithms. MDCG 2025-6 clarifies that:
Manufacturers must prepare pre-determined change control plans for
adaptive AI systems
Substantial modifications under AIA Article 3(23) require a new
conformity assessment unless changes were pre-approved
The conformity assessment procedure is defined by MDR/IVDR but
must incorporate relevant AIA requirements
This approach allows for innovation through adaptive AI while maintaining regulatory oversight. Manufacturers should work closely with Notified Bodies
to establish clear boundaries for permitted adaptations versus those requiring reassessment.
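One way to make the boundary between pre-approved adaptations and reassessment-triggering changes auditable is to capture the change control plan in machine-readable form. The structure below is purely a sketch under assumed field names and thresholds; neither MDCG 2025-6 nor the AIA prescribes this format.

```python
# Illustrative pre-determined change control plan (all field names and
# thresholds are hypothetical, chosen only to show the structure).
CHANGE_CONTROL_PLAN = {
    "permitted_changes": {
        "retraining_on_new_data": {
            "max_performance_drop": 0.02,  # tolerated drop in validation accuracy
            "locked_architecture": True,   # model topology may not change
        },
    },
    "requires_new_conformity_assessment": [
        "new_intended_purpose",
        "architecture_change",
        "new_input_modality",
    ],
}

def change_is_pre_approved(change_type, metrics):
    """Return True only if a proposed change falls inside the pre-approved envelope."""
    if change_type in CHANGE_CONTROL_PLAN["requires_new_conformity_assessment"]:
        return False
    rules = CHANGE_CONTROL_PLAN["permitted_changes"].get(change_type)
    if rules is None:
        return False  # anything not listed defaults to reassessment
    return metrics.get("performance_drop", 1.0) <= rules["max_performance_drop"]

ok = change_is_pre_approved("retraining_on_new_data", {"performance_drop": 0.01})
```

The useful property of this shape is the default: any change not explicitly listed as permitted falls through to a new conformity assessment.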
Post-Market Monitoring and Strategic Recommendations
Post-market surveillance takes on heightened importance under the dual regulatory framework, with additional AI-specific monitoring requirements
layered onto existing MDR/IVDR obligations.
Enhanced Monitoring Requirements
The AIA mandates monitoring of interactions between multiple AI
systems and requires manufacturers to implement comprehensive
logging capabilities. The European Commission will publish a
harmonized post-market monitoring template by 2026.
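The logging obligation is commonly implemented as an append-only record of inference events that post-market reviewers can replay. The sketch below shows one such record in JSON-lines form; the field names and values are illustrative assumptions, not a schema from the AIA or MDCG 2025-6.

```python
import io
import json
import time

def log_inference(stream, *, model_version, input_hash, output, operator_id):
    """Append one inference event as a JSON line to an append-only stream,
    so reviewers can reconstruct what the system produced, when, and for whom."""
    event = {
        "timestamp": time.time(),
        "model_version": model_version,
        "input_hash": input_hash,    # a hash rather than raw data, for privacy
        "output": output,
        "operator_id": operator_id,  # supports human-oversight audits
    }
    stream.write(json.dumps(event) + "\n")
    return event

# Illustrative use, with an in-memory stream standing in for a log file.
log = io.StringIO()
log_inference(log, model_version="1.4.2", input_hash="sha256:ab12",
              output={"finding": "nodule", "confidence": 0.91},
              operator_id="clinician-017")
records = [json.loads(line) for line in log.getvalue().splitlines()]
```

In a real device the stream would be tamper-evident storage retained for the period the applicable regulations require, not an in-memory buffer.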
Strategic Recommendations for Manufacturers
To successfully navigate this complex regulatory landscape,
manufacturers should:
Integrate AIA obligations into existing MDR/IVDR quality
management systems
Develop comprehensive documentation aligned with both
regulatory regimes
Establish pre-determined change control protocols for adaptive
AI systems
Strengthen cybersecurity and data governance with AI-specific
controls
Invest in AI literacy and training as required under AIA Article 4
While complex, this dual framework ultimately reinforces patient safety, transparency, and trust in AI-driven medical technologies. Manufacturers who
embrace these requirements as opportunities for differentiation rather than mere compliance burdens will be well-positioned to lead in this evolving
regulatory landscape.
