Cyber Resilience Act vs. medical devices: An exception with built-in risk?
The Cyber Resilience Act (CRA, Regulation (EU) 2024/2847) forms the new horizontal EU legal framework for the cybersecurity of products with digital elements. According to the wording of the Act, the CRA applies in principle to all networked or software-based products placed on the market or put into service in the EU. At the same time, however, Article 2(2) of the Regulation contains a key exemption provision: according to this, the CRA does not apply to products that fall within the scope of more specific EU legal acts, provided that these legal acts already lay down cybersecurity requirements. These specific legal frameworks explicitly include medical devices under Regulation (EU) 2017/745 (MDR) and in vitro diagnostic medical devices under Regulation (EU) 2017/746 (IVDR). The explanatory memorandum and the text of the regulation make it clear that medical devices are excluded from the horizontal CRA due to their already dense regulatory framework, including requirements for software development, risk management, information security, and post-market surveillance.
At the same time, Article 2(5) of the CRA contains a provision that opens up an area of tension: it stipulates that the CRA shall not apply to products covered by other EU legislation only if the sectoral regulation ensures “the same or a higher level of protection” and there is no regulatory conflict. This passage makes it clear that the exclusion is neither automatic nor indiscriminate, but requires a substantive legal assessment of whether the requirements of the sectoral framework actually cover the protection objectives of the CRA in full. While Article 2(2) formally provides for a clear exclusion, Article 2(5) qualifies it through a function-related consistency check: as soon as digital components, remote services, or software-based functions of a medical device touch on aspects that the CRA regulates in more detail than the MDR or IVDR, the question arises as to whether CRA requirements could nevertheless remain relevant for these elements.
This results in a twofold tension: on the one hand, Article 2(2) establishes a clear legislative priority for medical device law, meaning that medical devices as a category are formally excluded from the CRA. On the other hand, Article 2(5) requires a substantive equivalence assessment of the relevant levels of cyber protection, which in practice is particularly relevant for networked or cloud-based medical devices. Modern software as a medical device (SaMD), digital health applications, AI-supported diagnostic systems, and devices with remote update mechanisms increasingly have technical features that are more structured and standardized in the CRA than in the MDR regime, for example with regard to vulnerability handling, SBOM obligations, or cybersecurity support periods.
For manufacturers, this means that although medical devices remain formally excluded from the scope of application on the basis of Article 2(2) CRA, it must be verified in light of Article 2(5) and, if necessary, demonstrated that the MDR/IVDR obligations fully achieve the CRA level of protection. Where this is not guaranteed, a regulatory gray area arises, which in practice must be closed by technical documentation, internal risk analyses, and possibly supplementary vulnerability management processes. Despite the formal exemption, the CRA thus opens up a new field of analytical obligation: manufacturers must actively demonstrate that their products already offer a level of protection that complies with the CRA due to sectoral regulation, particularly in those digital functional areas that go beyond classic medical device requirements.
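Such an equivalence assessment can be documented in a simple, structured way. The following Python sketch shows one possible form of an internal CRA-vs-MDR gap analysis; the listed protection goals and the coverage judgements are purely illustrative assumptions, not an official or complete regulatory mapping:

```python
# Illustrative sketch of a CRA-vs-MDR/IVDR gap analysis record.
# The protection goals and coverage judgements below are example
# assumptions, not an official regulatory mapping.

CRA_GOALS = [
    "vulnerability handling process",
    "software bill of materials (SBOM)",
    "security update support period",
    "secure-by-default configuration",
]

def gap_analysis(coverage):
    """Return CRA protection goals not yet covered by sectoral documentation.

    `coverage` maps each goal to True (covered by MDR/IVDR documentation)
    or False (gap requiring supplementary measures).
    """
    return [goal for goal in CRA_GOALS if not coverage.get(goal, False)]

# Example assessment for a hypothetical networked device:
coverage = {
    "vulnerability handling process": True,   # e.g. covered via post-market surveillance
    "software bill of materials (SBOM)": False,
    "security update support period": False,
    "secure-by-default configuration": True,
}

gaps = gap_analysis(coverage)
print(gaps)  # goals that still need supplementary documentation
```

In practice, each gap found this way would be closed by technical documentation, internal risk analyses, or supplementary vulnerability management processes, as described above.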
The CRA does not mark an immediate paradigm shift for medical devices, but it does mark a shift toward greater transparency and cyber resilience documentation as soon as technological or functional overlaps arise. The combination of formal exemption under Article 2(2) and substantive consistency testing under Article 2(5) makes the CRA a relevant reference framework, especially for modern, networked, and software-dominated medical devices that increasingly exhibit characteristics addressed in detail by the CRA.
EU tightens rules on toy safety – impulses for the regulation of medical devices?
The European Union is significantly tightening its requirements for toy safety. The reasons for this include the growth of online trade, including via suppliers outside the EU, and the increasing use of digital technology in toys. Member States have agreed on new regulations that are intended to better protect children from chemical, physical, and digital risks and to strengthen market surveillance.
One focus of the new rules is on stricter requirements for hazardous chemicals and comprehensive safety assessments. The existing ban on carcinogenic, mutagenic, and reprotoxic substances will be extended to include other particularly harmful substances, accompanied by additional restrictions on allergenic fragrances. In future, manufacturers will have to assess all relevant risks before placing products on the market – from chemical, physical, mechanical, and electrical hazards to flammability, hygiene, and radioactivity, and even whether digital toys could impair children’s mental development.
One particularly forward-looking innovation is the introduction of a digital product passport. In the future, every toy will carry digital proof of compliance with all safety regulations, which can be accessed via a QR code. This not only makes it easier for consumers to access safety-related information, but also improves traceability, market surveillance, and customs control. Authorities should be able to identify more quickly whether a product is correctly labeled, where it was manufactured, and what risks it poses.
Parallels with medical devices: a possible guide for the industry
The introduction of the digital product passport in the toy sector raises the question of whether a comparable instrument would also be useful for other regulated product categories, in particular medical devices. Although these are already subject to a strict legal framework, the increasing penetration of software, the connection to digital services, the use of AI components, and globalized supply chains are increasing the need for transparency and traceability.
A digital product passport could offer several advantages for medical devices. In conjunction with existing requirements such as the UDI system and EUDAMED, traceability could be further improved and market surveillance made more efficient. Customs authorities and regulatory agencies would have immediate access to safety-related product information, changes, and recalls via a central digital portal. At the same time, patients, users, and healthcare facilities could access instructions for use, risk warnings, and current safety information directly via a QR code on the packaging or on the product itself—a clear added value, especially for frequently updated software.
A digital product passport would also offer potential from the perspective of the post-market surveillance system: Manufacturers could use it to provide software versions, safety notices, corrective measures, and status information in a structured manner. In light of new horizontal cybersecurity requirements such as the Cyber Resilience Act, which additionally addresses certain digital components, such a passport could also help to make information on software bills of materials, security updates, and conformity certificates available in a bundled form.
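To make the idea concrete, the following Python sketch shows what a minimal digital product passport payload for a software-driven medical device might look like. All field names and the URL are hypothetical illustrations; no EU data model for such a passport has been defined for medical devices:

```python
import json

# Hypothetical structure of a digital product passport payload for a
# software-driven medical device. All field names are illustrative
# assumptions, not a defined EU data model.
def build_product_passport(udi_di, software_version, sbom_url, notices):
    passport = {
        "udi_di": udi_di,                  # UDI device identifier (MDR)
        "software_version": software_version,
        "sbom": sbom_url,                  # link to the software bill of materials
        "safety_notices": notices,         # current safety notices / recalls
    }
    return json.dumps(passport, indent=2)

payload = build_product_passport(
    udi_di="01234567890123",
    software_version="2.4.1",
    sbom_url="https://example.com/sbom/device-2.4.1.json",  # placeholder URL
    notices=[],
)
print(payload)
```

The QR code on the packaging or the device would then simply resolve to a URL serving such a payload, giving authorities, operators, and patients one entry point to current safety-related information.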
Finally, global supply chains for medical devices would also be easier to map. While toy manufacturers will be required to provide digital product passports in the future, medical device manufacturers could use a similar concept to transparently document their hardware and software components, thereby better meeting regulatory requirements as well as cybersecurity and quality assurance expectations. Even though toys and medical devices are in different risk classes, the reform of toy safety rules shows that the EU considers digital product passports to be a key tool for strengthening safety, transparency, and market surveillance. This approach could set the tone for the regulation of medical devices in the medium term.
For further reading: Regulatory updates from recent months
Between clarity and complexity: What the new MDCG 2019-11 means for software manufacturers
In summer 2025, the Medical Device Coordination Group (MDCG) published the revised version of its MDCG 2019-11 guidance document. It deals with the qualification and classification of software as a medical device (MDSW) and was intended to finally bring clarity to the application of MDR Rule 11, especially for Class I software. However, anyone who had hoped that this would settle the discussions about prevention, diagnosis and therapy will quickly realise that the issue remains complex.
The new version specifies that software used for disease prevention is only considered a medical device if it actively intervenes in the disease process, for example by analysing physiological parameters, or if it makes specific statements about pathologies. This excludes many lifestyle and wellness apps, but not applications that medically evaluate blood pressure, glucose or movement data, for example.
The revision also provides greater clarity for software with diagnostic or therapeutic purposes. Systems that actively provide medical professionals with information for decision-making clearly fall under Rule 11(a) and thus into higher risk classes. However, this does not apply to pure information or communication software aimed at laypeople. The wording of the intended purpose therefore remains crucial: even small linguistic nuances can determine the risk class.
The guidance also contains new practical examples, such as an app to promote conception or an application that translates spoken language into symbols to facilitate communication. Both examples show that it is not the technology itself that is decisive, but its medical purpose.
For the first time, the MDCG is now also addressing topics such as telemedicine and AI-supported systems. While telemedicine applications generally fall under the MDR as soon as they influence medical decisions, the specific risk classification often remains unclear. The guideline calls for careful case-by-case assessment, particularly for software that uses natural language processing algorithms or large language models. As soon as such systems interpret data or derive clinically relevant values, they are considered medical devices – with all the regulatory consequences that entails.
The revision is an important step towards greater clarity, but it does not eliminate all uncertainties. Questions remain regarding the distinction between Class I and IIa, the handling of adaptive AI components, and the requirements for clinical evidence. For manufacturers, this means that precise intended use, comprehensible risk analysis, and consistent technical documentation remain the key to compliance and regulatory certainty.
At BAYOOCARE, we have extensive experience in formulating intended purposes and risk classification in accordance with MDR Rule 11. We support manufacturers and software developers in classifying their products safely and efficiently within the correct regulatory framework – from early intended purpose to complete technical documentation. Please feel free to contact us if you need assistance in interpreting or implementing the new MDCG guideline.
More freedom in video consultations – but with a safety belt
Since March 2025, new legal regulations have been in place for video consultations, bringing both opportunities and obligations. Doctors are now allowed to conduct up to 50 per cent of all treatment cases exclusively via video consultation, twice as many as before; the old 30 per cent limit no longer applies. At the same time, new requirements are being introduced: from September 2025, appointments for video consultations must be offered preferentially to patients in the immediate vicinity, and a structured initial assessment procedure is mandatory for unknown patients before an appointment is made. In addition, video consultations may only be provided domestically; conducting them from abroad is expressly prohibited.
These changes are based on the new Annex 31c BMV-Ä, which was adopted by the National Association of Statutory Health Insurance Physicians and the GKV-Spitzenverband in March 2025. It specifies detailed quality requirements for the provision of telemedicine services. The aim is to provide comprehensive, yet safe and quality-assured care. In future, insured persons must be able to arrange video consultations easily – by telephone, on site, via TI Messenger or appointment service centres. Access must not depend on irrelevant criteria, and appointments must be prioritised exclusively on the basis of medical necessity.
Video consultations outside the contracted doctor’s office are now also permitted, provided they take place at a fully equipped teleworking location that meets the technical, organisational and data protection requirements. Structured follow-up care is also mandatory if treatment via video is medically insufficient. In such cases, doctors must ensure that an on-site appointment, referral or other appropriate follow-up treatment is provided immediately.
Special restrictions apply to unknown patients. They may not be prescribed narcotics or potentially addictive drugs during a video consultation unless the medication is documented in the electronic patient file and structured follow-up care is ensured. In addition, video consultations that serve the sole purpose of a single service, such as issuing a certificate of incapacity for work, will no longer be permitted in future.
The new regulations therefore bring greater flexibility and legal certainty, but at the same time also new formal and technical requirements. Whether they will have the desired broad impact depends on how pragmatically doctors, self-governing bodies and platform providers implement the guidelines. If this is successful, video consultations could finally make the transition from a niche service to everyday care.
AI Act meets MDR: When market surveillance becomes smart
Market surveillance has long been a key instrument under the Medical Devices Regulation (MDR) for ensuring the safety and performance of medical devices even after they have been placed on the market. Manufacturers are obliged to operate a post-market surveillance (PMS) system that continuously collects and evaluates data and feeds it into risk management, clinical evaluation and, if necessary, corrective measures. This system is supplemented by vigilance obligations, i.e. the reporting of serious incidents to the competent authorities.
In recent years, these structures have created the basis for systematic surveillance of medical devices and have long since become part of everyday regulatory practice.
With the AI Act coming into force, this framework is now expanding significantly. Intelligent medical devices based on AI components are generally considered high-risk systems. In addition to the requirements of the MDR, they are subject to the specific monitoring obligations of the AI Act. While the MDR focuses primarily on clinical safety and performance data, the AI Act places greater emphasis on transparency, traceability, cybersecurity and the control of learning processes.
This means that the market surveillance authorities of the Member States will in future be responsible not only for MDR compliance, but also for enforcing AI-specific requirements.
At national level, this means that existing market surveillance structures must be further developed. In Germany, enforcement of the MDR has so far been the responsibility of the federal states, coordinated by bodies such as the German Market Surveillance Forum. The AI Act now additionally requires the designation of central contact points and specialised authorities to act as interfaces between the national and European levels. According to the German government’s plans, existing institutions such as the Federal Network Agency, which is already responsible for other internal market regulations, will be used for this purpose.
The aim is to create a lean and uniform supervisory structure that avoids duplication of work and provides companies with clear points of contact.
In practice, this means a significant consolidation of regulatory requirements. Manufacturers of intelligent medical devices must not only maintain their existing PMS systems, but also design them to fulfil AI-related monitoring obligations as well. This includes, for example, monitoring software updates, checking training data and ensuring that learning systems do not develop unpredictable risks.
At the same time, the harmonisation of the MDR and the AI Act also opens up new opportunities: a coherent European monitoring system strengthens confidence in AI-based medical devices and can prove to be a real mark of quality in international competition.
The result is clear: in future, market surveillance will form an even stronger interface between regulation and innovation than before. While the MDR focuses on clinical safety and efficacy, the AI Act expands the understanding of product safety to include digital and algorithmic dimensions. For manufacturers, this means more effort, but also the opportunity to build trust and position themselves in the market in the long term through a robust, transparent compliance system.
New substances, new rules, stable supply
(Sources: Federal Gazette, Federal Ministry of Health, Federal Institute for Drugs and Medical Devices)
1. Amendment to the Annexes to the Narcotics Act (BtMG)
On 7 August 2025, the Twenty-Fifth Ordinance Amending the Annexes to the Narcotics Act came into force.
The legal basis for this amendment can be found in Section 1 (2) sentence 1 no. 3 BtMG. This provision allows the Federal Government to supplement or amend Annexes I to III of the Act, provided that experts have been consulted and the Federal Council agrees.
The background to the current amendment is the emergence of three new psychoactive substances (NPS) on the drug market:
- Etomethazene
- Fluoro-etonitazene
- Fluoro-etonitazepine
These substances have been added to Annex II of the BtMG. This means that they are marketable but not prescribable. This step was prompted by reports of serious intoxication and deaths resulting from abuse. The inclusion in Annex II is intended to curb distribution and enable criminal prosecution of abuse.
2. Supply shortage of intravenous acetylsalicylic acid (ASA)
The Federal Ministry of Health (BMG) has identified a supply shortage of intravenously administered acetylsalicylic acid.
The legal basis for this is Section 79 (5) of the German Medicines Act (AMG). This provision allows unauthorised medicinal products to be placed on the market for a limited period in emergency situations if they are needed to treat life-threatening conditions and are legally authorised in their country of origin.
The intravenous form of ASA is used in particular for:
- acute treatment of coronary syndrome (including myocardial infarction with or without ST elevation),
- acute, moderate to severe pain,
- migraine attacks in the headache phase,
- and fever, when rapid temperature reduction is necessary and oral administration is not possible.
As there is currently no equivalent alternative therapy available, the state authorities may issue exemptions to ensure supply in accordance with Section 79 (5) and (6) of the German Medicines Act (AMG). The shortage has been published in the Federal Gazette.
3. Revocation of the finding of a supply shortage of sodium chloride solutions
Back in October 2024, the Federal Ministry of Health (BMG) had already identified a supply shortage of isotonic sodium chloride solutions. This allowed the state authorities to temporarily deviate from the strict licensing requirements of the German Medicines Act (AMG) in order to secure supply.
This determination was now lifted in an announcement dated 8 July 2025. According to the Federal Institute for Drugs and Medical Devices (BfArM), the supply situation has stabilised, meaning that special measures are no longer necessary.
Conclusion
Current developments once again demonstrate the close interconnection between drug and narcotics legislation and the protection of public health. While the BtMG has been amended to control new hazardous substances, the BMG is responding flexibly to supply bottlenecks in the pharmaceutical sector.
This creates a dynamic regulatory system that keeps safety, availability and health protection equally in mind.
Focus on update obligations: New responsibilities for manufacturers
With the increasing digitalisation of medical devices, manufacturers are also facing a growing responsibility to implement software updates in a structured and active manner. In addition to regulatory developments, such as the revised German Medical Devices Operator Ordinance (MPBetreibV) and the new EU Product Liability Directive, the topic of remote updates is also gaining in importance.
This is not only a question of the technical availability of updates, but also of their actual implementation in practice, including the question of whether and how updates may or even must be installed remotely on networked devices.
A recent article in the Medizinprodukte Journal (Medical Devices Journal) highlights how, in the authors’ view, new regulatory frameworks are leading to increasing expectations of manufacturers:
Software vulnerabilities should not only be detected, but also remedied, as far as technically and organisationally possible. There is increasing discussion as to whether and under what conditions manufacturers could be obliged to carry out remote access to systems that have already been delivered in order to ensure patient safety and data protection.
Overview of relevant developments:
- The updated MPBetreibV (Section 7 (2) sentence 4) explicitly mentions security-related software updates as part of maintenance for the first time.
- The new EU Product Liability Directive refers, among other things, to cyber security risks and requires manufacturers to respond to emerging threats – including through updates, if necessary.
Operators also have a responsibility: they must ensure that security-related updates are installed and may not allow outdated, potentially insecure software to remain in use permanently.
Manufacturers and operators should now ask themselves the following questions:
- Are there structured processes in place for vulnerability management for your products?
- Are there rules governing how updates are delivered, documented and, if necessary, enforced?
- Is communication with operators standardised in the event of critical security vulnerabilities, including integration into the technical documentation?
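A structured vulnerability management process of the kind asked about above can start with something as simple as a machine-readable record per security-relevant update. The following Python sketch is one possible shape for such a record; the fields, workflow, and the placeholder CVE identifier are assumptions for illustration, not anything prescribed by the MPBetreibV or the Product Liability Directive:

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch of how a manufacturer might track a security-relevant
# update from vulnerability report to operator notification. Fields and
# workflow states are assumptions, not regulatory requirements.

@dataclass
class SecurityUpdate:
    cve_id: str                    # placeholder identifier, e.g. from a CVE report
    affected_versions: list
    fixed_version: str
    reported: date
    operators_notified: bool = False
    remotely_deployable: bool = False

    def is_resolved_for(self, installed_version: str) -> bool:
        """A fielded device counts as resolved once it runs the fixed version."""
        return installed_version == self.fixed_version

update = SecurityUpdate(
    cve_id="CVE-2025-XXXX",        # hypothetical example
    affected_versions=["1.0", "1.1"],
    fixed_version="1.2",
    reported=date(2025, 6, 1),
    remotely_deployable=True,
)
print(update.is_resolved_for("1.1"))  # device still on an affected version
```

Records like this could feed both the technical documentation and the standardised operator communication mentioned above, since they make the status of every delivered version auditable.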
Legal guidelines for AI in healthcare
With Regulation (EU) 2024/1689 on artificial intelligence, the EU created its first comprehensive legal framework for AI in August 2024. The aim is to strengthen the European single market, promote innovation and, at the same time, protect health and fundamental rights. Its provisions apply in stages, depending on the risk potential of the respective AI application; the regulation will be binding from August 2026 at the latest, and significantly earlier in particularly high-risk areas. At the same time, a growing body of court rulings is already emerging which, although not directly related to medical issues, provides important guidance for the use of AI in healthcare.
Patent law is of particular importance in this context. The courts have clarified that only humans can be considered inventors and that AI cannot independently claim property rights. At the same time, AI-supported processes in the field of medical technology can be patentable if they solve specific technical problems, such as improving image quality with reduced X-ray doses. In contrast, automated diagnostic procedures remain excluded from patent protection, as these should not be monopolised for ethical reasons. For companies in the healthcare sector, this means that investments in AI-based developments must be specifically targeted at technical innovations in order to be protected by patents.
New questions are also arising in copyright law. Courts have confirmed that AI systems may use copyright-protected data for training purposes under strict conditions, for example for scientific purposes. At the same time, purely AI-generated works are not eligible for protection in many countries because they lack human contribution. For the healthcare sector, this means that the use of data, especially patient data, must be carefully examined from a legal perspective before it is used to train medical AI systems.
Data protection plays a key role. The European Court of Justice has emphasised that even significant reliance on AI-generated scores can constitute an ‘automated decision’ within the meaning of the GDPR, triggering substantial legal requirements for transparency and consent. Particularly in the doctor-patient relationship, treatment decisions must not be delegated entirely to AI, but must ultimately be made by the doctor. Current rulings by German courts on the use of health data by large platform providers also show that the line between legitimate research interests and impermissible data processing has not yet been conclusively drawn.
Finally, liability comes to the fore. Several courts have ruled that operators are liable for incorrect outputs from AI systems, even if they did not actively influence the output. This has particular implications for the healthcare sector: incorrect recommendations or misleading information can have a direct impact on patient safety. Core medical services such as diagnoses or therapies therefore remain highly personal duties and must not be replaced by AI.
Current developments show that case law is gradually adapting to the specific challenges of AI. It almost universally emphasises the need for human involvement and has so far denied AI independent legal personality. For stakeholders in the healthcare sector, this means that innovation must be pursued with caution and in compliance with clear regulatory guidelines. AI systems can provide valuable support. However, they can only be used in a legally compliant manner if patent law, data protection, copyright and liability issues are taken into account from the outset.
MDR sampling practice: burden instead of added value?
The regulation of medical devices in Europe is guided by the principle of guaranteeing safety and performance. There is no question that this goal is justified. However, problems arise when the means chosen to achieve it go too far and create more burden than added value. This is precisely where the MDR’s sampling practice comes under critical scrutiny, a practice that affects us directly as well.
The MDR requires manufacturers to maintain comprehensive technical documentation. In addition, selected documents must be regularly checked in depth on a random basis. What sounds like a sensible control mechanism in theory has developed in practice into a system that hardly distinguishes between risk, complexity and actual knowledge gain. Even products with a clearly defined range of functions and no risk-relevant changes are subject to a complete reassessment. This means repeated checks of identical documents without any changes to the product design, data or risk situation.
Legally, this practice is justified on the grounds that additional control at product level is necessary to ensure conformity not only abstractly via the quality management system, but also concretely via the documentation. This preventive approach is understandable in its basic idea and is intended to strengthen confidence in the system. In practical application, however, the instrument overstretches its purpose. Full reviews are carried out repeatedly, even when no new safety- or performance-related findings are to be expected. Risk-based differentiation is replaced by a formal automatism that misses the actual purpose of the regulation, namely to effectively increase patient safety, and turns into a procedure that primarily ties up resources and inhibits innovation.
Experience with this common practice clearly shows that the sampling requirement urgently needs to be revised. Controls are undoubtedly important, but they must be based on the actual risk. Repeated full inspections without gaining new insights contradict this idea and result in regulatory processes becoming more of a burden than a contribution to safety. Our aim is therefore to critically question this practice and push for a return to what regulation should actually achieve: building trust, maintaining proportionality and enabling innovation. Only in this way can regulation fulfil its task of balancing safety and progress in healthcare.
New operator responsibility for AI in medicine – human oversight as a regulatory obligation
The new EU Regulation 2024/1689 on Artificial Intelligence (AI Act) clarifies for the first time that healthcare facilities such as hospitals are considered operators when they use high-risk AI-based medical devices, regardless of whether they develop the systems themselves or only use them. This results in new independent obligations, in particular with regard to human supervision of the systems used.
These include, among other things, the appointment of a supervisor with technical, clinical, and ethical expertise.

The regulation shifts the perspective from the traditional technical “user” (as previously in the MDR) to an operator who is ethically and legally responsible. Black box models and adaptive systems are particularly targeted by the regulation, as they are difficult to understand and influence potentially irreversible decisions.
For hospitals, this means adapting quality management systems, closer integration with data protection, IT security, and ethics committees, and a new culture of critical engagement with AI systems in everyday clinical practice.
When is a product a medical device? Two court cases bring (some) clarity
Two current cases in Germany highlight key issues of interpretation of the MDR that are crucial for many manufacturers. The Bochum Regional Court has referred to the European Court of Justice the question of whether unprinted patient wristbands, which are intended to help prevent errors in everyday hospital practice, should already be classified as medical devices even though they have no direct effect on or in the body. At stake is the significance of the intended purpose and whether a purely organizational benefit is sufficient for classification. An ECJ decision is not expected until the end of 2025 at the earliest.
Even more far-reaching is the ruling of the Hanseatic Higher Regional Court on a teledermatology app. Although the app itself does not make a diagnosis, but only transmits structured health data to medical specialists, it was classified as a Class IIa medical device. The court interprets Rule 11 of the MDR broadly: it is sufficient for software to contribute to medical decisions; an active diagnostic function is not necessary. This makes it clear that software no longer automatically falls into the lowest risk class.
Both cases show that MDR-related issues such as intended use and risk classification are increasingly being clarified in court – with major implications for approval requirements, competitive conditions, and market opportunities. Manufacturers who invested in compliance early on are increasingly finding themselves at a regulatory disadvantage compared to less compliant competitors. Uniform and binding standards are therefore urgently needed.
Cannabis platforms targeted by the judiciary – between referral bans, advertising law, and criminal liability
Two recent rulings by the Higher Regional Court of Frankfurt and the Regional Court of Hamburg highlight the legal risks of digital business models for the distribution of medical cannabis via internet platforms. In both cases, it was found that key provisions of professional and advertising law had been violated, in particular the prohibition on referrals (Section 31 MBO-Ä) and the advertising restrictions under the German Medicines Advertising Act (Sections 9, 10 HWG). Among other things, criticism was levelled at the one-sided fee structure at the expense of doctors, which was considered a hidden commission, as well as misleading advertising claims about remote treatment with cannabis. General advertising formats aimed at promoting sales were also prohibited, even if no specific product names were mentioned.
Another controversial issue is possible criminal liability: such platform cooperations could constitute corruption in the healthcare sector (Sections 299a, 299b StGB) or unauthorized trading in medical cannabis (Section 25 MedCanG). The new Medical Cannabis Act sets out clear licensing requirements for any form of commercial trade in cannabis, including via digital channels.
The rulings show that what begins as a modern, digital care model can quickly become a case for the public prosecutor’s office and supervisory authorities. Providers of such platforms urgently need to have their structures reviewed from a legal perspective in order to avoid risks in professional, advertising, and criminal law. The judiciary is increasingly critical of the close link between economic interests and medical services.
Legal changes in healthcare – What’s new
In spring 2025, numerous new legal provisions came into force in the healthcare sector, affecting key areas of care. Among other things, the Healthcare Strengthening Act (GVSG) exempted GP services from budgeting, removed the age limit for emergency contraception, and extended the deadlines for consultation procedures at the G-BA. The revised Medical Device Operator Ordinance (MPBetreibV) and the amended Medical Device Dispensing Ordinance (MPAV) now regulate the CE-compliant reprocessing of single-use products and relax the dispensing requirements for in vitro diagnostics. At the same time, the 22nd amendment to the AMVV placed 29 new active ingredients under prescription requirement and adjusted the rules for e-prescriptions. Further changes concern the interchangeability of dosage forms (e.g., levodropropizine syrup), the continuation of price moratoriums and manufacturer discounts, and a targeted price increase for atropine to safeguard supply. Overall, the measures reflect stronger governmental steering toward security of supply, cost-effectiveness, and regulatory clarity.
Single Studies: Pioneers for Biomarker-Personalized Medicines and Their Diagnostic Precision
Paradigm Shift in Drug Development: Individualization of Therapy
Modern drug development is undergoing a fundamental shift toward therapy individualization – particularly in oncology, immunotherapies, and increasingly in rare diseases. In these fields, treatment effectiveness often depends on whether the patient exhibits certain genetic or molecular characteristics.
Such characteristics, called biomarkers, can only be reliably identified through specific tests.
Companion Diagnostics (CDx): Key to Personalized Medicine
This is where Companion Diagnostics (CDx) come into play: highly specialized in-vitro diagnostics developed specifically for this purpose. The joint development and testing of a drug and its associated CDx within a single clinical study – a so-called Single Study – enables targeted patient selection and increases the likelihood of therapeutic efficacy.
Single Studies in Oncology: Proven Practice for Maximum Effectiveness
In oncology, this form of study planning has long been common practice: Only patients with a specific tumor profile are included, significantly increasing the study’s validity and the drug’s benefit for defined subgroups. This approach is also suitable for rare diseases with limited patient numbers and high diagnostic uncertainty.
Regulatory Challenges in Europe
In Europe, implementing such studies is complex, as drugs and diagnostics follow two separate regulatory paths – pharmaceutical law (EU-CTR) on one hand and IVDR on the other. Regulatory interconnection is limited, while the practical need for coordination is high.
Solution Approach for Early Development Phases
Especially in early phases of clinical development, collaboration with academic laboratories using in-house CDx can be a viable way to implement biomarker-based selection strategies in compliance with regulations even before CE marking of the CDx. In parallel, the Companion Diagnostic can be brought to market maturity through a conformity assessment procedure.
BAYOOCARE: Your Experienced Partner for CDx Projects
BAYOOCARE has extensive experience in handling IVDR and planning combined development projects. In collaboration with research-intensive partners like Heidelberg University Hospital, Mainz University Medicine, and Marburg University Hospital, we support studies where diagnostics and therapy are considered together from the beginning.
BAYOOCARE acts as Legal Manufacturer in regulatorily complex CDx projects – experienced, well-connected, and IVDR-compliant.
Health Apps Between Therapeutic Claims and Legal Boundaries: What the Medical Practice Reservation Can Achieve – and What It Cannot
Digitalization of Traditional Medicine: The New Role of Health Apps
Digital health applications are increasingly taking over functions traditionally located in medicine – they analyze symptoms, structure treatment courses, remind about medications, or provide behavioral recommendations for chronic diseases.
Public discussion repeatedly raises the question of whether such applications should already be considered “practice of medicine” – and whether they consequently fall under the classic medical practice reservation according to the Alternative Practitioner Act (HeilprG).
Legal Classification: Why the HeilprG Doesn’t Apply
Legally, however, there are strong arguments against this. The HeilprG explicitly addresses natural persons – whoever "practices medicine" – not technical systems or software. Health apps do not make independent medical decisions but operate within predetermined algorithmic processes.
The developers or distributors of the software also cannot be meaningfully integrated into the system of medical licensing requirements. The regulatory logic of the HeilprG doesn’t fit digital applications – neither systematically nor historically.
Modern Regulatory Approaches Instead of Regulatory Gap
Moreover, there’s no reason to close a “regulatory gap” through analogous application of the HeilprG. Digital medical devices today – at least since the introduction of the MDR – are subject to a differentiated safety and quality assurance regime: risk classification, CE marking, conformity assessment procedures, and clinical performance evaluation ensure that health apps with therapeutic claims are reviewed before market entry.
DiGA Directory: Additional Quality Assurance
For digital health applications (DiGAs) seeking inclusion in the BfArM directory, additional requirements apply regarding data protection, interoperability, and proven care effects.
Comparison of Different Regulatory Approaches
Compared to traditional alternative practitioners, whose professional practice largely takes place outside state-standardized quality standards, modern health apps are thus by no means uncontrolled – also in regulatory terms. A difference remains, however, in the form health protection takes: whereas for alternative practitioners personal licensing and civil liability are in the foreground, medical device law and product liability safeguard app use technically and systematically.
This different design appears logical, as digital applications are not medical practitioners in the classical sense.
Independent Regulatory Approach for Digital Health Products
Health apps with therapeutic benefits thus do not operate in a legal vacuum – but neither do they fall under classic medical professional law. With the MDR and the DiGA framework, the legislature has deliberately chosen an independent regulatory approach that evaluates software and AI-based medical devices according to technical, functional, and care-related criteria – not according to person-related access barriers.
This model is currently constitutionally viable and simultaneously open to future adjustments if new risks or deficits emerge.
BAYOOCARE: Your Partner for Compliance and DiGA Approval
BAYOOCARE supports manufacturers of digital health applications in implementing the MDR and on the path to DiGA listing – competent, regulatorily experienced, and at the interface of technology and law.
No Admission of Guilt Through Regret – Veterinary Voice Message Insufficient for Liability
Current Court Decision Creates Clarity
In a current decision, the Dresden Higher Regional Court clarified: A veterinarian’s regret about a dog’s death, expressed in a voice message, does not constitute an admission of guilt in civil law terms (OLG Dresden, Decision of 6.1.2025 – 4 U 1192/24).
The Underlying Legal Dispute
The owner of the deceased dog had claimed that the veterinary operations to treat and remove an implant plate were faulty and causally responsible for her animal's death. She relied, among other things, on a voice message from the treating veterinarian in which the veterinarian said she was "terribly sorry."
Court’s Legal Assessment
However, the court clarified that such a statement represents neither a constitutive nor a declaratory acknowledgment within the meaning of § 781 BGB. An express intention to be legally bound is not recognizable, especially since the medical relationship ended with the animal’s death anyway. The regret should be viewed as an expression of human compassion – not as assumption of liability.
Significance for Medical Communication
The decision shows that not every compassionate statement after an unfortunate treatment outcome can or should be legally interpreted as an admission. Anyone who understands every compassionate word from doctors or veterinarians as a legal admission of guilt not only misunderstands the emotional complexity of such situations but also demands an untenable legalization of human communication.
The attempt to derive liability-relevant consequences from empathy simply goes too far.
Digital psychological applications (DPA) – opportunities and regulatory challenges
Digitalization has brought about significant innovations in medical care, particularly in the area of psychological health applications. Digital psychological applications (DPA) are becoming increasingly relevant as they complement traditional therapeutic approaches and enable low-threshold access to psychological support.
The integration of artificial intelligence significantly expands the potential of these applications, but at the same time entails complex regulatory challenges.
DPA as a driver of innovation in the healthcare sector
The spectrum of DPA covers a wide range of applications, including therapeutic apps, digital coaching programs and AI-supported chatbots that can respond to users’ behaviour and mood in real time. These systems offer a flexible and cost-effective supplement to conventional treatments and help to bridge the often criticized “therapy gap”.
During the COVID-19 pandemic, the increasing demand for psychological support services became particularly evident as the need for alternative forms of care increased dramatically. At the same time, the DPA sector is developing into an economic growth industry whose development potential is significantly influenced by the regulatory framework.
Regulatory challenges and the new AI regulation
The new EU Regulation on Artificial Intelligence (AI Regulation) places digital psychological applications in a new regulatory framework. The classification of AI-supported applications with health relevance as high-risk AI has far-reaching consequences for developers and providers.
Possible impact on the market
Although some of the new requirements serve to protect patients, there is a risk that innovation will be hampered. Small and medium-sized companies in particular could experience competitive disadvantages due to increased compliance costs and regulatory hurdles. Providers could be forced to forego innovative functionalities in order to avoid a high-risk classification and the associated requirements.
It seems particularly problematic that the regulation does not provide for any specific rules for digital psychological applications, but rather classifies them as a blanket part of general AI regulation. This could impair the availability of effective, evidence-based DPA in Europe, while less regulated international markets benefit from technological advances.
Recommended actions for companies
Given the changing regulatory landscape, companies operating in the DPA sector should be proactive.
Conclusion
Digital psychological applications have the potential to sustainably improve mental healthcare. However, the new AI regulation poses significant challenges for the industry. While increased safety requirements can strengthen the protection of patients, there is a risk that excessive regulatory hurdles will inhibit innovation and make access to modern psychological support services more difficult.
It is therefore advisable for companies to address the new requirements at an early stage and adapt their strategies accordingly.
Desire to have children after death: Frankfurt Regional Court rules in case of post-mortem family planning
The recent decision of the Frankfurt Regional Court raises significant ethical and legal questions in the field of reproductive medicine. In a remarkable case, a woman was granted the right to receive the cryopreserved sperm of her deceased husband in order to undergo artificial insemination in Spain.
Background to the decision
The clinic in question had initially refused to hand over the preserved genetic material and invoked a contractually agreed “destruction clause”. However, the Frankfurt Regional Court came to the conclusion that the contract did not necessarily provide for the destruction of the sperm and that the protective purpose of Section 4 of the Embryo Protection Act was not violated.
Consideration of the patient’s will
The verifiably documented will of the deceased husband was of central importance for the ruling. The wife was able to credibly demonstrate to the court that her deceased partner had expressly spoken out in favor of the post-mortem use of his genetic material before his death. This expression of will was recognized by the court as a decisive factor in the decision-making process.
Conclusion
This legal decision could set an important precedent that raises important questions about self-determination in family planning, the handling of genetic material after death and cross-border medical care. The ruling makes it clear that case law in this sensitive area must carefully weigh up the individual circumstances, ethical dimensions and the express will of all those involved.
The decision opens up an important social discourse on the limits and possibilities of modern reproductive medicine and the associated ethical and legal challenges.
Digital health applications (DiGA): Data protection as the foundation for trust and progress in healthcare
The introduction of DiGA has the potential to fundamentally change medical care. These applications, which are prescribed by doctors or psychotherapists, facilitate the treatment, monitoring and alleviation of illnesses. Since the introduction of the Digital Healthcare Act in 2019, such applications can be reimbursed by statutory health insurance under certain conditions. At the heart of this development is data protection – an issue that is crucial to the success of digital solutions in the healthcare sector.
The fast-track procedure of the Federal Institute for Drugs and Medical Devices checks whether a digital health application meets the necessary requirements. In addition to technical and functional aspects, the focus here is on compliance with data protection regulations. Health data is among the most sensitive personal data of all.
Providers must therefore prove that their applications are not only secure, but also comply with the legal regulations defined by the General Data Protection Regulation and the Digital Health Applications Regulation.
Key data protection requirements
Processing may only be carried out for medically necessary purposes. Advertising and detailed tracking of user activities are strictly prohibited. Such requirements not only protect the rights of patients but also strengthen trust in digital solutions.
The challenging road to certification
Despite these positive aspects, the path to certification remains a challenge. Smaller companies in particular often struggle with the high demands that require resources and expertise. Nevertheless, it is essential to maintain a balance between promoting innovation and protecting healthcare data. After all, only a system that creates trust can establish itself in the healthcare sector in the long term.
The responsibility for the success of digital health applications does not lie solely with the manufacturers. Interaction with legislators, health insurance companies and medical professionals is also crucial.
Transparency in the processes and clear communication of requirements promote understanding and facilitate cooperation between all parties involved.
Conclusion
DiGA offer the opportunity to close gaps in care and actively involve patients in their treatment. They open up new avenues in diagnostics and therapy and help to increase the efficiency and precision of medical care.
At the same time, they place high demands on providers, but meeting these demands can make all the difference.
In a world where the value of data is constantly growing, the protection of this information remains the foundation of any innovation. Digital health applications show that progress and data protection can go hand in hand – provided they are implemented with the necessary care and foresight.
Finding this balance and constantly developing it further is not only a challenge, but also an opportunity for everyone involved in shaping the future of healthcare.
Why the Frankfurt Higher Regional Court banned a manufacturer of dextrose products from combating hangovers
The party was legendary, the morning after less so – and this is exactly where a manufacturer of dextrose products wanted to score points with its mineral supplements as an “anti-hangover” miracle worker. However, the Frankfurt Higher Regional Court has drawn a clear line through this advertising.
What had happened?
The manufacturer advertised its mineral tablets on Amazon as an "anti-hangover" aid. The idea: a quick dose after partying, and the typical hangover symptoms are supposed to be a thing of the past. Sounds practical, doesn't it? Unfortunately, too practical – and legally untenable.
The judgment in detail
The Higher Regional Court of Frankfurt ruled that, in accordance with the European Food Information Regulation, foods may not be attributed any healing or disease-preventing properties. And yes, a hangover officially counts as an illness here.
What does this mean for the manufacturer?
The advertising on Amazon must be revised: marketing the products as "anti-hangover" supplements is no longer permitted. However, the judgment was issued as a default judgment and is not yet final. The provider can lodge an objection and continue the proceedings.
Cyber Resilience Act, medical devices and the AI Act – An integrated perspective on cybersecurity and AI in the healthcare industry
With the Cyber Resilience Act (CRA) and the upcoming AI Act, the EU is setting new requirements for IT security and the use of artificial intelligence (AI).
These regulations play a central role, particularly in the area of digital healthcare products, which include connected medical devices. The new regulations set standards that require an integrated safety strategy in order to make innovations such as AI in medicine safe and compliant.
CRA and AI Act – safety standards for digital health products
The CRA aims to harmonize IT security standards for products with digital elements and sets strict requirements for networked devices and software. Among other things, it requires security updates, delivery in a secure default configuration, and detailed documentation such as a software bill of materials (SBOM).
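The SBOM requirement is essentially a machine-readable inventory of the software components a product ships with. As an illustration only – the CRA does not mandate a specific format, and CycloneDX is just one widely used option – a minimal SBOM might look like the following sketch, in which all component names and versions are hypothetical:

```python
import json

# Illustrative sketch of a minimal SBOM in the spirit of the CycloneDX
# format. The listed components are hypothetical examples, not a real
# device's dependency inventory.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        # Each third-party library shipped with the device software is
        # recorded with its name, exact version, and a package URL (purl)
        # so that known vulnerabilities can be traced to affected releases.
        {"type": "library", "name": "openssl", "version": "3.0.13",
         "purl": "pkg:generic/openssl@3.0.13"},
        {"type": "library", "name": "sqlite", "version": "3.45.1",
         "purl": "pkg:generic/sqlite@3.45.1"},
    ],
}

print(json.dumps(sbom, indent=2))
```

In practice, such files are generated automatically by build tooling rather than written by hand, and are kept in sync with every released software version so that vulnerability reports can be matched against the components actually in the field.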
The AI Act supplements these safety requirements with specific specifications for AI-supported systems and products that are classified as “high-risk”, which applies to many applications in medicine.
The combination of these regulations is intended to ensure comprehensive risk management for digital health products, especially when AI is used for diagnoses or patient-specific therapies.
Integration into the requirements of the MDR and IVDR
Medical devices, especially those that are based on AI or are networked, are already subject to the MDR and IVDR. However, the CRA and the AI Act create additional standards and requirements that are particularly important for manufacturers of digital health products and must be integrated into their existing compliance processes.
Overlaps and synergies – an integrated approach to security
The combination of CRA, AI Act, MDR and IVDR creates a comprehensive framework for the safety of digital health products. Manufacturers must prepare for new compliance challenges, especially if their products are both connected and AI-powered.
For example, the CRA introduces basic security requirements that should be combined with the specific AI risk requirements of the AI Act.
Conclusion
With the CRA and the AI Act, the EU is creating a vision for the future of digital health that combines innovation and safety. Manufacturers of connected and AI-supported medical devices are required to adapt their development processes and implement new standards for cybersecurity and AI.
In this way, products can be created that meet the highest safety requirements and at the same time fulfill the demand for innovation in the healthcare sector.
Pharmaceutical order data and data protection – current issues before the ECJ
The German Federal Court of Justice (BGH) recently referred a question to the European Court of Justice (ECJ) regarding pharmaceutical order data, which could have a significant impact on data protection in the online retail sector.
Are pharmaceutical order data health data?
The focus is on the question of whether order data for pharmacy-only but non-prescription medicines is already considered “health data” in accordance with Art. 9 para. 1 of the General Data Protection Regulation (GDPR). Customers who order medicines via platforms such as Amazon provide information such as their name, delivery address and details that are necessary for the individual composition of the medicine.
The BGH asks whether this information – even if it does not provide any direct reference to the use or medical prescription of the medicine by the purchaser – nevertheless allows conclusions to be drawn about the state of health and should therefore fall under the special protection of health data.
If the ECJ confirms that the purchase of pharmacy-only medicines already discloses health data, pharmacies selling via online marketplaces could be subject to stricter data protection requirements and explicit consent in the ordering process. This would strengthen data protection for customers, but would also have far-reaching consequences for the online pharmaceutical trade.
Other important data protection issues
At the same time, two further issues are to be decided that could have a significant impact on the topic of data protection in online sales:
Competition law claims for data protection violations: The ECJ is to clarify whether competitors can assert data protection violations under competition law. This means that pharmacies could also be sued for injunctive relief by competitors if they do not comply with data protection regulations. This would be a significant step towards strengthening data protection standards, but could also increase the legal burden on companies.
Consent and processing of special categories of personal data: If pharmaceutical order data is considered health data, pharmacy customers will have to give explicit consent to the processing of their data. This could achieve a higher level of protection, especially when the data is processed by third parties such as large online platforms.
These issues underline the growing importance of data protection in e-commerce and the wide-ranging impact that an ECJ ruling could have on the online sale of sensitive products such as medicines.
Current updates in patent law: implications for medical device manufacturers
In recent months, two important decisions by the Federal Court of Justice (BGH) and the Karlsruhe Higher Regional Court (OLG) have had a significant impact on patent law in Germany.
This poses new challenges for manufacturers of medical devices in particular, as the requirements for claims for damages and destruction have been tightened.
Compensation for follow-up business after patent infringement: What does the BGH ruling on the “upholstery machine” mean?
One example of this is the BGH ruling on the “upholstery machine” from November 2023.
The ruling made clear that damages for patent infringement cover not only the sale of the product itself but also revenue from downstream business such as maintenance and consumables: the manufacturer of the infringing machine had to surrender its profits from both.
Particularly striking: the patent holder's claim for damages covered not only the term of the patent but also transactions concluded after the patent had expired. For affected manufacturers, this means a considerable and long-lasting financial burden.
This ruling has far-reaching consequences for the medical device industry. Companies that manufacture durable products must ensure that they are not only protected during the patent term, but must also be prepared to be confronted with claims for damages years after the patent has expired.
Early workarounds crucial: court demands modification in case of patent infringement
The Karlsruhe Higher Regional Court also dealt with a case in which a medical device manufacturer had used a patent-infringing control unit in its pressure wave treatment devices. Although the manufacturer carried out a software update in the course of the proceedings that eliminated the infringement, the court confirmed the patent holder’s claim for destruction.
However, the court ruled that the devices in question did not have to be completely destroyed, but had to be modified under the supervision of a bailiff. This shows how important it is to develop technical alternatives (so-called workarounds) at an early stage.
Conclusion
For medical device manufacturers, this means that it is often not enough to stop the production of patent-infringing products. Devices already on the market must also be adapted in order to avoid future patent infringements. This can not only be expensive, but also requires extensive logistical measures.
There are some important lessons to be learned from these judgments: Careful attention should be paid to potential patent conflicts as early as the product development stage. In addition, technical alternatives should be available in order to be able to react quickly in the event of a legal dispute.
Close cooperation with patent and legal departments is essential in order to find solutions and prevent major economic damage. It is particularly important to review and adapt existing products on the market in good time before expensive patent disputes arise.
Overall, these judgments show how complex patent law has become. It is becoming increasingly important for manufacturers of medical devices to act strategically and with foresight in order to protect themselves from the growing legal risks.
Further news
NIS-2: New cybersecurity obligations for medical device manufacturers
With the entry into force of the NIS-2 Directive on January 16, 2023, manufacturers of medical devices in the European Union will have to meet stricter cybersecurity requirements in the future. This directive is an extension of the previous regulations and affects not only healthcare providers, but now also players such as medical device and in-vitro diagnostics manufacturers.
The focus is on a minimum catalog of obligations that all affected institutions must implement.
National implementation of the NIS-2 Directive
The EU member states must transpose the directive into national law by October 17, 2024. In Germany, the draft “NIS2UmsuCG-E” (NIS-2-Umsetzungs- und Cybersicherheitsstärkungsgesetz) provides for a specific implementation of the obligations, which primarily include comprehensive cybersecurity risk management and reporting obligations in the event of security incidents. Similar regulations can be found in the Austrian draft of the NISG 2024 and in the Hungarian implementation law, which has been in force since January 2024.
Essential requirements
The NIS-2 Directive requires manufacturers to implement a comprehensive information security management system (ISMS). This system must reflect the current state of the art and help to minimize security risks for IT and network systems. It is essential for affected companies to prepare for these new requirements at an early stage, as non-compliance can result in severe fines.
Conclusion
For manufacturers of medical devices, the NIS-2 Directive means a significant expansion of cybersecurity obligations. It is advisable to closely monitor national developments and start implementing the minimum requirements now in order to avoid future risks.
Implementation of video consultations outside the contract doctor's practice: New flexibility through Section 24 (8) of the German Authorization Regulation for Statutory Health Insurance Physicians (Ärzte-ZV)
The introduction of Section 24 (8) Ärzte-ZV represents an important step towards making telemedicine more flexible. SHI-accredited physicians may now also provide video consultations outside their contract doctor's practice – for example in branch practices, outsourced practice rooms, or from a home office.
This change, which was introduced by the Act to Accelerate the Digitization of the Healthcare System (DigiG), makes it easier to conduct video consultations and improves the compatibility of professional and private life.
Core elements of the new regulation
The new regulation allows doctors to hold video consultations at alternative locations as long as the obligations regarding minimum consultation hours and open consultation hours at the contract doctor’s practice are complied with. Video consultations outside the contract doctor’s practice are only permitted in addition to these face-to-face consultations and are still subject to the general regulations on medical care and data protection.
Conclusion
With this innovation, the legislator has responded to the demand for more flexibility in telemedicine. Panel doctors benefit from the new freedom to hold consultations outside the practice premises, which can make practicing medicine more attractive.
Remuneration negotiations for digital health applications
Since 2020, digital health applications (DiGA) have been reimbursed by the statutory health insurance funds. This process, regulated by Section 33a SGB V, enables DiGA to be included in the statutory health insurance (SHI) benefits catalog.
As a result, they are potentially available to all insured persons. However, this process poses a considerable challenge for manufacturers of digital health solutions.
The development and market launch of a DiGA are associated with numerous hurdles. Following the complex development process, the DiGA must first be included in the DiGA directory of the Federal Institute for Drugs and Medical Devices (BfArM) in accordance with Section 139e SGB V.
This is followed by the reimbursement negotiations between the National Association of Statutory Health Insurance Funds and the manufacturer, as stipulated in Section 134 SGB V. Finally, the DiGA must be prescribed by a treating physician and reimbursed by the health insurance company.
Remuneration agreement: How is the reimbursement price determined?
A key aspect of the remuneration agreement is the proof of positive care effects in accordance with Section 134 (4) SGB V. From 2026, a performance-based component will be included in the reimbursement agreement, which will fundamentally change the reimbursement model. The manufacturer sets the reimbursement price itself in the first year – on average this is around 510 euros. After that, a price negotiated with the National Association of Statutory Health Insurance Funds (GKV-Spitzenverband) comes into effect, which averages around 221 euros. However, this price can also be significantly lower, with discounts of up to 60 percent.
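For illustration only (using the average figures cited above; actual prices vary considerably from product to product), the drop from the self-set first-year price to the average negotiated price corresponds to a discount of roughly 57 percent, consistent with the discounts of up to 60 percent mentioned:

```latex
\frac{510 - 221}{510} \approx 0.57
\quad\Rightarrow\quad
\text{an average price reduction of about } 57\,\%
```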
Negotiating the amount of remuneration is often a challenge, especially for the start-ups that make up the majority of the market but have limited financial and human resources. Rapid monetization of the product idea is therefore essential for them. Large pharmaceutical companies often cooperate with start-ups instead of entering the market themselves.
Decisive factor: the negotiation strategy
Nevertheless, the negotiated remuneration amount is not always sufficient to secure the company’s economic existence. Some DiGA manufacturers have already had to file for insolvency even though their products were successfully prescribed. This shows how important a careful negotiation strategy is.
Manufacturers should carefully analyze comparable SHI benefits, the evidence base, product characteristics and economic considerations. A tailored argumentation that meets the requirements of the arbitration board is essential. Particular attention needs to be paid to identifying weaknesses in the evidence assessment – this requires the most preparation time.
DiGA as a statutory benefit
In addition, manufacturers should consider the possibility of offering their DiGA as a statutory benefit in accordance with Section 11(6) SGB V. This can represent a competitive advantage: such benefits are not published centrally, and higher prices can be achieved with individual health insurance funds. The disadvantage, however, is that the products are then only available to the members of the respective health insurance fund.
The negotiation procedure
Negotiations with the National Association of Statutory Health Insurance Funds usually take place in Berlin or online in three meetings of three hours each. Following a successful agreement, a contract under public law is concluded between the manufacturer and the GKV-Spitzenverband, which can be terminated with a notice period of three months to the end of the quarter. If no agreement is reached, the arbitration board will decide on the remuneration amount within three months. The adjusted supply costs as well as self-payer prices and prices in other EU countries are taken into account.
Conclusion
The remuneration process for DiGA is complex and places high demands on manufacturers. Careful preparation and a well thought-out negotiation strategy are essential in order to secure an appropriate remuneration amount and ensure the economic survival of the product. Manufacturers should also consider alternative marketing strategies in order to strengthen their competitive position and minimize potential risks. However, there is no guarantee that these strategies will lead to effective results with the health insurance funds.
Further news
From theory to practice: effectively implementing the EU AI Regulation
The EU Regulation laying down harmonized rules on artificial intelligence (AI Regulation) marks a decisive step towards the regulation of AI systems in the EU. It entered into force on August 1, 2024, its provisions apply in stages, and it sets clear rules for the use of AI systems that require companies to make extensive adjustments and take compliance measures.
The increasing spread of AI creates both opportunities and risks. In order to protect privacy and fundamental rights, the EU AI Regulation created a legal framework to ensure the safe and responsible use of AI systems.
The final steps of the legislative process for the AI Regulation took place between March and July 2024. The Regulation entered into force in August 2024, and its provisions become applicable in stages until 2027. This gives companies time to prepare for the new requirements.
This is what the AI Regulation contains
The AI Regulation is a specialized product safety law that focuses on AI products themselves. It applies to providers and operators of AI systems in the EU, regardless of where they are based, with a few exceptions, for example for national security purposes and purely private use.
A central element of the AI Regulation is the categorization of AI systems. These are machine-based systems that can autonomously derive predictions, recommendations or decisions from inputs. The regulation covers systems that use machine learning and knowledge-based approaches and distinguishes between four main risk categories: prohibited AI practices (unacceptable risk), high-risk systems, systems subject to specific transparency obligations (limited risk), and minimal-risk systems.
This applies to providers and operators in accordance with the AI Regulation
The AI Regulation sets out extensive obligations for providers and operators. Companies must identify their AI systems and classify them into risk categories. Strict requirements apply to high-risk systems in particular. Providers must create detailed documentation covering the entire life cycle of the system, and operators must ensure that human supervision is possible.
Violations of the provisions of the AI Regulation can result in high financial penalties, with fines of up to 35 million euros or 7 percent of annual global turnover, whichever is higher. In addition, civil law claims can be enforced more easily in the event of violations.
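As an illustrative calculation (the turnover figure is hypothetical; under the AI Regulation’s fine regime the higher of the two amounts applies): for a company with an annual global turnover of 1 billion euros, the upper fine limit would be

```latex
\max\bigl(35\ \text{million euros},\; 0.07 \times 1\,000\ \text{million euros}\bigr)
= 70\ \text{million euros}
```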
Recommendations for companies
Companies should start implementing an AI compliance framework immediately. This includes mapping all AI systems and classifying them into risk categories. Measures to comply with the regulation must be implemented promptly. Employee training is essential to raise awareness of the opportunities and risks of AI systems and avoid compliance violations.
It is advisable to create a central office or committee for AI matters. This should ensure that legal requirements are complied with and risks are identified at an early stage. The involvement of experts from compliance, IT, data protection and risk management is crucial.
Judgment of the Higher Regional Court of Hamburg on the risk classification of medical device software
After we had already discussed the earlier interim proceedings before the Higher Regional Court of Hamburg in an editorial on our website, the court has now issued a judgment, dated June 20, 2024, under file number 3 U 3/24. This ruling could represent a significant decision on the risk classification of software as a medical device.
What was it about?
The focus was on an app that enables patients to send images of skin conditions to selected dermatologists, fill out a medical history questionnaire and provide personal data. The app was declared as a CE-marked medical device in risk class I in accordance with Rule 11 of the Medical Devices Regulation (EU) 2017/745. A competitor took legal action against this risk classification, initially seeking interim legal protection.
At the heart of the dispute was the question of whether the software was correctly classified and could therefore be placed on the market. The opposing party argued that the app should not be classified as a medical device in risk class I, but as risk class IIa or higher. According to Rule 11 of the Medical Devices Regulation (EU) 2017/745, software used to support diagnostic or therapeutic decisions is classified as Class IIa. The defendant, on the other hand, took the view that its app was merely a data transmission and communication solution and therefore belonged in the lower risk class I.
The Higher Regional Court Hamburg ruled as follows
The court followed the opposing party’s argument and found that the app not only transmits data, but also intervenes in the diagnostic process by automatically adapting the medical history questionnaire. This happens independently of individual medical decisions and thus influences the basis for the medical diagnosis and therapy decision.
Therefore, the app is to be classified as a Class IIa or higher medical device, which requires more extensive certification, the involvement of the Notified Body and the affixing of the CE mark by the latter. Consequently, the defendant was ordered not to place the corresponding medical device on the market or make it available on the market as long as it is not certified as a medical device in risk class IIa or higher.
Criticism from legal experts and expert associations
As with the previous proceedings, the court’s decision is likely to attract considerable criticism from legal experts and expert associations. Based on the court’s reasoning, there is a risk that virtually all medical software could be classified as risk class IIa or higher.
What now?
This ruling highlights the complex legal situation regarding the classification of medical software applications and could also have an impact on already approved digital health applications (DiGA) that are financed by statutory health insurance (SHI). Companies that develop or distribute such products should therefore critically review the risk classification of their software in light of the court’s reasoning.
Further news
Stigmatized diseases are uploaded less frequently to the electronic patient record (EPR)
A new study shows that patients are less likely to upload stigmatized diseases, such as depression or sexually transmitted diseases, to their electronic patient record (EPR), depriving treating physicians of important information. In contrast, non-stigmatized diseases such as a broken wrist or type 1 diabetes mellitus do not influence upload behavior.
The research by Niklas von Kalckreuth and Prof. Dr. Markus Feufel from the TU Berlin, published in the journal JMIR Human Factors, investigates the influence of disease characteristics on upload behavior to the EPR. The study shows that the social stigmatization of certain diseases leads to a reluctance of patients to share these diagnoses digitally – regardless of whether the disease is acute or chronic.
Niklas von Kalckreuth explains that previous surveys indicated a high willingness to use the EPR, but that other studies have shown that actual use often falls short of the stated intentions. This indicates a gap between stated intention and actual behavior, which has important implications for the implementation and use of the EPR.
GKV on digital health applications (DiGA): Too few, too expensive and without significant effects?
The umbrella organization of the German statutory health insurance funds (GKV-Spitzenverband) has published a report on the current provision of digital health applications (DiGA). The results shed a critical light on the successes and challenges of this innovative area of healthcare to date.
By October 2022, only 33 applications had found their way into the DiGA directory. In total, DiGA had been used around 164,000 times by September 30, 2022, resulting in service expenditure of 55.5 million euros. According to the umbrella organization, however, it is questionable whether this can be considered a success.
The report shows that the majority of applications were unable to demonstrate any positive effects on care when they were included in the DiGA directory. Two thirds of DiGAs were only provisionally included in the directory. Of the permanently approved applications, three were not included in the full scope of indications, and three trial DiGAs were even completely removed from the directory.
Another major problem is the high prices charged by manufacturers for their DiGAs. On average, these are around 500 euros per quarter. Trial DiGAs, which have no proven positive supply effect, are particularly expensive, with prices ranging from 600 to 952 euros. Even the maximum amounts introduced on October 1, 2022 could not significantly reduce the high price level.
The umbrella organization’s report clearly shows that the introduction and use of DiGA has so far been associated with considerable challenges. The high costs and the lack of evidence of positive healthcare effects in many applications call into question the sustainable integration of these digital healthcare solutions. Measures to improve the cost efficiency and effectiveness of DiGA are urgently needed in order to fully exploit the potential of this technology.
The umbrella organization’s assessment is, in turn, viewed very critically by service providers.
ECJ: TC string is personal data
In a recent ruling, the European Court of Justice (ECJ) classified the TC string, an alphanumeric character string that encodes the user’s consent preferences for online advertising, as personal data. This decision is based on Art. 4 No. 1 GDPR, according to which personal data means any information relating to an identified or identifiable natural person. Identification can also take place indirectly through assignment to an online identifier.
The ECJ has confirmed that it is sufficient for identifiability if a person can be indirectly identified by additional information. It is not necessary for all information to be available for a single person. Personal data also includes all information resulting from the processing of such data.
The TC string contains the user’s preferences, which, according to the ECJ, can contribute to direct identification. In addition, the information in the TC string can be used to create a user profile and to identify the person concerned via an identifier such as the IP address. The fact that the defendant itself cannot link the TC string to the IP address does not change this, as the relevant information does not have to be in the hands of a single person.
The defendant can use legal means to obtain information from its members that enables user identification. The TC string is therefore to be considered personal data.
The ECJ confirms its previous case law that the concept of personal data must be interpreted broadly. The decision makes it clear that information such as the TC string, which can indirectly contribute to the identification of a person, must also be classified as personal data. This may have far-reaching consequences for the practice of online advertising and data protection.
The right to treatment with artificial intelligence and access to smart medical devices – potential challenges
In recent years, the development of artificial intelligence (AI) has made considerable progress, raising hopes of a new age of automation and optimization in medicine.
With the Digital Act (DigiG) coming into force on March 26, 2024, the German healthcare system is facing a significant change. This amendment to the law aims to improve the integration and use of digital healthcare applications and optimize their reimbursement mechanisms.
The rapid advances in the field of artificial intelligence (AI) promise a new era of automation and optimization in medicine. AI systems have the potential to analyze large and complex amounts of data and identify new patterns and correlations. This makes them particularly valuable for healthcare, from anamnesis and diagnosis to therapy.
The introduction of intelligent chatbots and apps offers patients constantly available access to medical help, for example to support psychological treatments.
The potential of AI is also evident in the field of medical aids: AI-supported exoskeletons can restore mobility to people with walking disabilities, while AI-based assistance systems can predict possible complications during surgical procedures, thereby improving surgical management and saving lives.
The analysis of medical images is another area of application in which AI systems are already outperforming humans. Doctors no longer have to spend hours examining MRI and CT images for cancerous tissue, as algorithms can now do this more efficiently and precisely. Given this potential, it is not surprising that patients are increasingly demanding treatment with AI. The legal provisions of the DigiG take this development into account by facilitating access to intelligent medical devices and expanding their eligibility for reimbursement.
Is there a right to treatment with AI?
Despite the advantages of AI in medicine, the question arises as to whether patients have a right to treatment with AI. In Germany, access to healthcare is usually via GPs or specialists, the first point of contact for potential treatment with intelligent medical devices. A key issue here is the tension between the patient’s right to self-determination and the doctor’s freedom to provide treatment.
A treatment contract between doctor and patient forms the basis for every medical intervention. While patients have the right to decide on their treatment, they cannot demand a specific treatment method. Medical freedom of therapy protects doctors from having to use methods that they consider unsuitable. There is only an obligation to use new treatment methods if they are recognized as the medical standard.
The integration of intelligent medical devices into statutory health insurance (SHI) depends heavily on their costs and medical benefits. The Federal Joint Committee (G-BA) plays a decisive role in the evaluation and approval of new examination and treatment methods. So far, however, there are only a few AI-supported applications that have been officially recognized.
A special legal framework exists for digital health applications (DiGA), which allows their use under certain conditions. The question of whether limited access to AI-supported treatments is unconstitutional remains open. The Federal Constitutional Court has derived fundamental rights that could oblige the state to guarantee access to necessary resources.
However, innovative technologies are not yet recognized as absolutely necessary for basic care. An exceptional case could exist if established treatment methods are lacking and an AI treatment offers a realistic chance of recovery.
Conclusion
Access to smart medical devices and the right to AI treatment are complex issues that pose technical, medical and legal challenges. While AI offers considerable potential in medicine, financial, regulatory and ethical hurdles still stand in the way of its widespread application. However, technological progress and social developments will play a decisive role in how these new technologies are integrated into healthcare in the future.
New Developments in Digital Health Applications Through the Digital Health Act (DigiG)
With the enactment of the Digital Health Act (DigiG) on March 26, 2024, the German healthcare system is facing significant changes. These new regulations aim to improve the integration and utilization of digital health applications (DiGAs) and optimize their reimbursement mechanisms. This article summarizes the key changes and explores the implications for medical device manufacturers.
One of the central innovations of the DigiG is the expansion of reimbursement eligibility for DiGAs to include Class IIb medical devices. Previously, only Class I and IIa products were eligible for reimbursement. This expansion now allows more complex and potentially riskier applications to be integrated into healthcare, which is justified by the expertise the Federal Institute for Drugs and Medical Devices (BfArM) has gained in the meantime.
Another important point concerns pregnant women, who are now also entitled to DiGAs, provided they meet the requirements of § 33a SGB V. Additionally, specific regulations have been introduced for the treatment of diabetes in structured treatment programs. This has led to the creation of the category of digital medical applications (DimAs), which allows for greater integration of digital processes into therapy.
This applies to risk class IIb, agreements and performance measurements
For DiGAs of Class IIb, stricter proof requirements now apply. The positive healthcare effect must be demonstrated through a prospective comparative study that evidences the medical benefit of the application. This is intended to strengthen the insured’s trust in more complex DiGAs and ensure their efficacy and safety.
The DigiG prohibits DiGA manufacturers from making agreements with manufacturers of pharmaceuticals or medical aids that could restrict the choice of the insured or the freedom of therapy for physicians. This measure is intended to ensure that no DiGA is designed to be used only with specific drugs or aids, which could lead to negative cost implications for statutory health insurance.
A new feature is the introduction of accompanying success measurement. This aims to create more transparency regarding the use of DiGAs and will be used for future price determination. Manufacturers are required to submit anonymized and aggregated data that includes, among other things, the duration and frequency of use and patient satisfaction.
Turning point for the system of digital health applications
A significant portion of the reimbursement amounts for DiGAs must now be performance-based, at least 20% of the total amount. This regulation affects both new and existing reimbursement agreements and could pose a challenge for manufacturers as they will need to adjust their pricing accordingly.
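As a simple illustration (the reimbursement amount used here is hypothetical), with a negotiated reimbursement amount of 200 euros, the performance-based component would have to be at least

```latex
0.20 \times 200\ \text{euros} = 40\ \text{euros}
```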
The Digital Health Act marks a turning point for the system of digital health applications in Germany. While it creates new opportunities and expanded applications for DiGAs, it also brings stricter requirements and new challenges for manufacturers.
In particular, the expanded reimbursement eligibility, the introduction of success measurement, and the stricter proof requirements for higher risk classes are points that manufacturers need to consider strategically from the start. It remains to be seen how these regulations will perform in practice and what long-term effects they will have on the healthcare system.
Illegal health advertising: alternative practitioner praised frankincense in TV shopping
In a case decided by the Higher Regional Court of Celle on February 27, 2024, advertising by an alternative practitioner who promoted the effects of frankincense and curcumin against diseases such as osteoarthritis, Alzheimer’s and long Covid on a teleshopping channel was held to be inadmissible. Statements such as “curcumin tends to work better than diclofenac for arthritic complaints” were made to promote the food supplements.
The natural herbs mentioned are subject to the provisions of the Health Claims Regulation and the Food Information Regulation (LMIV), to which the teleshopping channel is also subject.
The operator of the channel refused to issue a cease-and-desist declaration, as the alternative practitioner allegedly only spoke about his own experiences and made no direct reference to the products.
However, the Higher Regional Court (OLG) of Celle ruled that it was irrelevant whether the alternative practitioner only spoke about his own experiences. What mattered was that the statements conveyed to viewers the impression that the products had healing properties. This violates Art. 7 para. 3 and 4 of the Food Information Regulation (LMIV), according to which no healing properties may be attributed to foods.
In addition, the advertising for the detox product violated the Health Claims Regulation, as health claims were made that are not permitted. The OLG ordered the channel to cease and desist in accordance with the provisions of the Unfair Competition Act (UWG) and the LMIV.
EU Data Act: New requirements for the design of connected products
The Data Act represents a significant change in a field of data regulation that has so far been shaped by the GDPR. The Data Act enables the sharing and utilization of personal and non-personal data (product data pursuant to Art. 2 No. 15 DA) for the benefit of users. It affects connected, smart products as defined in Art. 2 No. 5 DA, including medical devices such as laboratory devices connected to the internet via middleware and medical devices connected via Bluetooth, provided the associated companion app is internet-enabled. Connected services according to Art. 2 No. 6 DA, such as apps or other applications, however, lead to delimitation difficulties and legal uncertainties, as not every connected service communicates directly with the medical device (e.g. diary apps that merely prepare data graphically).
The scope of the Data Act only extends to raw data and not to analysis results generated using this data. The five major regulatory acts (DSA, DMA, DGA, AI Act, DA) are intended to generate more than 270 billion euros from IoT data and save up to 120 billion euros in the healthcare sector. This creates an area of tension between data protection law and data economy law, as the scope of the personal reference of data has a large potential overlap with machine-generated data, which falls under the Data Act. According to Art. 1 para. 5 DA, however, the GDPR should take precedence over the Data Act.
Another problem is the dysfunctional triangular relationship between the data owner (manufacturer), user and third parties (e.g. repair companies). The data owner often has a data feedback loop due to the product design, which affects its de facto control over the data generated by the user. The Data Act does not give the user a data ownership right, so the data remains a non-private, public good. Third parties could have an interest in the data in order to develop their own algorithms.
Machine-generated data is not covered by the GDPR, so the user can only assert limited claims to this data. The relationship between the manufacturer and third parties is also new, as the manufacturer could cite data protection or trade secrets as a counterargument. Primary data in the area of medical devices is particularly relevant as it includes physical, biological-chemical and regulatory indicated values as well as the number of quality controls. According to Art. 3 para. 1 and Art. 4 para. 1 of the Data Act, users should have access to this data or be given direct access to it. This could be understood as an in-situ right, i.e. a processing right in the server environment of the data controller or at least a download option that also implements the rights of data subjects under the GDPR (self-service portal).
A legislative loophole currently exists for real-time data access, subject to the technical feasibility and relevance of the data. The obligation to provide data does not apply to small and micro-enterprises in accordance with Art. 7 para. 1 DA.
Revision of the European Packaging Regulation
The EU is planning a new packaging regulation to increase the reusability of packaging and thus reduce greenhouse gas emissions, as required by the Climate Protection Act. A transition period of five years is intended to ensure that medical devices, in vitro diagnostics and human and veterinary medicines are also greenhouse gas-neutral by 2050.
Currently, 40% of plastics and 50% of paper in the EU are used to produce packaging made from primary raw materials. As 36% of packaging ends up as waste and around 30 million tons of plastic are incinerated every year, this results in CO2 emissions of around 80 million tons. The main aim of the new regulation is to improve the recyclability of materials and generate secondary raw materials.
The recycling rate in Germany was 46% in 2020, and even frontrunner Lithuania only reached 56%, far below the target for 2030. The new requirements cover the entire product life cycle, although special requirements, such as the Human Medicinal Products Directive, continue to apply. Compliance with the requirements is the responsibility of manufacturers and importers.
It is important to differentiate between recyclability and recycling rate. The former is assessed on the basis of the packaging components used and certified by external testing companies. The exact criteria are to be defined by delegated acts. Articles 5-10 of the regulation define the requirements that must be verified by technical documentation. As many details will only be specified later, not all details are known at the time of entry into force.
The regulation does not explicitly prohibit the use of certain materials, but sets requirements for the sorting and reuse of certain materials. According to Article 6, all packaging must be recyclable in future in order to replace primary raw materials with secondary raw materials.
Article 7 stipulates that plastics must contain a minimum amount of recycled plastics, which can be collected in the yellow garbage can, for example. Packaging for products with a medical purpose will not have to meet the requirements until 2035 and will be given a transitional period of five years as well as a special regulation on the use of post-consumer recyclates.
With this regulation, the EU is taking an important step towards a sustainable packaging industry and climate neutrality. Manufacturers and importers are now called upon to implement the new requirements and thus make a contribution to environmental protection.
In order to limit the impact on the environment, the polluter pays principle – one of the most important principles of EU environmental policy (Art. 191 TFEU) – should be implemented in the legislative process.
The Packaging Regulation will be a challenge for medical device manufacturers, as some packaging (e.g. blister packs) is not yet recyclable, and since many different types of primary packaging are used, it is difficult to estimate recyclability accurately. It is therefore advisable to start preparing for the new packaging requirements as soon as possible.
However, medical devices and pharmaceuticals benefit from the longest transition period of 12 years, but it is foreseeable that ecological packaging will become a competitive criterion. For some member states, however, the regulation represents a step backwards, as they already have more advanced systems that exceed the minimum standards of the regulation.
Exploring Artificial Intelligence and Cybersecurity in the context of the MDR
The European legal framework, including the Medical Device Regulation (“MDR”) and the upcoming regulation on AI (“AI Regulation”), places strict requirements on the safety and performance of AI-based medical devices.
The relationship between the Medical Device Regulation and the AI Regulation is characterized by supplementary certification requirements: While the MDR regulates the safety and performance of physical medical devices, the AI Regulation addresses specific risks and the data integrity of AI systems. Products that fall under both sets of regulations must fulfil the requirements of both regulations in order to be certified, which offers a double safety guarantee for both physical and software-related safety.
What is the challenge?
The challenge is to find a balance between promoting technological innovation and ensuring the necessary safety standards. The regulatory framework must be constantly evolving to meet both advances in AI technology and patient safety requirements. This requires a continuous review and adaptation of regulatory provisions in order to promote and protect both innovation and patient safety.
Quick updates from various legal and regulatory areas:
Data Protection Law
Unauthorised data queries in German hospitals
On 22 April 2024, the State Commissioner for Data Protection and the Right to Inspect Files (LDA Brandenburg) published its activity report for 2023, identifying cases of unauthorised data queries in various hospitals in which employees had accessed a colleague's electronic patient file without any official reason.
These incidents were categorised as employee excesses, and fines were imposed on the employees concerned. The report nevertheless emphasised that a possible breach of data protection must still be investigated in such cases.
Possible amendment to the German Federal Data Protection Act
The German government is planning amendments to the Federal Data Protection Act (BDSG) in order to implement commitments from the coalition agreement and the results of a BDSG evaluation. The draft bill intends to institutionalise the Data Protection Conference (DSK) in the BDSG and to introduce additional provisions to improve the enforcement and consistency of data protection.
In future, companies and research institutions with cross-border projects could be subject to only one state data protection supervisory authority, which should avoid legal uncertainties. Other provisions concern the application of the BDSG only to data processing with a domestic connection and the revision of the regulations on video surveillance of non-public areas.
Medical Law
BVMed and VDGH draw up white paper
The German Medical Technology Association (BVMed) and the Association of the Diagnostics Industry (VDGH) have drawn up a joint position paper. The paper highlights problems such as inefficient regulatory structures and bureaucratic obstacles that have arisen under the current regulatory framework.
The white paper offers concrete proposals for solutions, including the introduction of fast-track procedures for innovations, efficiency gains through the implementation of good administrative practice and harmonisation through centralisation in order to make Europe a competitive MedTech location again.
Baden-Württemberg State Social Court (LSG) on digital health applications
According to a decision of the LSG Baden-Württemberg of 3 April 2024 (Ref.: L 11 KR 579/24 ER-B), digital health applications pursuant to Section 33a(1) SGB V are medical devices of a low risk class whose main function is based on digital technologies and which are intended to support the detection, monitoring, treatment or alleviation of diseases in insured persons or the care provided by service providers. The main function of the digital health application must be characterised by digital technologies in all areas of application.
These applications must not merely serve to supplement or control other medical devices. This applies in particular if the software merely reminds the patient of the procedure and provides recommendations for adjusting the therapy, since in that case it does not offer any independent diagnostic or therapeutic services.
Compliance
Germany shows no improvement in the CPI index
Transparency International published the Corruption Perceptions Index (CPI) 2023 at the end of January 2024, which is based on data from 12 independent institutions. Germany scored 78 points, one point less than the previous year and the same score as ten years ago.
The organisation’s points of criticism include gaps in the fight against corruption, particularly among elected officials, shortcomings in whistleblower protection and the lack of effective corporate criminal law. Denmark, Finland, New Zealand and Norway top the ranking, while South Sudan, Syria, Venezuela and Somalia are at the bottom of the list.