AI in hospitals: New EU regulation places responsibility on operators

The EU AI Regulation 2024/1689 introduces, for the first time, binding obligations for a wide range of economic operators that use AI systems. The scope of application is defined in Article 2 of the Regulation and covers not only manufacturers but also operators, importers, distributors, and other actors along the entire value chain. Hospitals and other healthcare facilities that use AI systems are therefore particularly affected.

What previously mainly concerned technical requirements is now expanding into a comprehensive catalogue of obligations with ethical, clinical and liability dimensions. This requires a new risk culture in everyday clinical practice and a broader understanding of responsibility when dealing with AI.

The most important duties at a glance

  • Appointment of a supervisor: Each institution must appoint a responsible person with technical, clinical, and ethical expertise. This person has the right and duty to shut down systems in case of doubt (“emergency stop”).
  • Technical and organizational measures: A risk management system is mandatory; risks must be monitored, documented, minimized, and, where necessary, addressed immediately.
  • Transparency and traceability: AI decisions must be traceable for specialist staff, even in black-box models. Employees must be trained and involved in critical reflection.
  • Documentation requirements: Errors, interventions, or deviations must be recorded in full and reported where necessary (a minimal logging sketch follows this list).
  • Qualification and training: Employees must demonstrate their “AI competence.” Regular training and continuing education will become mandatory.
  • Patient information: Patients have the right to know when AI plays a role in the treatment process. They may challenge decisions and demand human oversight.
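
To illustrate the documentation duty, the following Python sketch shows how an internal log entry for AI-related errors, clinician overrides, or an emergency stop might be structured. The record type, field names, and the example event are purely illustrative assumptions; the regulation prescribes what must be documented, not how a hospital structures its log.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum


class EventType(Enum):
    ERROR = "error"                    # system produced an incorrect or implausible output
    OVERRIDE = "override"              # clinician overruled the AI recommendation
    DEVIATION = "deviation"            # system behaved differently than documented
    EMERGENCY_STOP = "emergency_stop"  # responsible person shut the system down


@dataclass
class AIIncidentRecord:
    """One entry in an internal AI incident log (illustrative only)."""
    system_name: str                     # e.g. the name of the triage or imaging tool
    event_type: EventType
    description: str                     # what happened, in plain language
    clinical_context: str                # ward, procedure, or care pathway involved
    human_action_taken: str              # how staff intervened or corrected the output
    reported_to_authority: bool = False  # whether external reporting was triggered
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


# Example: a clinician overrides an AI triage suggestion and the event is logged.
record = AIIncidentRecord(
    system_name="triage-assistant",
    event_type=EventType.OVERRIDE,
    description="AI suggested low urgency; clinician escalated based on vital signs.",
    clinical_context="Emergency department triage",
    human_action_taken="Manual re-triage by senior physician",
)
print(record)
```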

Focus on black box and adaptive systems

The regulation takes a particularly critical view of systems whose functioning is not readily comprehensible. Even if an AI operates as a black box, hospitals must ensure that traceability and control are in place. Adaptive systems that develop independently demand even more attention: they call not only for ongoing technical reviews but also for regular retraining of staff, so that clinical decisions continue to be made safely and responsibly.
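
As one illustration of what an ongoing technical review of an adaptive system could look like in practice, the following Python sketch compares recent agreement between AI output and clinician judgement against a baseline recorded at deployment. The metric, the threshold, and the escalation wording are assumptions made for illustration, not values taken from the regulation.

```python
# Minimal sketch of an ongoing technical review for an adaptive system:
# recent agreement between AI output and clinician judgement is compared
# against a baseline recorded at deployment. Metric and tolerance are
# illustrative assumptions, not prescribed by the regulation.

def agreement_rate(ai_outputs: list[str], clinician_labels: list[str]) -> float:
    """Share of cases where the AI output matched the clinician's final decision."""
    matches = sum(a == c for a, c in zip(ai_outputs, clinician_labels))
    return matches / len(ai_outputs) if ai_outputs else 0.0


def review_adaptive_system(baseline: float, recent: float, tolerance: float = 0.05) -> str:
    """Flag the system for escalation if performance drifts below the baseline."""
    if recent < baseline - tolerance:
        return "ESCALATE: drift detected, inform responsible person and document"
    return "OK: within tolerance, record the review in the risk file"


# Example review cycle with hypothetical audit data.
baseline_rate = 0.92  # agreement measured during acceptance testing
recent_rate = agreement_rate(
    ai_outputs=["urgent", "routine", "routine", "urgent"],
    clinician_labels=["urgent", "urgent", "routine", "urgent"],
)
print(review_adaptive_system(baseline_rate, recent_rate))
```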

Impact on hospitals

For healthcare facilities, this means extensive adjustments. The quality management system must be expanded to integrate new processes for monitoring and risk analysis. At the same time, cooperation with data protection, IT security, and ethics committees will become even closer. It is no longer conceivable for individual departments to go it alone.

The liability risk is particularly relevant: failures in supervision or documentation may in future be considered malpractice. For everyday hospital life, this also means establishing a new culture of discussion. AI must never be allowed to “decide” unsupervised; it must always remain a tool under human responsibility.

Scope for action in practice

For hospitals, this means that they must first review all AI systems in use and classify them according to risk categories. Based on this, it is advisable to set up an interdisciplinary committee that combines oversight, ethics, and IT security. This is the only way to consistently answer technical, legal, and clinical questions.
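
As a starting point for such a review, the following Python sketch outlines a simple inventory in which each AI system is recorded together with its risk category and the person or committee responsible for oversight. The structure, names, and example entries are hypothetical; the actual classification must rest on a documented legal and clinical assessment, not on a data structure.

```python
from dataclasses import dataclass
from enum import Enum


class RiskCategory(Enum):
    UNACCEPTABLE = "prohibited practice"
    HIGH = "high risk"
    LIMITED = "limited risk (transparency duties)"
    MINIMAL = "minimal risk"


@dataclass
class AISystemEntry:
    """One line in a hospital's AI inventory (illustrative structure)."""
    name: str
    vendor: str
    clinical_use: str
    risk_category: RiskCategory
    oversight_owner: str  # person or committee responsible for supervision


# Hypothetical inventory entries for illustration only.
inventory = [
    AISystemEntry("radiology-cad", "VendorA", "Detection support in CT imaging",
                  RiskCategory.HIGH, "AI oversight committee"),
    AISystemEntry("bed-planning-assistant", "VendorB", "Non-clinical capacity planning",
                  RiskCategory.MINIMAL, "IT operations"),
]

for entry in inventory:
    print(f"{entry.name}: {entry.risk_category.value} -> {entry.oversight_owner}")
```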

It is equally important to get employees on board: training and continuing education will become an integral part of everyday hospital life. At the same time, a culture must be created in which mistakes are openly documented and reflected upon instead of being swept under the rug. Finally, communication with patients must not be neglected. They have a right to know when AI is involved in their treatment and to request a human review if they wish.

Conclusion: Rediscovering responsibility

The EU AI Regulation 2024/1689 marks a turning point. It shifts the focus from technical innovation to responsible, human-centered use. For hospitals, this is both an opportunity and an obligation: they must adapt their structures and processes to meet liability, transparency, and ethical requirements in equal measure. The path to achieving this is challenging, but it will strengthen the trust of both patients and medical staff in the long term.