April 20, 2021
The joint project KIPos "AI-controlled image analysis for postoperative care", which started at the beginning of the year, received funding from the German Federal Ministry of Education and Research (BMBF) as part of the funding call "Adaptive Technologies for Society - Intelligent Interaction of Humans and Artificial Intelligence". The BMBF aims to promote innovative research and development projects in human-technology interaction that use artificial intelligence (AI) methods to optimally assist people with problem solving. Over its three-year term until January 2024, KIPos will receive funding of just under two million euros. The Fraunhofer Heinrich Hertz Institute (HHI) is involved in the joint project with its department "Vision and Imaging Technologies".
The postoperative care of cardiac surgery patients is highly challenging due to the high risk of morbidity and mortality. Life-threatening complications can occur at any stage and must be detected early. Intensive, continuous monitoring of patients is therefore crucial for the early detection and prevention of postoperative complications.
KIPos is designed to relieve the workload of nursing and medical staff and to improve the postoperative care of patients undergoing cardiac surgery. To this end, the developers combine interlocking concepts of AI-based, interactive support systems. Through AI-driven analysis of patient-related data in the ICU and the monitoring ward, nurses receive predictive assistance in their monitoring and treatment processes. The interactive visualization of aggregated information in particular is expected to significantly improve care. It also allows patients and their relatives to gain a better understanding of the patient's state of health after the procedure: visual aids make the recovery process easy to present and communicate in a comprehensible and transparent way.
The Fraunhofer HHI researchers pursue two core objectives in the KIPos project, both to be achieved through computer vision and machine learning. First, the team will analyze postoperative, image-based patient data to provide useful and comprehensible results to medical professionals in an automated fashion. The focus here is on two areas: on the one hand, video data (RGB, HSI) is analyzed to extract relevant vital signs without physical contact; on the other hand, radiological imaging modalities (CT, ultrasound) are evaluated in order to detect changes over time and to automatically recognize and display anatomical and histological changes relative to previous images.
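To illustrate the idea of contactless vital-sign extraction from video, the following is a minimal, hypothetical sketch (not the project's actual method): subtle blood-volume changes modulate skin color, so the mean green-channel intensity of a face region, tracked over time, carries a pulse signal whose dominant frequency gives the heart rate. The function name, parameters, and the synthetic input are assumptions for demonstration only.

```python
import numpy as np

def estimate_pulse_rate(green_means, fps, lo=0.7, hi=3.0):
    """Estimate pulse rate (bpm) from a time series of mean green-channel
    values of a face region: pick the dominant spectral peak within a
    plausible heart-rate band (lo..hi Hz, i.e. 42-180 bpm).
    Simplified, hypothetical rPPG-style sketch."""
    signal = np.asarray(green_means, dtype=float)
    signal = signal - signal.mean()              # remove the DC offset
    spectrum = np.abs(np.fft.rfft(signal))       # magnitude spectrum
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    band = (freqs >= lo) & (freqs <= hi)         # keep only plausible rates
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                           # Hz -> beats per minute

# Synthetic stand-in for real video: 20 s at 30 fps with a 1.2 Hz
# (72 bpm) oscillation plus measurement noise.
np.random.seed(0)
fps, seconds = 30, 20
t = np.arange(fps * seconds) / fps
greens = 120 + 0.5 * np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)
print(round(estimate_pulse_rate(greens, fps)))  # ≈ 72
```

In practice, real rPPG pipelines add face detection, motion compensation, and multi-channel color projections; this sketch only conveys the signal-processing core.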
The second core objective is to create a perceptual interface to patient data, analyses and devices, which could be implemented, for example, on a bedside monitor. For this purpose, the researchers will capture the interaction area at the patient's bedside with a 2D camera and analyze it using computer vision and machine learning methods, with the aim of understanding the current work situation in real time. Specifically, this should enable contactless, and thus sterile and ergonomic, interaction with medical devices. Furthermore, the perceptual interface is intended to proactively support staff by recognizing certain work processes without contact, such as the leg-raise test.
Both objectives are designed to support nursing staff and enable efficient, ergonomic and satisfying work while complying with strict hygiene requirements. In addition, the novel concepts should enable more reliable decisions on subsequent treatment steps and shorten response times.
Besides Fraunhofer HHI, the Digital Health Lab Duesseldorf as network coordinator, AICURA Medical and the Protestant University of Applied Sciences Ludwigsburg are also involved in the joint project.