Zentralbl Chir 2025; 150(S 01): S94-S95
DOI: 10.1055/s-0045-1809777
Abstracts
Innovative Technologies

Mixed Reality Visualization of AI-driven Ultrasound Imaging: Towards Dynamic Real-Time Thoracic Surgical Navigation

Authors

  • P Feodorovici

    1   University Hospital Bonn, Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, Bonn, Germany
    2   University Hospital Bonn, Bonn Surgical Technology Center (BOSTER), Bonn, Germany
  • J Arensmeyer

    1   University Hospital Bonn, Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, Bonn, Germany
    2   University Hospital Bonn, Bonn Surgical Technology Center (BOSTER), Bonn, Germany
  • H Bonsmann

    2   University Hospital Bonn, Bonn Surgical Technology Center (BOSTER), Bonn, Germany
  • D Subramani

    2   University Hospital Bonn, Bonn Surgical Technology Center (BOSTER), Bonn, Germany
  • A-N Vo

    2   University Hospital Bonn, Bonn Surgical Technology Center (BOSTER), Bonn, Germany
  • H Menghesha

    3   Helios Hospital Bonn/Rhein-Sieg, Department of Thoracic Surgery, Bonn, Germany
  • D Zalepugas

    1   University Hospital Bonn, Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, Bonn, Germany
    3   Helios Hospital Bonn/Rhein-Sieg, Department of Thoracic Surgery, Bonn, Germany
  • P Schnorr

    3   Helios Hospital Bonn/Rhein-Sieg, Department of Thoracic Surgery, Bonn, Germany
  • J Schmidt

    1   University Hospital Bonn, Division of Thoracic Surgery, Department of General, Visceral, Thoracic and Vascular Surgery, Bonn, Germany
    2   University Hospital Bonn, Bonn Surgical Technology Center (BOSTER), Bonn, Germany
    3   Helios Hospital Bonn/Rhein-Sieg, Department of Thoracic Surgery, Bonn, Germany
 
 

    Background Thoracic surgical procedures, including pleural drainage placement, chest wall biopsy, and rib osteosynthesis, require precise spatial guidance. Ultrasound (US) imaging is used for these procedures because of its low cost, real-time capability, and lack of radiation exposure. However, interpreting US images remains challenging, mainly due to the limitations of 2D visualization, operator-dependent variability, and the difficulty of accurately correlating US images with the patient's actual anatomy. By projecting US-derived anatomical information directly onto the patient, mixed reality (MR) could enhance surgeons' spatial orientation, improve procedural accuracy, and potentially shorten procedure times.

    Methods & Materials We developed an advanced MR platform that integrates US imaging with real-time 3D visualization. A Magic Leap 2 head-mounted display was used for holographic visualization, and a spryTrack 300 optical tracking system continuously tracked the position of the US probe. US images were acquired with a sonography device and transferred to a workstation for real-time inference by an AI segmentation model. A U-Net-based model was trained to automatically segment critical thoracic anatomical structures, including ribs, pleural effusions, lung surfaces, and the diaphragm. The processed segmentation data were then transferred via a WebSocket connection to the Unity application running on the Magic Leap 2, enabling accurate, real-time 3D rendering and visualization directly on a patient-analogue torso model fabricated using 3D printing and casting techniques.
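    The workstation-side flow described above (US frame acquisition, U-Net inference, WebSocket push to the Unity client on the Magic Leap 2) can be illustrated with a minimal Python sketch. The model file name, the grab_us_frame() capture helper, the port, and the JSON payload layout are illustrative assumptions rather than details of the actual platform.

import asyncio
import json

import numpy as np
import torch
import websockets

# Hypothetical trained U-Net exported as TorchScript; not part of the abstract itself.
MODEL = torch.jit.load("unet_thorax.pt").eval()
CLASS_NAMES = ["background", "rib", "pleural_effusion", "lung_surface", "diaphragm"]


def grab_us_frame() -> np.ndarray:
    """Placeholder for the actual US capture; returns one 8-bit grayscale frame."""
    return np.random.randint(0, 256, size=(512, 512), dtype=np.uint8)


def segment(frame: np.ndarray) -> np.ndarray:
    """Run U-Net inference on one grayscale frame and return a per-pixel label mask."""
    x = torch.from_numpy(frame).float().div_(255.0)[None, None]  # shape (1, 1, H, W)
    with torch.no_grad():
        logits = MODEL(x)                                        # shape (1, C, H, W)
    return logits.argmax(dim=1)[0].byte().numpy()                # shape (H, W)


async def stream(websocket):
    """Push one segmentation result per acquired frame to the connected Unity client."""
    while True:
        mask = segment(grab_us_frame())
        payload = {
            "classes": CLASS_NAMES,
            "shape": list(mask.shape),
            "mask": mask.flatten().tolist(),  # JSON for readability; a binary format is faster
        }
        await websocket.send(json.dumps(payload))
        await asyncio.sleep(0)                # yield control to the event loop


async def main():
    # The Unity app on the Magic Leap 2 would connect to ws://<workstation>:8765
    async with websockets.serve(stream, "0.0.0.0", 8765):
        await asyncio.Future()                # serve until cancelled


if __name__ == "__main__":
    asyncio.run(main())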

    Results Tests on the 3D-printed and cast torso model demonstrated robust performance of the segmentation algorithm, which achieved high accuracy. The real-time segmentation and rendering pipeline ran with minimal latency (less than 50 ms on average), ensuring seamless holographic visualization. Our MR system provided clear and accurate 3D anatomical overlays, facilitating intuitive navigation and needle placement in controlled experimental settings.
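    How a figure such as the average end-to-end latency might be logged can be sketched with simple per-frame timing; the wrapper below and the reported statistic are illustrative and are not the evaluation code behind the measurements above.

import statistics
import time

latencies_ms: list[float] = []


def timed(step, *args, **kwargs):
    """Run one pipeline step and record its wall-clock duration in milliseconds."""
    t0 = time.perf_counter()
    result = step(*args, **kwargs)
    latencies_ms.append((time.perf_counter() - t0) * 1000.0)
    return result


# Example: wrap the per-frame work, then summarize after a test run.
# mask = timed(segment, grab_us_frame())
# print(f"mean latency: {statistics.mean(latencies_ms):.1f} ms")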

    Conclusion Our preliminary evaluations on a 3D-printed torso indicate that combining US imaging with mixed reality and AI-driven segmentation is technically feasible and may simplify anatomical interpretation, potentially reducing complexity and cognitive load in procedures that require precise spatial orientation. However, these anticipated benefits remain to be validated in clinical studies involving real human anatomy.


    Publication History

    Article published online:
    25 August 2025

    © 2025. Thieme. All rights reserved.

    Georg Thieme Verlag KG
    Oswald-Hesse-Straße 50, 70469 Stuttgart, Germany