Stefanie Speidel has been a professor of “Translational Surgical Oncology” at the National Center for Tumor Diseases (NCT) Dresden since 2017 and one of the speakers of the DFG Cluster of Excellence CeTI since 2019. She received her PhD from the Karlsruhe Institute of Technology (KIT) in 2009 and led the junior research group “Computer-Assisted Surgery” at KIT from 2012 to 2016. Her current research interests include image- and robot-guided surgery, soft-tissue navigation, sensor-based surgical training, and intraoperative workflow analysis based on various sensor signals in the context of the operating room of the future. She regularly organizes workshops and challenges, including the Endoscopic Vision Challenge @ MICCAI, and has served as general chair and program chair for a number of international events, including the IPCAI and MICCAI conferences.
Increasingly powerful technological developments in surgery, such as modern operating rooms (ORs) featuring digital, interconnected, and robotic devices, provide a huge amount of valuable data that can be used to improve patient therapy. Although a lot of data is available, making sense of it is an overwhelming challenge for physicians, and the surgical outcome still depends heavily on the experience of the surgical staff.
In this talk, I’ll present our recent research on AI-assisted surgery, with a specific focus on the analysis of intraoperative data. The goal is to bridge the gap between data science, sensors, and robotics to enhance the collaboration between surgeons and cyber-physical systems, and to democratize surgical skills by quantifying surgical experience and making it accessible to machines. Several examples of optimizing the therapy of the individual patient along the surgical treatment path are given. The talk focuses on synthetic data generation, intraoperative context-aware assistance, and data-driven surgical training. Finally, remaining challenges and strategies to overcome them are discussed.
Dr. Fahrig is Head of Innovation for the Business Area Advanced Therapies at Siemens Healthcare GmbH, and Professor in the Department of Computer Science at the Pattern Recognition Lab of Friedrich-Alexander-Universität, Erlangen, Germany. She received a BSc in Physics and an MSc in Medical Biophysics from the University of Toronto, and a PhD, also in Medical Biophysics, from the University of Western Ontario in London, Canada.
Dr. Fahrig is an expert in the design, characterization, and implementation of x-ray imaging systems for diagnostic and image-guided procedures. During her PhD, she co-pioneered C-arm-based cone-beam CT imaging. As a faculty member in the Department of Radiology at Stanford, she and her team—in collaboration with national and international clinical and scientific colleagues—developed new MR-compatible hardware, x-ray detectors, image reconstruction and correction algorithms, and protocols for clinical applications. In her current role at Siemens Healthcare she directs a group of 60 scientists designing, prototyping, and testing new applications for multi-modality image guidance of minimally invasive therapies – integrating information and hiding complexity. She serves on the Board of Directors of SPIE, sits on the Editorial Board of the Journal of Medical Imaging, and is a Fellow of the AAPM and AIMBE.
One thing that the COVID-19 pandemic has taught us is that remote medical care is not only possible but also necessary to ensure that we provide equal access to care for all. Today, remote care focuses on those services that can be provided without direct patient access: monitoring of the patients’ condition, and performance of preventive and control check-ups outside of medical facilities. At Advanced Therapies we are working on extending beyond ‘remote consultation’ to ‘remote delivery of patient care’. I will describe how we are integrating robotics with a Learning/Digital Interventional Suite, providing AI-based applications to enable safety-at-a-distance, support procedural workflow, enhance x-ray image quality, and maintain clear communication lines between the local team and the remote team. Clinical evidence needs and regulatory hurdles to bring these innovations to product will be described.
Inverse problems -- the estimation of hidden internal parameters of a system from measured observables -- are ubiquitous in science and medicine. A defining characteristic of these problems is that they are usually ill-posed and have no unique solution. Traditional approaches to overcoming this difficulty, e.g. regularized optimization or sampling-based Bayesian inference, are unsatisfactory for various reasons: optimization typically yields only point estimates without uncertainty quantification, and sampling must be repeated from scratch for every new dataset. This talk will present BayesFlow, a new approach for efficient Bayesian treatment of inverse problems on the basis of invertible neural networks, i.e. networks representing bijective mappings. These networks are trained using simulations of the system under study and can then be repeatedly applied to real data to estimate full posterior probability distributions of the hidden parameters of interest. The talk will explain the basic BayesFlow methodology, demonstrate various applications in image analysis and medicine, and describe ways to verify models with regard to uncertainty calibration and model misspecification detection.
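To illustrate the key ingredient of such architectures, the sketch below implements a toy affine coupling layer in plain NumPy. This is not BayesFlow’s actual implementation (which stacks many such layers and trains them on simulated parameter–data pairs); it is only a minimal, self-contained example of why coupling layers are exactly invertible: the scale and shift applied to one half of the input depend only on the other, untouched half, so the transformation can be undone in closed form. All names (`AffineCoupling`, the weight shapes) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

class AffineCoupling:
    """Toy affine coupling layer: splits the input in two halves and
    rescales/shifts the second half conditioned on the first.
    Because the conditioning half passes through unchanged, the
    mapping is bijective and has an explicit inverse."""

    def __init__(self, dim, hidden=16):
        half = dim // 2
        # Tiny fixed random "network" (one hidden layer) that produces
        # log-scale and shift parameters from the conditioning half.
        self.W1 = rng.normal(0.0, 0.5, (half, hidden))
        self.W2 = rng.normal(0.0, 0.5, (hidden, 2 * half))

    def _scale_shift(self, x1):
        h = np.tanh(x1 @ self.W1)
        params = h @ self.W2
        half = params.shape[1] // 2
        log_s, t = params[:, :half], params[:, half:]
        return np.tanh(log_s), t  # bounded log-scale for numerical stability

    def forward(self, x):
        x1, x2 = np.split(x, 2, axis=1)
        log_s, t = self._scale_shift(x1)   # depends on x1 only
        y2 = x2 * np.exp(log_s) + t
        return np.concatenate([x1, y2], axis=1)

    def inverse(self, y):
        y1, y2 = np.split(y, 2, axis=1)
        log_s, t = self._scale_shift(y1)   # y1 == x1, so params match
        x2 = (y2 - t) * np.exp(-log_s)
        return np.concatenate([y1, x2], axis=1)

layer = AffineCoupling(dim=4)
x = rng.normal(size=(8, 4))
x_rec = layer.inverse(layer.forward(x))
print(np.allclose(x, x_rec))  # exact round-trip recovery
```

In BayesFlow, compositions of such invertible blocks are conditioned on (summaries of) the observed data, so that after training on simulations the network maps a simple base distribution to the posterior over the hidden parameters for any new measurement, without re-running inference from scratch.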
Ullrich Köthe received a diploma in Physics and a PhD in Computer Science and is now Professor for Machine Learning at Heidelberg University. He heads the group on "Explainable Machine Learning" in the CVL Lab of the Interdisciplinary Center for Scientific Computing. Explainable learning aims to open up the black box of successful machine learning methods, in particular deep neural networks, to provide insight rather than mere numbers. To this end, our team designs powerful new algorithms on the basis of invertible neural networks and applies them to medicine, image analysis, and the natural and life sciences.