The goal of the lab is to tailor and validate new, inherently explainable AI techniques and guidelines for clinical decision-making. A further novelty is that the models will be developed by considering all available data (e.g., patient-related data and imaging data) simultaneously. For now, the focus is on oncology, with applications at both LUMC and partner institute Amsterdam UMC. Topics include palliative care, gynecological cancer care, and care for patients with paragangliomas (rare tumors that occur in the head and neck area).
Scientific director Peter Bosman (group leader of the Evolutionary Intelligence research group at CWI and professor at Delft University of Technology): "Clinical decision support based on AI can have great added value for physicians and patients. Modern AI models, especially deep learning models, can be very powerful, but it is not always easy to explain how they arrive at a certain prediction." Therefore, CWI and LUMC are developing new forms of AI that aim to be as explainable as possible, so that physicians and patients can understand why a certain prediction is made.
Trust
"A lack of explainability hampers the widespread use of AI for medical applications", adds scientific director Tanja Alderliesten (associate professor at the Department of Radiation Oncology of LUMC). “Having AI models that are inherently explainable builds trust. These can support well-informed shared decision-making practices regarding for example the choice of preferred treatment or follow-up procedure (i.e. frequency of monitoring). Additionally, physicians can gain new knowledge from explainable models as they can be readily inspected and understood, including for instance how certain factors together play a specific role in making predictions.”
*An ICAI (Innovation Center for Artificial Intelligence) Lab is a research collaboration between industrial, governmental, or not-for-profit partners on the one hand and knowledge institutes on the other. ICAI Labs focus on AI technology and house a minimum of five PhD students.
Funding for the five PhD positions of the 'Explainable AI for Health' Lab has been acquired externally via the Dutch Research Council (NWO) and the Gieskes-Strijbis Fund.