2024
Authors
Freitas, N; Montenegro, H; Cardoso, MJ; Cardoso, JS;
Publication
IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING, ISBI 2024
Abstract
Breast cancer locoregional treatment causes alterations to the physical aspect of the breast, often negatively impacting the self-esteem of patients unaware of the possible aesthetic outcomes of those treatments. To improve patients' self-esteem and enable a more informed choice of treatment when multiple options are available, the ability to predict how the patient might look after surgery would be of invaluable help. However, no prior work has addressed predicting the aesthetic outcomes of breast cancer treatment. As a first step, we compare traditional computer vision and deep learning approaches to reproduce the asymmetries of post-operative patients on pre-operative breast images. The results suggest that the traditional approach is better at altering the contour of the breast, whereas the deep learning approach succeeds in realistically altering the position and direction of the nipple.
2024
Authors
Rio-Torto, I; Gonçalves, T; Cardoso, JS; Teixeira, LF;
Publication
IEEE INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING, ISBI 2024
Abstract
In fields that rely on high-stakes decisions, such as medicine, interpretability plays a key role in promoting trust and facilitating the adoption of deep learning models by clinical communities. In the medical image analysis domain, gradient-based class activation maps are the most widely used explanation methods, and the field lacks a more in-depth investigation into inherently interpretable models that focus on integrating knowledge ensuring the model learns the correct rules. B-cos networks, a new approach for increasing the interpretability of deep neural networks by inducing weight-input alignment during training, have shown promising results on natural image classification. In this work, we study the suitability of B-cos networks for the medical domain by testing them on different use cases (skin lesions, diabetic retinopathy, cervical cytology, and chest X-rays) and conducting a thorough evaluation across several explanation quality assessment metrics. We find that, just as in natural image classification, B-cos explanations yield more localised maps, but it is not clear that they are better than other methods' explanations when further explanation properties are considered.
2024
Authors
Pereira, C; Cruz, RPM; Fernandes, JND; Pinto, JR; Cardoso, JS;
Publication
IEEE Transactions on Intelligent Vehicles
Abstract
2024
Authors
Cristino, R; Cruz, RPM; Cardoso, JS;
Publication
CoRR
Abstract
2024
Authors
Gonçalves, T; Arias, DP; Willett, J; Hoebel, KV; Cleveland, MC; Ahmed, SR; Gerstner, ER; Cramer, JK; Cardoso, JS; Bridge, CP; Kim, AE;
Publication
CoRR
Abstract
2024
Authors
Gonçalves, T; Hedström, A; Pahud de Mortanges, A; Li, X; Müller, H; Cardoso, S; Reyes, M;
Publication
Trustworthy AI in Medical Imaging
Abstract
In the healthcare context, artificial intelligence (AI) has the potential to power decision support systems and help health professionals in their clinical decisions. However, given its complexity, AI is usually seen as a black box that receives data and outputs a prediction. This behavior may jeopardize the adoption of the technology by the healthcare community, which values the existence of explanations to justify a clinical decision. In addition, developers must have a strategy to assess and audit these systems to ensure their reproducibility and quality in production. The field of interpretable artificial intelligence emerged to study how these algorithms work and to clarify their behavior. This chapter reviews several interpretability methods for AI algorithms in medical imaging, discussing their functioning, limitations, benefits, applications, and evaluation strategies. The chapter concludes with considerations that might contribute to bringing these methods closer to the daily routine of healthcare professionals. © 2025 Elsevier Inc. All rights reserved.