2023
Authors
Patrício, C; Teixeira, LF; Neves, JC;
Publication
CoRR
Abstract
2024
Authors
Miranda, I; Agrotis, G; Tan, RB; Teixeira, LF; Silva, W;
Publication
46th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBC 2024, Orlando, FL, USA, July 15-19, 2024
Abstract
Breast cancer, the most prevalent cancer among women, poses a significant healthcare challenge, demanding effective early detection for optimal treatment outcomes. Mammography, the gold standard for breast cancer detection, employs low-dose X-rays to reveal tissue details, particularly cancerous masses and calcium deposits. This work evaluates the impact of incorporating anatomical knowledge on the performance and robustness of a breast cancer classification model. To this end, we devised a methodology to generate anatomical pseudo-labels that simulate plausible anatomical variations in cancer masses. These variations, encompassing changes in mass size and intensity, closely reflect concepts from the BI-RADS scale. In addition to the anatomy-based augmentation, we propose a novel loss term that promotes the learning of cancer grading. Experiments were conducted on publicly available datasets simulating both in-distribution and out-of-distribution scenarios to thoroughly assess the model's performance under various conditions.
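The anatomy-based augmentation described in the abstract can be pictured as a controlled perturbation of a segmented mass. The sketch below is only an illustrative reading of that idea, assuming a mammogram patch normalized to [0, 1] and a binary lesion mask; the function and parameter names are hypothetical and do not reflect the authors' implementation.

import numpy as np
from scipy import ndimage

def augment_mass(image, mask, grow_px=0, intensity_shift=0.0):
    """Return an augmented copy of `image` plus the corresponding pseudo-label mask.

    image           : 2D float array in [0, 1] (mammogram patch)
    mask            : 2D bool array, True inside the annotated mass
    grow_px         : >0 dilates the mass outline, <0 erodes it (size variation)
    intensity_shift : additive change applied inside the resulting mass region
    """
    out = image.copy()
    new_mask = mask.copy()
    if grow_px > 0:
        new_mask = ndimage.binary_dilation(mask, iterations=grow_px)
        rim = new_mask & ~mask
        out[rim] = image[mask].mean()      # fill the grown rim with mass-like intensity
    elif grow_px < 0:
        new_mask = ndimage.binary_erosion(mask, iterations=-grow_px)
        rim = mask & ~new_mask
        out[rim] = image[~mask].mean()     # return the removed rim to background tissue
    # Intensity variation inside the (possibly resized) mass, loosely mirroring density changes
    out[new_mask] = np.clip(out[new_mask] + intensity_shift, 0.0, 1.0)
    return out, new_mask

Each (out, new_mask) pair produced this way would act as an anatomical pseudo-label for training; how such pairs feed the classification model and the grading loss is specific to the paper and not reproduced here.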
2025
Authors
Patrício, C; Torto, IR; Cardoso, JS; Teixeira, LF; Neves, JC;
Publication
CoRR
Abstract
2024
Authors
Torto, IR; Cardoso, JS; Teixeira, LF;
Publication
Medical Imaging with Deep Learning, 3-5 July 2024, Paris, France.
Abstract
2024
Authors
Aubard, M; Madureira, A; Teixeira, LF; Pinto, J;
Publication
CoRR
Abstract
With the growing interest in underwater exploration and monitoring, autonomous underwater vehicles have become essential. The recent interest in onboard deep learning (DL) has advanced real-time environmental interaction capabilities, relying on efficient and accurate vision-based DL models. However, the predominant use of sonar in underwater environments, characterized by limited training data and inherent noise, poses challenges to model robustness. This increase in autonomy raises safety concerns for deploying such models during underwater operations, potentially leading to hazardous situations. This article provides the first comprehensive overview of sonar-based DL from the perspective of robustness. It surveys sonar-based DL models for perception tasks such as classification, object detection, segmentation, and simultaneous localization and mapping. Furthermore, it systematizes state-of-the-art sonar-based datasets, simulators, and robustness methods, such as neural network verification, out-of-distribution detection, and adversarial attacks. The article highlights the lack of robustness in sonar-based DL research and suggests future research pathways, notably establishing a baseline sonar-based dataset and bridging the simulation-to-reality gap.
2024
Authors
Aubard, M; Antal, L; Madureira, A; Teixeira, LF; Ábrahám, E;
Publication
CoRR
Abstract
This paper introduces ROSAR, a novel framework for enhancing the robustness of deep learning object detection models tailored to side-scan sonar (SSS) images generated by autonomous underwater vehicles using sonar sensors. By extending our prior work on knowledge distillation (KD), the framework integrates KD with adversarial retraining to address the dual challenges of model efficiency and robustness against SSS noise. We introduce three novel, publicly available SSS datasets capturing different sonar setups and noise conditions. We propose and formalize two SSS safety properties and use them to generate adversarial datasets for retraining. Through a comparative analysis of projected gradient descent (PGD) and patch-based adversarial attacks, ROSAR demonstrates significant improvements in model robustness and detection accuracy under SSS-specific conditions, enhancing the model's robustness by up to 1.85%. ROSAR is available at https://github.com/remaro-network/ROSAR-framework.
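PGD is the standard way to generate adversarial examples for the kind of retraining the abstract describes. The following minimal sketch shows an L-infinity PGD attack in PyTorch under generic assumptions (a differentiable model and loss, inputs in [0, 1]); it is not the ROSAR implementation, and all names are placeholders.

import torch

def pgd_attack(model, images, targets, loss_fn, eps=8/255, alpha=2/255, steps=10):
    """Return adversarial examples within an L_inf ball of radius eps around `images`."""
    # Start from a random point inside the ball.
    adv = images.clone().detach()
    adv = (adv + torch.empty_like(adv).uniform_(-eps, eps)).clamp(0.0, 1.0)
    for _ in range(steps):
        adv.requires_grad_(True)
        loss = loss_fn(model(adv), targets)                  # objective to maximize
        grad = torch.autograd.grad(loss, adv)[0]
        with torch.no_grad():
            adv = adv + alpha * grad.sign()                  # gradient ascent step
            adv = images + (adv - images).clamp(-eps, eps)   # project back into the ball
            adv = adv.clamp(0.0, 1.0)                        # keep a valid pixel range
        adv = adv.detach()
    return adv

For an object detector, the same loop applies with the detection loss in place of loss_fn; the perturbed images can then be added to an adversarial retraining set, which is the general scheme the paper builds on.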