Details
Name
Joana Vale Sousa
Role
Research Assistant
Since
1st December 2020
Nationality
Portugal
Centre
Telecommunications and Multimedia
Contacts
+351 222 094 000
joana.v.sousa@inesctec.pt
2025
Authors
Sousa, JV; Oliveira, HP; Pereira, T;
Publication
2025 IEEE 25th International Conference on Bioinformatics and Bioengineering (BIBE)
Abstract
2025
Authors
Amaro, M; Sousa, JV; Gouveia, M; Oliveira, HP; Pereira, T;
Publication
Measurement and Evaluations in Cancer Care
Abstract
2025
Authors
Freire, AM; Rodrigues, EM; Sousa, JV; Gouveia, M; Ferreira-Santos, D; Pereira, T; Oliveira, HP; Sousa, P; Silva, AC; Fernandes, MS; Hespanhol, V; Araújo, J;
Publication
UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION, UAHCI 2025, PT I
Abstract
Lung cancer remains one of the most common and lethal forms of cancer, with approximately 1.8 million deaths annually, and is often diagnosed at advanced stages. Early detection is crucial, but it depends on physicians' accurate interpretation of computed tomography (CT) scans, a process susceptible to human limitations and variability. ByMe has developed a medical image annotation and anonymization tool designed to address these challenges through a human-centered approach. The tool enables physicians to seamlessly add structured attribute-based annotations (e.g., size, location, morphology) directly within their established workflows, ensuring intuitive interaction. Integrated with Picture Archiving and Communication Systems (PACS), the tool streamlines the annotation process and enhances usability by offering a dedicated worklist for retrospective and prospective case analysis. Robust anonymization features ensure compliance with privacy regulations such as the General Data Protection Regulation (GDPR), enabling secure dataset sharing for research and for developing artificial intelligence (AI) models. Designed to empower AI integration, the tool not only facilitates the creation of high-quality datasets but also lays the foundation for incorporating AI-driven insights directly into clinical workflows. By focusing on usability, workflow integration, and privacy, this innovation bridges the gap between precision medicine and advanced technology. By providing the means to develop and train AI models for lung cancer detection, it holds the potential to significantly accelerate diagnosis as well as enhance its accuracy and consistency.
2024
Authors
Teiga, I; Sousa, JV; Silva, F; Pereira, T; Oliveira, HP;
Publication
UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION, PT III, UAHCI 2024
Abstract
Significant medical image visualization and annotation tools, tailored for clinical users, play a crucial role in disease diagnosis and treatment. Developing algorithms for annotation assistance, particularly machine learning (ML)-based ones, can be intricate, emphasizing the need for a user-friendly graphical interface for developers. Many software tools are available to meet these requirements, but there is still room for improvement, making the research for new tools highly compelling. The envisioned tool focuses on navigating sequences of DICOM images from diverse modalities, including Magnetic Resonance Imaging (MRI), Computed Tomography (CT) scans, Ultrasound (US), and X-rays. Specific requirements involve implementing manual annotation features such as freehand drawing, copying, pasting, and modifying annotations. A scripting plugin interface is essential for running Artificial Intelligence (AI)-based models and adjusting results. Additionally, adaptable surveys complement graphical annotations with textual notes, enhancing information provision. The user evaluation results pinpointed areas for improvement, including incorporating some useful functionalities, as well as enhancements to the user interface for a more intuitive and convenient experience. Despite these suggestions, participants praised the application's simplicity and consistency, highlighting its suitability for the proposed tasks. The ability to revisit annotations ensures flexibility and ease of use in this context.
2023
Authors
Freitas, P; Silva, F; Sousa, JV; Ferreira, RM; Figueiredo, C; Pereira, T; Oliveira, HP;
Publication
SCIENTIFIC REPORTS
Abstract
Emerging evidence of the relationship between microbiome composition and the development of numerous diseases, including cancer, has led to increasing interest in the study of the human microbiome. Technological breakthroughs in DNA sequencing methods have propelled microbiome studies with large numbers of samples, creating the need for more sophisticated data-analytical tools to analyze this complex relationship. The aim of this work was to develop a machine learning-based approach to distinguish the type of cancer based on tissue-specific microbial information, assessing the human microbiome as valuable predictive information for cancer identification. For this purpose, Random Forest algorithms were trained for the classification of five types of cancer (head and neck, esophageal, stomach, colon, and rectum) with samples provided by The Cancer Microbiome Atlas database. One-versus-all and multi-class classification studies were conducted to evaluate the discriminative capability of the microbial data across increasing levels of cancer site specificity, with results showing a progressive rise in the difficulty of accurate sample classification. Random Forest models achieved promising performances when predicting head and neck, stomach, and colon cancer cases, with the latter returning accuracy scores above 90% across the different studies conducted. However, there was also increased difficulty when discriminating esophageal and rectum cancers, with the models failing to adequately differentiate rectum from colon cancer cases, and esophageal from head and neck and stomach cancers. These results indicate that anatomically adjacent cancers can be more complex to identify due to microbial similarities. Despite the limitations, microbiome data analysis using machine learning may advance novel strategies to improve cancer detection and prevention, and decrease disease burden.
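The one-versus-all and multi-class Random Forest setup described in this abstract can be sketched as follows. This is a minimal illustration only: the feature matrix and labels are synthetic stand-ins (random abundances and randomly assigned cancer sites), not data from The Cancer Microbiome Atlas, and all parameter choices are assumptions rather than the paper's actual configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for tissue-specific microbial abundance profiles.
rng = np.random.default_rng(0)
n_samples, n_taxa = 300, 50
X = rng.random((n_samples, n_taxa))
cancers = ["head_neck", "esophageal", "stomach", "colon", "rectum"]
y = rng.choice(cancers, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Multi-class study: a single forest discriminates all five cancer sites.
multi = RandomForestClassifier(n_estimators=100, random_state=0)
multi.fit(X_train, y_train)
multi_acc = accuracy_score(y_test, multi.predict(X_test))

# One-versus-all study: one binary forest per cancer site
# (site vs. all other sites).
ova_acc = {}
for site in cancers:
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_train, y_train == site)
    ova_acc[site] = accuracy_score(y_test == site, clf.predict(X_test))

print(f"multi-class accuracy: {multi_acc:.2f}")
for site, acc in ova_acc.items():
    print(f"{site} vs. rest accuracy: {acc:.2f}")
```

On real microbiome data, the accuracy gap between, say, the colon-vs-rest and rectum-vs-rest classifiers is what the abstract attributes to microbial similarity between anatomically adjacent sites.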