2024
Authors
Amaro, M; Oliveira, HP; Pereira, T;
Publication
2024 IEEE 37TH INTERNATIONAL SYMPOSIUM ON COMPUTER-BASED MEDICAL SYSTEMS, CBMS 2024
Abstract
Lung Cancer (LC) remains among the top causes of death worldwide and accounts for the highest mortality among all cancers. Several AI-based methods have been developed for the early detection of LC, using Computed Tomography (CT) images to identify the initial signs of the disease. Survival prediction could help clinicians tailor the treatment plan and subsequent care by identifying the most severe cases that need closer attention. In this study, several deep learning models were compared for predicting the survival of LC patients from CT images. The best-performing model, a CNN with 3 layers, achieved an AUC of 0.80, a Precision of 0.56, and a Recall of 0.64. The results show that CT images carry information that can be used to assess the survival of LC patients.
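As an illustration of the architecture family named in the abstract, the following is a minimal PyTorch sketch of a 3-layer CNN binary classifier for CT slices. The input size, channel widths, and classifier head are assumptions for the example, not the paper's reported configuration.

```python
# Minimal sketch of a 3-layer CNN for binary survival prediction from CT
# slices, assuming single-channel 128x128 inputs. All sizes below are
# illustrative assumptions; the abstract does not specify them.
import torch
import torch.nn as nn

class SurvivalCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 1),  # 128 halved three times -> 16 per spatial dim
        )

    def forward(self, x):
        return self.classifier(self.features(x))  # raw logit

model = SurvivalCNN()
logits = model(torch.randn(4, 1, 128, 128))  # batch of 4 CT slices
probs = torch.sigmoid(logits)                # survival probability per patient
```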
2024
Authors
Teiga, I; Sousa, JV; Silva, F; Pereira, T; Oliveira, HP;
Publication
UNIVERSAL ACCESS IN HUMAN-COMPUTER INTERACTION, PT III, UAHCI 2024
Abstract
Medical image visualization and annotation tools tailored for clinical users play a crucial role in disease diagnosis and treatment. Developing algorithms for annotation assistance, particularly machine learning (ML)-based ones, can be intricate, emphasizing the need for a user-friendly graphical interface for developers. Many software tools already address these requirements, but there is still room for improvement, making research into new tools highly compelling. The envisioned tool focuses on navigating sequences of DICOM images from diverse modalities, including Magnetic Resonance Imaging (MRI), Computed Tomography (CT), Ultrasound (US), and X-rays. Specific requirements include manual annotation features such as freehand drawing, copying, pasting, and modifying annotations. A scripting plugin interface is essential for running Artificial Intelligence (AI)-based models and adjusting their results. Additionally, adaptable surveys complement graphical annotations with textual notes, enriching the information provided. The user evaluation pinpointed areas for improvement, including additional useful functionalities and enhancements to the user interface for a more intuitive and convenient experience. Despite these suggestions, participants praised the application's simplicity and consistency, highlighting its suitability for the proposed tasks. The ability to revisit annotations ensures flexibility and ease of use in this context.
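Since the tool itself is not distributed with the abstract, the following hedged Python sketch shows only the core operation such a viewer builds on: loading and ordering a DICOM series with pydicom. The folder path and sort key are illustrative assumptions, not the tool's API.

```python
# Sketch of DICOM series navigation: read all files in a folder with
# pydicom and sort them into slice order. Path and file pattern are
# hypothetical; real studies may need more robust sorting.
from pathlib import Path
import pydicom

def load_series(folder):
    """Read the DICOM files in a folder, sorted by instance number."""
    datasets = [pydicom.dcmread(p) for p in Path(folder).glob("*.dcm")]
    datasets.sort(key=lambda ds: int(ds.InstanceNumber))
    return datasets

series = load_series("ct_study/")        # hypothetical path
frame = series[0].pixel_array            # numpy array for the first slice
print(series[0].Modality, frame.shape)   # e.g. 'CT', (512, 512)
```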
2024
Authors
Victoriano, M; Oliveira, L; Oliveira, HP;
Publication
Proceedings of the 19th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications, VISIGRAPP 2024, Volume 2: VISAPP, Rome, Italy, February 27-29, 2024.
Abstract
Climate change is driving the emergence of new pest species and diseases, threatening economies, public health, and food security. In Europe, olive groves are crucial for producing olive oil and table olives; however, the olive fruit fly (Bactrocera oleae) poses a significant threat, causing crop losses and financial hardship. Early disease and pest detection methods are crucial for addressing this issue. This work presents a pioneering comparative performance study between two state-of-the-art object detection models, YOLOv5 and YOLOv8, for detecting the olive fruit fly in trap images, marking the first application of these models in this context. The dataset was obtained by merging two existing datasets: the DIRT dataset, collected in Greece, and the CIMO-IPB dataset, collected in Portugal. To increase its diversity and size, the dataset was augmented, and both models were then fine-tuned. A set of metrics was calculated to assess the performance of both models. Early detection techniques like these can be incorporated into electronic traps to effectively safeguard crops from the adverse impacts of climate change, ultimately ensuring food security and sustainable agriculture.
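For readers unfamiliar with the workflow, this is a hedged sketch of fine-tuning YOLOv8 with the ultralytics package, as the abstract describes doing on the merged trap-image dataset. The dataset YAML name, model size, and hyperparameters are assumptions, not the paper's setup.

```python
# Sketch of YOLOv8 fine-tuning for olive fruit fly detection in trap
# images. The data config and settings below are illustrative only.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")              # pretrained nano model
model.train(
    data="olive_fly.yaml",              # hypothetical merged DIRT + CIMO-IPB config
    epochs=100,
    imgsz=640,
)
metrics = model.val()                   # precision/recall/mAP on the validation split
results = model("trap_photo.jpg")       # inference on a new trap image
```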
2024
Authors
Pinheiro, MR; Fernandes, LE; Carneiro, IC; Carvalho, SD; Henrique, RM; Tuchin, VV; Oliveira, HP; Oliveira, LM;
Publication
JOURNAL OF BIOPHOTONICS
Abstract
With the objective of developing new methods to acquire diagnostic information, the broadband absorption coefficient spectra, μa(λ), of healthy and chromophobe renal cell carcinoma kidney tissues were reconstructed. By performing a weighted sum of the absorption spectra of proteins, DNA, oxygenated and deoxygenated hemoglobin, lipids, water, melanin, and lipofuscin, a good match to the experimental μa(λ) of both kidney conditions was obtained. The weights used in these reconstructions were estimated with the least squares method and, assuming a total water content of 77% in both kidney tissues, the concentrations of the other tissue components were calculated. It was shown that with the development of cancer, the concentrations of proteins, DNA, oxygenated hemoglobin, lipids, and lipofuscin increase, while the concentration of melanin decreases. Future studies based on minimally invasive spectral measurements will allow cancer diagnosis using the proposed approach.
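The reconstruction described above reduces to a linear inverse problem: the measured spectrum is modeled as a weighted sum of known chromophore spectra, and the weights are estimated by least squares. A minimal sketch follows, using a non-negative variant (scipy's nnls) and random placeholder basis spectra; in practice the basis would be tabulated absorption spectra of proteins, DNA, oxygenated and deoxygenated hemoglobin, lipids, water, melanin, and lipofuscin.

```python
# Sketch of fitting a measured absorption spectrum as a weighted sum of
# chromophore spectra via non-negative least squares. Basis spectra here
# are random placeholders, not real chromophore data.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
wavelengths = np.linspace(400, 1000, 301)    # nm
basis = rng.random((wavelengths.size, 8))    # one column per chromophore (placeholder)
true_w = np.array([0.3, 0.05, 0.2, 0.1, 0.15, 0.77, 0.02, 0.03])
mu_a = basis @ true_w                        # synthetic "measured" spectrum

weights, residual = nnls(basis, mu_a)        # non-negative weight per chromophore
print(np.round(weights, 3))                  # recovers true_w for this noiseless case
```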
2024
Authors
Santos, T; Oliveira, H; Cunha, A;
Publication
COMPUTER SCIENCE REVIEW
Abstract
In recent years, the number of crimes involving weapons has grown on a large scale worldwide, mainly in locations where enforcement is lacking or possessing weapons is legal. Combating this type of criminal activity requires identifying criminal behavior early, allowing police and law enforcement agencies to act immediately. Although the human visual system is highly evolved and able to process images quickly and accurately, an individual who watches very similar footage for a long time is prone to slowness and lapses of attention. In addition, large surveillance systems with numerous devices require a surveillance team, which increases the cost of operation. Several computer vision solutions exist for automatic weapon detection; however, they perform poorly in challenging contexts. A systematic review of the current literature on deep learning-based weapon detection was conducted to identify the methods used, the main characteristics of the existing datasets, and the main problems in the area of automatic weapon detection. The most used models were Faster R-CNN and the YOLO architecture. The use of realistic images and synthetic data showed improved performance. Several challenges were identified in weapon detection, such as poor lighting conditions and the difficulty of detecting small weapons, the latter being the most prominent. Finally, some future directions are outlined, with a special focus on small weapon detection.
2024
Authors
Pinheiro, MR; Carvalho, MI; Oliveira, LM;
Publication
JOURNAL OF BIOPHOTONICS
Abstract
Computer simulations, which are performed at a single wavelength at a time, have traditionally been used to estimate the optical properties of tissues, and their results need to be interpolated. For a broadband estimation of tissue optical properties, computer simulations therefore become time consuming and computationally demanding. When spectral measurements are available for a tissue, the photon diffusion approximation allows simple and direct calculations that yield the broadband spectra of some optical properties. Additionally estimating the reduced scattering coefficient at a small number of discrete wavelengths enables further calculations that yield the spectra of other optical properties. This study used spectral measurements from the heart muscle to explain the calculation pipeline for obtaining a complete set of spectral optical properties and to show its versatility for use with other tissues in various biophotonics applications.
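As a hedged illustration of the kind of direct, whole-spectrum calculation the diffusion approximation enables, the sketch below derives the effective attenuation coefficient and light penetration depth from assumed μa and μs' spectra at all wavelengths at once, with no per-wavelength simulation. The placeholder spectra and the two standard diffusion-theory relations only illustrate the approach; the paper's pipeline is more complete.

```python
# Sketch of broadband diffusion-approximation calculations: given mu_a and
# the reduced scattering mu_s' (synthetic placeholders here), compute the
# effective attenuation coefficient and penetration depth spectra directly.
import numpy as np

wavelengths = np.linspace(600, 1000, 201)                      # nm
mu_a = 0.5 + 0.2 * np.exp(-(wavelengths - 970) ** 2 / 2e3)     # cm^-1, placeholder
mu_s_prime = 20.0 * (wavelengths / 600.0) ** -1.3              # cm^-1, typical power law

mu_eff = np.sqrt(3.0 * mu_a * (mu_a + mu_s_prime))  # effective attenuation, cm^-1
delta = 1.0 / mu_eff                                # penetration depth, cm

idx = np.abs(wavelengths - 800).argmin()
print(f"penetration depth at 800 nm: {delta[idx]:.3f} cm")
```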