2020
Authors
Macedo, Jd; Aloísio, J; Gonçalves, N; Pereira, R; Saraiva, J;
Publication
ASE Workshops
Abstract
2020
Authors
Gerson Pech; Catarina Delgado;
Publication
Abstract
2020
Authors
Pereira, MA; Machete, IF; Ferreira, DC; Marques, RC;
Publication
SOCIO-ECONOMIC PLANNING SCIENCES
Abstract
Health is one of the most fundamental human rights. In that sense, health systems were created to provide populations with the organisations, institutions, and resources needed to meet their needs. However, health inequalities are prevalent in all countries, so evaluating health systems is vital to understanding this issue. Accordingly, the aim of this work is to develop a multi-criteria decision analysis (MCDA) approach to innovatively rank nine European health systems with Beveridgian financing, helping to determine the shortcomings of the Portuguese National Health Service and discern operating "best practices", in close interaction with the Portuguese Ministry of Health. First, the panel of decision-making actors was guided through the design of a cognitive map to promote their learning process and help them identify eleven fundamental points of view. Second, the operationalisation of these points of view was facilitated by selecting acceptable descriptors of performance. Finally, an MCDA approach based on an additive model was proposed to evaluate the chosen health systems, including a sensitivity and a robustness analysis. In the end, the panel perceived the model as trustworthy and reliable. This framework can be used for further MCDA modelling in similar applications based on participatory procedures.
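The additive model mentioned in the abstract aggregates partial value scores on each point of view into one overall score per health system. A minimal sketch of that aggregation follows; the criteria names, weights, and scores are purely illustrative, not taken from the study.

```python
# Illustrative sketch of an additive MCDA value model:
# V(a) = sum_i w_i * v_i(a), with the weights w_i summing to 1.
# Criteria, weights, and partial scores below are hypothetical examples.

def additive_value(weights, partial_scores):
    """Aggregate partial value scores into an overall score."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    return sum(weights[c] * partial_scores[c] for c in weights)

# Hypothetical subset of criteria (the study used eleven points of view)
weights = {"access": 0.4, "quality": 0.35, "efficiency": 0.25}
system_a = {"access": 80.0, "quality": 65.0, "efficiency": 70.0}
print(additive_value(weights, system_a))  # ~72.25
```

Ranking the nine systems then amounts to sorting them by this overall value; the study's sensitivity analysis varies the weights to check how stable that ranking is.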
2020
Authors
Almeida, R; Pinho, B; Jacome, C; Teixeira, JF; Amaral, R; Goncalves, I; Lopes, F; Pinheiro, AC; Jacinto, T; Paixao, C; Pereira, M; Marques, A; Fonseca, JA;
Publication
XV MEDITERRANEAN CONFERENCE ON MEDICAL AND BIOLOGICAL ENGINEERING AND COMPUTING - MEDICON 2019
Abstract
Evaluation of lung function is central to the management of chronic obstructive respiratory diseases. It is typically evaluated with a spirometer by a specialized health professional, who ensures the correct execution of a forced expiratory manoeuvre (FEM). An audio recording of a FEM, captured with a smart device's embedded microphone, can be used to self-monitor lung function between clinical visits. The challenge of microphone spirometry is to ensure the validity and reliability of the FEM in the absence of a health professional. In particular, the absence of a mouthpiece may allow excessive mouth closure, leading to an incorrect manoeuvre. In this work, a strategy to automatically assess the correct execution of the FEM is proposed and validated. Using 498 FEM recordings, both specificity and sensitivity were above 90%. The method provides immediate feedback to the user by grading the manoeuvre on a visual scale, prompting repetition of the FEM when needed.
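The abstract reports the classifier's quality as sensitivity and specificity. As a reminder of how those two figures are computed from a binary confusion matrix, here is a minimal sketch; the counts below are illustrative only, not the study's data.

```python
def sensitivity_specificity(tp, fp, tn, fn):
    """Compute (sensitivity, specificity) from confusion-matrix counts.

    sensitivity = TP / (TP + FN): fraction of incorrect manoeuvres caught
    specificity = TN / (TN + FP): fraction of correct manoeuvres accepted
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts over a set of recordings (not the study's 498)
sens, spec = sensitivity_specificity(tp=90, fp=8, tn=92, fn=10)
print(sens, spec)  # 0.9 0.92
```

Both figures exceeding 90%, as the study reports, means the grading rarely misses an invalid manoeuvre while seldom rejecting a valid one.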
2020
Authors
de Aguiar, ASP; dos Santos, FBN; dos Santos, LCF; Filipe, VMD; de Sousa, AJM;
Publication
COMPUTERS AND ELECTRONICS IN AGRICULTURE
Abstract
Research and development in mobile robotics is continuously growing. The ability of a human-made machine to navigate safely in a given environment is a challenging task. In agricultural environments, robot navigation can reach high levels of complexity due to the harsh conditions these environments present. Thus, a reliable map in which the robot can localize itself is crucial, and feature extraction becomes a vital step of the navigation process. In this work, the feature extraction problem in the vineyard context is solved using Deep Learning to detect high-level features - the vine trunks. An experimental performance benchmark between two devices is performed: NVIDIA's Jetson Nano and Google's USB Accelerator. Several models were retrained and deployed on both devices using a Transfer Learning approach. Specifically, MobileNets, Inception, and a lite version of You Only Look Once (YOLO) are used to detect vine trunks in real time. The models were retrained on a publicly available dataset built in-house, containing approximately 1600 annotated vine trunks in 336 images. Results show that NVIDIA's Jetson Nano is compatible with a wider variety of Deep Learning architectures, while Google's USB Accelerator is limited to a single family of architectures for object detection. On the other hand, the Google device achieved a higher overall average precision than the Jetson Nano, with better runtime performance. The best result obtained in this work was an average precision of 52.98% with a runtime of 23.14 ms per image, for MobileNet-V2. Recent experiments showed that the detectors are suitable for use in the Localization and Mapping context.
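The average-precision figure quoted above depends on matching predicted trunk boxes to ground-truth annotations, conventionally via intersection-over-union (IoU). A minimal sketch of that matching criterion follows; the box format and the 0.5 threshold are the common convention, not details confirmed by the paper.

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A predicted trunk box typically counts as a true positive when IoU >= 0.5
print(iou((0, 0, 10, 10), (5, 0, 15, 10)))  # ~0.333: overlap 50, union 150
```

Precision/recall curves built from these true-positive decisions, averaged over recall levels, yield the average precision used to compare the two accelerator devices.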
2020
Authors
Silva, W; Pollinger, A; Cardoso, JS; Reyes, M;
Publication
MICCAI (1)
Abstract
When encountering a dubious diagnostic case, radiologists typically search in public or internal databases for similar cases that would help them in their decision-making process. This search represents a massive burden to their workflow, as it considerably reduces their time to diagnose new cases. It is, therefore, of the utmost importance to replace this manual, intensive search with an automatic content-based image retrieval system. However, general content-based image retrieval systems are often not helpful in the context of medical imaging, since they do not consider the fact that relevant information in medical images is typically spatially constricted. In this work, we explore the use of interpretability methods to localize relevant regions of images, leading to more focused feature representations and, therefore, to improved medical image retrieval. As a proof of concept, experiments were conducted using a publicly available chest X-ray dataset, with results showing that the proposed interpretability-guided image retrieval captures the similarity measure of an experienced radiologist better than state-of-the-art image retrieval methods. Furthermore, it also improves the class-consistency of the top retrieved results and enhances the interpretability of the whole system by accompanying the retrieval with visual explanations.
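Content-based retrieval of this kind ultimately ranks database cases by the similarity of their feature representations to the query's. A minimal sketch of that ranking step with cosine similarity follows; the vectors, case names, and choice of cosine similarity are illustrative assumptions, not details from the paper.

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def retrieve(query, database, k=3):
    """Return the names of the k database cases most similar to the query."""
    ranked = sorted(database.items(),
                    key=lambda item: cosine_similarity(query, item[1]),
                    reverse=True)
    return [name for name, _ in ranked[:k]]

# Hypothetical 2-D feature vectors standing in for learned representations
cases = {"case_a": (1.0, 0.1), "case_b": (0.0, 1.0), "case_c": (0.9, 0.2)}
print(retrieve((1.0, 0.0), cases, k=2))  # case_a and case_c rank highest
```

In the interpretability-guided variant described above, the feature vectors would be computed from the regions an interpretability method highlights as relevant, rather than from the whole image.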