2019
Authors
Simas, EF; Prates, RM; Ramos, RP; Cardoso, JS;
Publication
2019 27TH EUROPEAN SIGNAL PROCESSING CONFERENCE (EUSIPCO)
Abstract
Overhead power distribution lines include a wide range of insulator components, which have different shapes and types of building materials. These components are usually exposed to weather and operational conditions that may cause deviations in their shape, color, or texture. Such changes can hinder the development of automatic systems for visual inspection. In this perspective, this work presents a robust image classification methodology that aims at efficient identification of distribution insulator classes, regardless of their degradation level. The work comprises the following steps: implementation of a Convolutional Neural Network (CNN); transfer learning; attribute vector acquisition; and design of hybrid classifier architectures to improve discrimination efficiency. In summary, a previously trained CNN goes through a fine-tuning stage and is later used as a feature extractor for training a new set of classifiers. A comparative study was conducted to identify which classifier architecture achieved the best discrimination performance for non-conforming components. The proposed methodology showed a significant improvement in classification performance, obtaining 95% overall accuracy in the identification of non-conforming component classes.
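The two-stage pipeline the abstract describes — a fine-tuned CNN reused as a frozen feature extractor, with a separate classifier trained on the extracted attribute vectors — can be sketched as follows. This is a minimal illustration, not the paper's method: the feature extractor here is a stand-in random projection, and the downstream classifier is a plain logistic regression, since the abstract does not specify the actual architectures.

```python
import numpy as np

rng = np.random.default_rng(0)

def cnn_features(images):
    """Stand-in for a fine-tuned CNN used as a frozen feature extractor.
    Here: a fixed nonlinear random projection to a 16-D attribute vector."""
    W = np.random.default_rng(42).normal(size=(images.shape[1], 16))
    return np.tanh(images @ W)

def train_logistic(X, y, lr=0.5, epochs=300):
    """Downstream classifier trained on the extracted features
    (a simple logistic regression via gradient descent)."""
    w = np.zeros(X.shape[1]); b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
        g = p - y                                # gradient of the log-loss
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

# toy data: two "component" classes as flattened 64-pixel images
n, d = 200, 64
images = np.vstack([rng.normal(-0.5, 0.3, size=(n, d)),
                    rng.normal(+0.5, 0.3, size=(n, d))])
labels = np.array([0] * n + [1] * n)

feats = cnn_features(images)           # step 1: extract attribute vectors
w, b = train_logistic(feats, labels)   # step 2: train the new classifier
pred = (feats @ w + b > 0).astype(int)
accuracy = (pred == labels).mean()
```

The key design point mirrored here is that the feature extractor is frozen after fine-tuning, so different downstream classifiers can be compared on the same attribute vectors.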
2017
Authors
Carneiro, G; Tavares, JMRS; Bradley, A; Papa, JP; Nascimento, JC; Cardoso, JS; Belagiannis, V; Lu, Z;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
2019
Authors
Ferreira, PM; Sequeira, AF; Pernes, D; Rebelo, A; Cardoso, JS;
Publication
2019 INTERNATIONAL CONFERENCE OF THE BIOMETRICS SPECIAL INTEREST GROUP (BIOSIG 2019)
Abstract
Despite the high performance of current presentation attack detection (PAD) methods, robustness to unseen attacks is still an under-addressed challenge. This work approaches the problem by enforcing the learning of the bona fide presentations while making the model less dependent on the presentation attack instrument species (PAIS). The proposed model comprises an encoder, mapping from input features to latent representations, and two classifiers operating on these underlying representations: (i) the task-classifier, for predicting the class labels (as bona fide or attack); and (ii) the species-classifier, for predicting the PAIS. In the learning stage, the encoder is trained to help the task-classifier while trying to fool the species-classifier. In addition, a training objective enforcing the similarity of the latent distributions of different species is added, leading to a PAI-species-independent model. The experimental results demonstrated that the proposed regularisation strategies equipped the neural network with increased PAD robustness. The adversarial model obtained better loss and accuracy as well as improved error rates in the detection of attack and bona fide presentations.
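The adversarial objective described above — the encoder helps the task-classifier while fooling the species-classifier — can be written as a combined loss in which the species term enters with a negative sign for the encoder. The sketch below only illustrates that loss arithmetic on toy predicted distributions; the weight `lam` and the probability values are hypothetical, and the abstract's latent-distribution similarity term is omitted for brevity.

```python
import numpy as np

def cross_entropy(probs, labels):
    """Mean negative log-likelihood of the true labels."""
    return -np.log(probs[np.arange(len(labels)), labels]).mean()

# toy predicted distributions for two samples
task_probs = np.array([[0.9, 0.1],            # bona fide vs attack
                       [0.2, 0.8]])
species_probs = np.array([[0.5, 0.3, 0.2],    # three PAI species
                          [0.4, 0.4, 0.2]])
task_labels = np.array([0, 1])
species_labels = np.array([1, 0])

lam = 0.5  # adversarial weight (hypothetical)
L_task = cross_entropy(task_probs, task_labels)
L_species = cross_entropy(species_probs, species_labels)
# encoder objective: minimize the task loss while MAXIMIZING the
# species loss, i.e. make the PAIS unrecoverable from the latents
L_encoder = L_task - lam * L_species
```

In practice this sign flip is usually realized with a gradient-reversal layer between the encoder and the species-classifier, so both branches can be trained jointly by standard backpropagation.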
2019
Authors
Sousa, J; Rebelo, A; Cardoso, JS;
Publication
Proceedings - 15th Workshop of Computer Vision, WVC 2019
Abstract
The importance of recycling is well known, for both environmental and economic reasons; it is impossible to escape it, and the industry demands efficiency. Manual labour and traditional industrial sorting techniques are not capable of keeping up with the objectives demanded by the international community. Solutions based on computer vision techniques have the potential to automate part of the waste handling tasks. In this paper, we propose a hierarchical deep learning approach for waste detection and classification in food trays. The proposed two-step approach retains the advantages of recent object detectors (such as Faster R-CNN) and allows the classification task to be supported on higher-resolution bounding boxes. Additionally, we also collect, annotate and make available to the scientific community a new dataset, named Labeled Waste in the Wild, for research and benchmark purposes. In the experimental comparison with standard deep learning approaches, the proposed hierarchical model shows better detection and classification performance. © 2019 IEEE.
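The two-step structure — detect on a downscaled image, then classify each object on its full-resolution crop — can be sketched as below. Both the detector and the classifier are dummy stand-ins (the paper uses learned models such as Faster R-CNN); only the plumbing of mapping normalized boxes back to the high-resolution image is the point here.

```python
import numpy as np

def detect(image_lowres):
    """Stand-in detector (a Faster R-CNN in the paper): returns
    bounding boxes as (x, y, w, h) in normalized [0, 1] coordinates."""
    return [(0.1, 0.1, 0.3, 0.3), (0.55, 0.5, 0.4, 0.4)]

def classify(crop):
    """Stand-in second-stage classifier on the high-resolution crop;
    here just a dummy rule on mean intensity."""
    return "plastic" if crop.mean() > 0.5 else "organic"

def hierarchical_pipeline(image_highres, image_lowres):
    results = []
    H, W = image_highres.shape[:2]
    for (x, y, w, h) in detect(image_lowres):
        # map the normalized box back to the full-resolution image, so
        # classification is supported on a higher-resolution crop
        crop = image_highres[int(y * H):int((y + h) * H),
                             int(x * W):int((x + w) * W)]
        results.append(classify(crop))
    return results

hi = np.zeros((400, 400)); hi[200:, 200:] = 1.0  # bright object, bottom-right
lo = hi[::4, ::4]                                 # low-res copy for the detector
labels = hierarchical_pipeline(hi, lo)
```

The design choice this mirrors: the detector can run cheaply at low resolution, while fine-grained classification still sees the original pixels inside each box.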
2019
Authors
Carneiro, G; Tavares, JMRS; Bradley, AP; Papa, JP; Nascimento, JC; Cardoso, JS; Lu, Z; Belagiannis, V;
Publication
COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING-IMAGING AND VISUALIZATION
Abstract
2017
Authors
Rosado, L; Oliveira, J; Vasconcelos, MJM; da Costa, JMC; Elias, D; Cardoso, JS;
Publication
PROCEEDINGS OF THE 10TH INTERNATIONAL JOINT CONFERENCE ON BIOMEDICAL ENGINEERING SYSTEMS AND TECHNOLOGIES, VOL 1: BIODEVICES
Abstract
Microscopic examination is currently the gold standard test for the diagnosis of several neglected tropical diseases. However, reliable identification of parasitic infections requires in-depth training and access to proper equipment for subsequent microscopic analysis. These requirements are closely related to the increasing interest in the development of computer-aided diagnosis systems, and Mobile Health is starting to play an important role when it comes to health in Africa, allowing for distributed solutions that provide access to complex diagnosis even in rural areas. In this paper, we present a 3D-printed microscope that can easily be attached to a wide range of mobile device models. To the best of our knowledge, this is the first proposed smartphone-based alternative to conventional microscopy that allows autonomous acquisition of a pre-defined number of images at 1000x magnification with suitable resolution, by using a motorized automated stage fully powered and controlled by a smartphone, without the need for manual focusing of the smear slide. Reference smear slides with different parasites were used to test the device. The acquired images showed that it was possible to visually detect those agents, which clearly illustrates the potential of this device, especially in developing countries with limited access to healthcare services.
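The abstract does not describe how the device focuses autonomously, but a common approach for a motorized stage (an assumption here, not the paper's stated method) is to sweep stage positions and keep the frame that maximizes a sharpness metric such as the variance of local image gradients. The sketch below simulates that sweep, with defocus mimicked by a repeated box blur.

```python
import numpy as np

rng = np.random.default_rng(1)
scene = rng.random((64, 64))  # synthetic high-detail specimen

def sharpness(img):
    """Focus metric: variance of finite-difference gradients.
    In-focus frames have stronger local gradients, hence higher scores."""
    gx = np.diff(img, axis=1)
    gy = np.diff(img, axis=0)
    return gx.var() + gy.var()

def simulate_frame(scene, k):
    """Mimic defocus at stage distance k via k passes of a 3-tap box blur
    (k = 0 means the in-focus frame, returned unchanged)."""
    out = scene.astype(float)
    kernel = np.ones(3) / 3
    for _ in range(k):
        out = np.apply_along_axis(np.convolve, 1, out, kernel, mode="same")
        out = np.apply_along_axis(np.convolve, 0, out, kernel, mode="same")
    return out

def autofocus(positions, focal):
    """Sweep the (simulated) motorized stage and pick the sharpest frame."""
    frames = [simulate_frame(scene, abs(p - focal)) for p in positions]
    scores = [sharpness(f) for f in frames]
    return positions[int(np.argmax(scores))]

best = autofocus(positions=[0, 1, 2, 3, 4], focal=2)
```

On a real device the same loop would command the stage motor and grab camera frames instead of simulating them; the sharpness-maximization logic is unchanged.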