
Publications by Hugo Miguel Silva

2021

Hyperspectral Imaging System for Marine Litter Detection

Authors
Freitas, S; Silva, H; Almeida, C; Viegas, D; Amaral, A; Santos, T; Dias, A; Jorge, PAS; Pham, CK; Moutinho, J; Silva, E;

Publication
OCEANS 2021: SAN DIEGO - PORTO

Abstract
This work addresses the use of hyperspectral imaging systems for remote detection of marine litter concentrations in oceanic environments. The work consisted of mounting an off-the-shelf hyperspectral imaging system (400-2500 nm) on two aerial platforms, one manned and one unmanned, and performing data acquisition to develop AI methods capable of detecting marine litter concentrations at the water surface. We performed the campaigns at Porto Pim Bay, Faial Island, Azores, using artificial targets built from marine litter samples. During this work, we also developed a 3D Convolutional Neural Network (CNN-3D) that uses spatial and spectral information to evaluate deep learning methods for detecting marine litter in an automated manner. Results show over 84% overall accuracy (OA) in the detection and classification of the different types of marine litter samples present in the artificial targets.
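
As a rough illustration of the approach described above, the sketch below shows how a small 3D convolutional network can consume hyperspectral patches (spectral x spatial cubes) and emit per-patch class scores. The layer sizes, patch dimensions, and class count are illustrative assumptions, not the CNN-3D reported in the paper.

```python
# Hedged sketch of a 3D CNN over hyperspectral patches (assumed: 9x9-pixel
# patches with 100 spectral bands, 5 material classes). Not the paper's model.
import torch
import torch.nn as nn

class HyperspectralCNN3D(nn.Module):
    def __init__(self, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            # Input layout: (batch, 1, bands, height, width)
            nn.Conv3d(1, 8, kernel_size=(7, 3, 3), padding=(3, 1, 1)),
            nn.ReLU(),
            nn.MaxPool3d(kernel_size=(2, 1, 1)),   # pool along the spectral axis
            nn.Conv3d(8, 16, kernel_size=(5, 3, 3), padding=(2, 1, 1)),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),               # collapse spectral + spatial dims
        )
        self.classifier = nn.Linear(16, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

patches = torch.randn(4, 1, 100, 9, 9)             # 4 synthetic patches
logits = HyperspectralCNN3D()(patches)             # shape: (4, 5)
```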

2021

COLLECTION AND LIFE SUPPORT IN A HYPERBARIC SYSTEM FOR DEEP-SEA ORGANISMS

Authors
Viegas, D; Figueiredo, A; Coimbra, J; Dos Santos, A; Almeida, J; Dias, N; Lima, L; Silva, H; Ferreira, H; Almeida, C; Amaro, T; Arenas, F; Castro, F; Santos, M; Martins, A; Silva, E;

Publication
OCEANS 2021: SAN DIEGO - PORTO

Abstract
This paper presents the development of a hyperbaric system able to collect, transport, and maintain deep-sea species in controlled conditions from the sea floor up to the surface (the HiperSea system). The system is composed of two chambers coupled with a transference set-up. The first chamber can reach a maximum depth of 1 km, collecting both benthic and pelagic deep-sea species. The second chamber is a life-support compartment that keeps the specimens alive at the surface under hyperbaric conditions.

2022

Hyperspectral Imaging Zero-Shot Learning for Remote Marine Litter Detection and Classification

Authors
Freitas, S; Silva, H; Silva, E;

Publication
REMOTE SENSING

Abstract
This paper addresses the development of a novel zero-shot learning method for classifying remote marine litter hyperspectral imaging data. The work consisted of using an airborne marine litter hyperspectral imaging dataset containing different plastic targets and other materials, and assessing the viability of detecting and classifying plastic materials in an unsupervised manner, without knowing their exact spectral response. The marine litter samples were divided into known and unknown classes, i.e., classes that were hidden from the dataset during the training phase. The results show automated marine litter detection for all classes, with a precision above 56% in the worst case for an unknown class and an overall accuracy of 98.71%.
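
To give a concrete, if simplified, flavour of classifying spectra into known and unknown classes, the sketch below matches each pixel spectrum against prototypes of the known classes and flags low-similarity pixels as unknown. The cosine-similarity criterion, the threshold, and the array shapes are assumptions for illustration; this is open-set prototype matching, not the zero-shot method of the paper.

```python
# Toy open-set spectral matching: known-class prototypes plus an "unknown" bin.
# Prototype data, threshold and shapes are placeholders, not the paper's method.
import numpy as np

def classify_spectra(pixels, prototypes, threshold=0.9):
    """pixels: (N, bands) spectra; prototypes: (K, bands) known-class means."""
    p = pixels / np.linalg.norm(pixels, axis=1, keepdims=True)
    q = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = p @ q.T                              # cosine similarity, shape (N, K)
    labels = sim.argmax(axis=1)                # best-matching known class
    labels[sim.max(axis=1) < threshold] = -1   # -1 marks an unknown material
    return labels

rng = np.random.default_rng(0)
prototypes = rng.random((3, 100))              # 3 known classes, 100 bands
pixels = rng.random((10, 100))                 # 10 pixel spectra to classify
print(classify_spectra(pixels, prototypes))
```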

2022

Feedfirst: Intelligent monitoring system for indoor aquaculture tanks

Authors
Teixeira, B; Lima, AP; Pinho, C; Viegas, D; Dias, N; Silva, H; Almeida, J;

Publication
2022 OCEANS HAMPTON ROADS

Abstract
The Feedfirst Intelligent Monitoring System is a novel tool for intelligent monitoring of fish nurseries in aquaculture scenarios, focusing on three essential items: water quality control, biomass estimation, and automated feeding. The system uses machine vision techniques to estimate the fish larvae population size, and larvae biomass is estimated through size measurement. We also show that the perception-actuation loop in automated fish tanks can be closed by using the vision system output to influence feeding procedures. The proposed solution was tested in a real tank in an aquaculture setting, with real-time performance and logging capabilities.
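
As a hedged sketch of the kind of machine-vision step such a system might use for population-size and length estimation, the snippet below segments a top-down tank image and counts and measures the resulting blobs. The thresholds and the pixel-to-millimetre scale are placeholder assumptions, not Feedfirst parameters.

```python
# Illustrative blob counting/sizing on a tank frame; all constants are placeholders.
import cv2

def count_and_measure(frame_bgr, min_area_px=20, mm_per_px=0.1):
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Otsu threshold: separate dark larvae from the brighter tank background.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    lengths_mm = []
    for c in contours:
        if cv2.contourArea(c) < min_area_px:   # skip noise specks
            continue
        _, (w, h), _ = cv2.minAreaRect(c)      # rotated bounding box of one larva
        lengths_mm.append(max(w, h) * mm_per_px)
    return len(lengths_mm), lengths_mm         # population count, body lengths

# count, lengths = count_and_measure(cv2.imread("tank_frame.png"))
```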

2007

A Real Time Vision System for Autonomous Systems: Characterization during a Middle Size Match

Authors
Silva, H; Almeida, JM; Lima, L; Martins, A; da Silva, EP;

Publication
RoboCup 2007: Robot Soccer World Cup XI, July 9-10, 2007, Atlanta, GA, USA

Abstract

2007

LSAVISION a framework for real time vision mobile robotics

Authors
Silva, H; Almeida, JM; Lima, L; Martins, A; Silva, EP; Patacho, A;

Publication
COMPUTATIONAL MODELLING OF OBJECTS REPRESENTED IN IMAGES: FUNDAMENTALS, METHODS AND APPLICATIONS

Abstract
This paper proposes a real-time vision architecture for mobile robotics and describes a current implementation characterised by low computational cost, low latency, low power consumption, high modularity, configurability, adaptability, and scalability. A pipeline structure further reduces latency and allows a parallel hardware implementation. A dedicated hardware vision sensor was developed to take advantage of the proposed architecture. A new method based on run-length encoding (RLE) of colour transitions allows real-time edge determination at low computational cost. The real-time characteristics and partial hardware implementation, coupled with low energy consumption, address typical autonomous-systems applications.
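
The RLE colour-transition idea can be illustrated with a short sketch: compress a scanline of per-pixel colour-class labels into runs, then treat the boundaries between runs as candidate edges. The class labels and scanline below are invented for the example; this shows the general idea, not the LSAVISION implementation.

```python
# Toy RLE colour-transition edge detection on a single scanline.
def rle_scanline(labels):
    """Compress per-pixel class labels into (label, start, length) runs."""
    runs, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            runs.append((labels[start], start, i - start))
            start = i
    return runs

def colour_transition_edges(runs):
    """Every run boundary is a colour transition (RLE merges equal neighbours)."""
    return [(runs[k][1] + runs[k][2],          # pixel index of the transition
             runs[k][0], runs[k + 1][0])       # colour classes on either side
            for k in range(len(runs) - 1)]

scanline = ["green"] * 12 + ["white"] * 3 + ["green"] * 10 + ["orange"] * 5
print(colour_transition_edges(rle_scanline(scanline)))
# [(12, 'green', 'white'), (15, 'white', 'green'), (25, 'green', 'orange')]
```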
