Details
Name
António Valente
Position
Senior Researcher
Since
01 June 2012
Nationality
Portugal
Centre
Centro de Robótica Industrial e Sistemas Inteligentes
Contacts
+351220413317
antonio.valente@inesctec.pt
2024
Authors
Ribeiro, J; Pinheiro, R; Soares, S; Valente, A; Amorim, V; Filipe, V
Publication
Lecture Notes in Mechanical Engineering
Abstract
The manual monitoring of refilling stations in industrial environments can lead to inefficiencies and errors, which can impact the overall performance of the production line. In this paper, we present an unsupervised detection pipeline for identifying refilling stations in industrial environments. The proposed pipeline uses a combination of image processing, pattern recognition, and deep learning techniques to detect refilling stations in visual data. We evaluate our method on a set of industrial images, and the findings demonstrate that the pipeline is reliable at detecting refilling stations. Furthermore, the proposed pipeline can automate the monitoring of refilling stations, eliminating the need for manual monitoring and thus improving industrial operations’ efficiency and responsiveness. This method is a versatile solution that can be applied to different industrial contexts without the need for labeled data or prior knowledge about the location of refilling stations. © 2024, The Author(s), under exclusive license to Springer Nature Switzerland AG.
2024
Authors
Kalbermatter, RB; Franco, T; Pereira, AI; Valente, A; Soares, SP; Lima, J
Publication
Communications in Computer and Information Science - Optimization, Learning Algorithms and Applications
Abstract
2023
Authors
Matos, D; Lima, J; Rohrich, R; Oliveira, A; Valente, A; Costa, P; Costa, P
Publication
ROBOTICS IN NATURAL SETTINGS, CLAWAR 2022
Abstract
Simulators have been increasingly used for development and testing in several areas. They make it possible to speed up development without damage or extra costs. In realistic simulators, where kinematics play an important role, an accurate model of each component must be imported for it to be simulated correctly. Some robots, such as the Monera, have not yet been modelled. This paper presents a model of a small vibration robot (the Monera) that is acquired on a purpose-built test bed. A localisation ground-truth system is used to acquire the position of the Monera while actuating it. Linear and angular speeds acquired from real experiments validate the proposed methodology.
2023
Authors
Pinheiro, I; Aguiar, A; Figueiredo, A; Pinho, T; Valente, A; Santos, F
Publication
APPLIED SCIENCES-BASEL
Abstract
Currently, Unmanned Aerial Vehicles (UAVs) are considered in the development of various applications in agriculture, which has led to the expansion of the agricultural UAV market. However, Nano Aerial Vehicles (NAVs) are still underutilised in agriculture. NAVs are characterised by a maximum wing length of 15 centimetres and a weight of less than 50 g. Due to their physical characteristics, NAVs have the advantage of being able to approach and perform tasks with more precision than conventional UAVs, making them suitable for precision agriculture. This work aims to contribute an open-source solution known as the Nano Aerial Bee (NAB) to enable further research and development on the use of NAVs in an agricultural context. The purpose of the NAB is to mimic and assist bees in the context of pollination. We designed this open-source solution by taking into account existing state-of-the-art solutions and the requirements of pollination activities. This paper presents the relevant background and the work carried out in this area by analysing papers on the topic of NAVs. The development of this prototype is rather complex given the interactions between the different hardware components and the need to achieve autonomous flight capable of pollination. We describe and discuss these challenges in this work. Besides the open-source NAB solution, we train three different versions of YOLO (YOLOv5, YOLOv7, and YOLOR) on an original dataset (the Flower Detection Dataset), containing 206 images of a group of eight flowers, and on a public dataset (the TensorFlow Flower Dataset), which had to be annotated for detection (yielding the TensorFlow Flower Detection Dataset). The models trained on the Flower Detection Dataset achieve satisfactory results, with YOLOv7 and YOLOR performing best at 98% precision, 99% recall, and 98% F1 score. The performance of these models is then evaluated on the TensorFlow Flower Detection Dataset to test their robustness. The three YOLO models are also trained on the TensorFlow Flower Detection Dataset to better understand the results. In this case, YOLOR obtains the most promising results, with 84% precision, 80% recall, and 82% F1 score. The results obtained with the Flower Detection Dataset are used for NAB guidance, detecting the relative position in an image, which defines the command the NAB executes.
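The precision, recall, and F1 figures reported above are related by the standard F1 definition (the harmonic mean of precision and recall); a minimal sketch checking the abstract's numbers:

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Flower Detection Dataset results reported for YOLOv7 / YOLOR:
# 98% precision, 99% recall -> 98% F1
print(round(f1_score(0.98, 0.99), 2))  # → 0.98

# TensorFlow Flower Detection Dataset results reported for YOLOR:
# 84% precision, 80% recall -> 82% F1
print(round(f1_score(0.84, 0.80), 2))  # → 0.82
```

Both reported F1 scores are consistent with the stated precision and recall values.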
2023
Authors
Matos, D; Mendes, J; Lima, J; Pereira, AI; Valente, A; Soares, S; Costa, P; Costa, P
Publication
ROBOTICS IN NATURAL SETTINGS, CLAWAR 2022
Abstract
Navigation is one of the most important tasks for a mobile robot, and localisation is one of its main requirements. There are several types of localisation solutions, such as LiDAR, radio-frequency, and acoustic systems, among others. The well-known line follower has long been in use and still finds application, especially in competitions designed to attract young researchers to scientific and technological areas. This paper describes and compares two methodologies for estimating the position of a robot placed on a gradient line. Least Squares and Machine Learning methods are used, and results obtained with a real robot validate the proposed approach.
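The abstract does not give the paper's exact formulation, but the least-squares idea can be sketched as follows, assuming (hypothetically) that the line sensor's reading varies linearly with the robot's position along the gradient; the calibration values below are illustrative only:

```python
def least_squares_fit(xs, ys):
    """Ordinary least squares for y = a*x + b (closed-form normal equations)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Hypothetical calibration: sensor intensity measured at known positions (cm)
positions = [0.0, 1.0, 2.0, 3.0, 4.0]
readings = [10.0, 30.0, 50.0, 70.0, 90.0]  # gradient line, brighter to one side

# Fit reading = a*position + b, then invert to estimate position from a reading
a, b = least_squares_fit(positions, readings)
estimate = lambda reading: (reading - b) / a
print(estimate(40.0))  # → 1.5 (cm) for this synthetic calibration
```

In practice the fit would be computed from real sensor sweeps, and the same data could equally feed the Machine Learning estimator the paper compares against.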
Supervised theses
2022
Author
Beatriz Almeida Miranda
Institution
UP-FEUP
2020
Author
Isabel Maria Luís Machado
Institution
UTAD
2020
Author
Luís Carlos Feliz Santos
Institution
UTAD
2018
Author
Guilherme Moreira Aresta
Institution
UP-FEUP
2018
Author
Alda Almendra Henriques
Institution
UP-FEUP