2021
Authors
Pereira, MI; Claro, RM; Leite, PN; Pinto, AM;
Publication
IEEE ACCESS
Abstract
The automation of processes that typically require human intelligence and decision-making in the maritime industry leads to fewer accidents and more cost-effective operations. However, many challenges must still be solved before fully autonomous systems can be deployed. Artificial Intelligence (AI) has played a major role in this paradigm shift and shows great potential for solving some of these challenges, such as the docking process of an autonomous vessel. This work proposes a lightweight volumetric Convolutional Neural Network (vCNN) capable of recognizing different docking structures from 3D data in real time. A synthetic-to-real domain adaptation approach is also proposed to accelerate the training of the vCNN, greatly decreasing the cost of data acquisition and the need for advanced computational resources. Extensive experiments demonstrate an accuracy of over 90% in the recognition of different docking structures using low-resolution sensors, with an average inference time of about 120 ms. Results obtained with a real Autonomous Surface Vehicle (ASV) demonstrate that the vCNN trained with the synthetic-to-real domain adaptation approach is suitable for maritime mobile robots. This novel AI-based recognition method, combined with the use of 3D data, increases the robustness of the docking process to environmental constraints such as rain, fog, and insufficient lighting in nighttime operations.
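The paper itself does not include code; for illustration, a minimal PyTorch sketch of a lightweight volumetric CNN classifying voxelized 3D scans might look like the following. The occupancy-grid resolution, channel widths, and number of docking-structure classes are assumptions, not the authors' exact architecture.

```python
import torch
import torch.nn as nn

class LightweightVCNN(nn.Module):
    """Minimal volumetric CNN over a binary occupancy grid.

    Assumes a 32x32x32 voxelization of the LiDAR point cloud; grid size,
    channel widths and class count are illustrative, not the paper's design.
    """
    def __init__(self, num_classes: int = 4):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),  # 32 -> 16
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),  # 16 -> 8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 8 * 8 * 8, 128),
            nn.ReLU(inplace=True),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

# Example: one voxelized scan -> class scores over docking structures.
grid = torch.zeros(1, 1, 32, 32, 32)  # batch, channel, depth, height, width
logits = LightweightVCNN()(grid)
print(logits.shape)  # torch.Size([1, 4])
```

Keeping the network this shallow is what makes real-time inference on a vessel's embedded hardware plausible, which is consistent with the reported ~120 ms average inference time.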
2020
Authors
Claro, R; Silva, R; Pinto, A;
Publication
GLOBAL OCEANS 2020: SINGAPORE - U.S. GULF COAST
Abstract
This paper presents an algorithm for mapping monopiles in Offshore Wind Farms (OWF). An Autonomous Surface Vehicle (ASV) surveys the environment, detecting and localizing monopiles using a situational awareness system based on LiDAR, GPS, and IMU (Inertial Measurement Unit) data. The position of each monopile is obtained from the relative localization between the ASV and the extrapolated center of the detected structure. A positive detection is then referenced to a global positioning frame based on GPS. Results in a simulated environment demonstrate the ability of this situational awareness system to identify monopiles with a precision of 0.005 m, which is relevant for detecting structural misalignments over time that might be caused by scour at the structure's foundation.
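The abstract does not specify how the monopile center is extrapolated; a least-squares (Kasa) circle fit to the horizontal slice of LiDAR returns is one plausible approach for a cylindrical structure seen from one side. The sketch below, including the example geometry and the GPS offset step, is an assumption rather than the authors' method.

```python
import numpy as np

def fit_monopile_center(points_xy: np.ndarray):
    """Least-squares (Kasa) circle fit to LiDAR hits on a monopile slice.

    Solves x^2 + y^2 = 2*cx*x + 2*cy*y + c, where c = r^2 - cx^2 - cy^2.
    Returns the extrapolated center (cx, cy) and the radius r.
    """
    x, y = points_xy[:, 0], points_xy[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x**2 + y**2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    radius = np.sqrt(c + cx**2 + cy**2)
    return np.array([cx, cy]), radius

# Example: noisy arc of a 3 m radius pile centered 10 m ahead of the ASV.
theta = np.linspace(0.75 * np.pi, 1.25 * np.pi, 50)  # only the facing side
arc = np.column_stack([10 + 3 * np.cos(theta), 3 * np.sin(theta)])
arc += np.random.normal(scale=0.005, size=arc.shape)  # LiDAR noise

center, r = fit_monopile_center(arc)
# Referencing to a global frame: add the ASV's GPS position (ENU) to the
# body-frame center estimate (IMU-based heading alignment omitted here).
asv_enu = np.array([482.0, 1250.0])  # hypothetical ASV position
print(center + asv_enu, r)
```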
2025
Authors
Neves, FSP; Branco, LM; Claro, R; Pinto, AM;
Publication
Abstract
2024
Authors
Claro, R; Neves, F; Pereira, P; Pinto, A;
Publication
Oceans Conference Record (IEEE)
Abstract
With the expansion of offshore infrastructure, the need for efficient Operation and Maintenance (O&M) procedures intensifies. This article introduces DADDI, a multimodal dataset obtained from a real offshore floating structure, aimed at facilitating comprehensive inspections and 3D model creation. Leveraging Unmanned Aerial Vehicles (UAVs) equipped with advanced sensors, DADDI provides synchronized data, including visual images, thermal images, point clouds, GNSS, IMU, and odometry data. The dataset, gathered during a campaign at the ATLANTIS Coastal Testbed, offers over 2500 samples of each data type, along with intrinsic and extrinsic sensor calibrations. DADDI serves as a vital resource for the development and evaluation of algorithms, models, and technologies tailored to the inspection, monitoring, and maintenance of complex maritime structures.
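The abstract does not describe how the modalities are indexed; a common pattern for multimodal sets like this is nearest-timestamp pairing across sensor streams. The sketch below is a generic illustration of that pattern, with the tolerance and example frame rates chosen arbitrarily, not taken from DADDI's documentation.

```python
import bisect

def nearest_sync(query_stamps, target_stamps, tol=0.05):
    """Pair each query timestamp with the nearest target timestamp
    within `tol` seconds (both inputs in seconds). The tolerance and
    the per-modality stamp lists are assumptions for illustration.
    """
    target_stamps = sorted(target_stamps)
    pairs = []
    for t in query_stamps:
        i = bisect.bisect_left(target_stamps, t)
        candidates = target_stamps[max(0, i - 1):i + 1]
        best = min(candidates, key=lambda s: abs(s - t), default=None)
        if best is not None and abs(best - t) <= tol:
            pairs.append((t, best))
    return pairs

# Example: match visual frames (~10 Hz) to thermal frames (~9 Hz).
visual = [k * 0.10 for k in range(25)]
thermal = [k * 0.11 for k in range(23)]
print(nearest_sync(visual, thermal)[:3])
```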
2025
Authors
Claro, RM; Neves, FSP; Pinto, AMG;
Publication
JOURNAL OF FIELD ROBOTICS
Abstract
The integration of precise landing capabilities into Unmanned Aerial Vehicles (UAVs) is crucial for enabling autonomous operations, particularly in challenging environments such as offshore scenarios. This work proposes a heterogeneous perception system that incorporates a multimodal fiducial marker, designed to improve the accuracy and robustness of autonomous UAV landing in both daytime and nighttime operations. It presents ViTAL-TAPE, a visual transformer-based model that enhances the detection reliability of the landing target and overcomes changes in illumination conditions and viewpoint positions where traditional methods fail. ViTAL-TAPE is an end-to-end model that combines multimodal perceptual information, including photometric and radiometric data, to detect landing targets defined by a fiducial marker with 6 degrees of freedom. Extensive experiments demonstrated the ability of ViTAL-TAPE to detect fiducial markers with an error of 0.01 m. Moreover, experiments with the RAVEN UAV, designed to endure the challenging weather conditions of offshore scenarios, showed that the autonomous landing technology proposed in this work achieves an accuracy of up to 0.1 m. This research also presents the first successful autonomous operation of a UAV on a commercial offshore wind farm with floating foundations installed in the Atlantic Ocean. These experiments showcase the system's accuracy, resilience, and robustness, resulting in a precise landing technology that extends the mission capabilities of UAVs, enabling autonomous and Beyond Visual Line of Sight (BVLOS) offshore operations.
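As a rough illustration of the "visual transformer fusing photometric and radiometric data" idea, a ViT-style model could tokenize RGB and thermal images separately and regress the marker's 6-DoF pose from a shared class token. The patch size, embedding width, depth, and single-pose regression head below are all assumptions, not the published ViTAL-TAPE specification.

```python
import torch
import torch.nn as nn

class MultimodalMarkerDetector(nn.Module):
    """Sketch of a ViT-style detector fusing visual and thermal patches
    to regress a fiducial marker's 6-DoF pose. Hyperparameters are
    illustrative, not the ViTAL-TAPE configuration.
    """
    def __init__(self, img=224, patch=16, dim=256, depth=4, heads=8):
        super().__init__()
        n = (img // patch) ** 2  # patches per modality
        self.embed_rgb = nn.Conv2d(3, dim, kernel_size=patch, stride=patch)
        self.embed_thermal = nn.Conv2d(1, dim, kernel_size=patch, stride=patch)
        self.pos = nn.Parameter(torch.zeros(1, 2 * n + 1, dim))
        self.cls = nn.Parameter(torch.zeros(1, 1, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=depth)
        self.pose_head = nn.Linear(dim, 6)  # x, y, z, roll, pitch, yaw

    def forward(self, rgb, thermal):
        tokens = torch.cat([
            self.embed_rgb(rgb).flatten(2).transpose(1, 2),          # photometric
            self.embed_thermal(thermal).flatten(2).transpose(1, 2),  # radiometric
        ], dim=1)
        tokens = torch.cat([self.cls.expand(rgb.size(0), -1, -1), tokens], dim=1)
        out = self.encoder(tokens + self.pos)
        return self.pose_head(out[:, 0])  # pose from the class token

rgb = torch.rand(1, 3, 224, 224)
thermal = torch.rand(1, 1, 224, 224)
print(MultimodalMarkerDetector()(rgb, thermal).shape)  # torch.Size([1, 6])
```

Fusing both modalities in one token sequence lets attention weigh thermal evidence when the photometric channel degrades at night, which matches the abstract's day/night robustness claim.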
2024
Authors
Pinto, AM; Matos, A; Marques, V; Campos, DF; Pereira, MI; Claro, R; Mikola, E; Formiga, J; El Mobachi, M; Stoker, J; Prevosto, J; Govindaraj, S; Ribas, D; Ridao, P; Aceto, L;
Publication
Robotics and Automation Solutions for Inspection and Maintenance in Critical Infrastructures
Abstract
This chapter presents the use of robotics in the inspection and maintenance of offshore wind farms, another highly challenging environment where autonomous robotic systems and digital transformation are proving their high value.