Publications

2020

Visual Trunk Detection Using Transfer Learning and a Deep Learning-based Coprocessor

Authors
Aguiar, AS; dos Santos, FN; Miranda de Sousa, AJM; Oliveira, PM; Santos, LC

Publication
IEEE Access

Abstract

2020

Vineyard trunk detection using deep learning – An experimental device benchmark

Authors
Pinto de Aguiar, ASP; Neves dos Santos, FBN; Feliz dos Santos, LCF; de Jesus Filipe, VMD; Miranda de Sousa, AJM

Publication
Computers and Electronics in Agriculture

Abstract

2019

Monocular Visual Odometry Benchmarking and Turn Performance Optimization

Authors
Aguiar, A; Sousa, A; dos Santos, FN; Oliveira, M

Publication
19th IEEE International Conference on Autonomous Robot Systems and Competitions, ICARSC 2019

Abstract
Developing ground robots for crop monitoring and harvesting in steep-slope vineyards is a complex challenge due to two main reasons: the harsh condition of the terrain and the unstable localization accuracy obtained with the Global Navigation Satellite System (GNSS). In this context, a reliable localization system requires accurate information that is redundant with respect to GNSS and wheel-odometry-based systems. To pursue this goal, we benchmark three well-known Visual Odometry methods on two datasets. Two of them are feature-based Visual Odometry algorithms: Libviso2 and SVO 2.0. The third is an appearance-based Visual Odometry algorithm called DSO. In monocular Visual Odometry, two main problems appear: pure rotations and scale estimation. In this paper, we focus on the first issue. To address it, we propose a Kalman Filter that fuses a single gyroscope with the output pose of monocular Visual Odometry, while continuously estimating the gyroscope's bias. In this approach, we propose a non-linear noise variation that ensures the bias estimation is not affected by the rotations produced by Visual Odometry. We compare and discuss the three unmodified methods and the three methods augmented with the proposed Kalman Filter. For the tests, two datasets are used: the public KITTI dataset and one built in-house. Results show that the additional Kalman Filter considerably improves Visual Odometry performance in rotation movements. © 2019 IEEE.
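The fusion scheme in this abstract lends itself to a compact illustration. The following minimal Python sketch is not the authors' implementation; the state layout, all noise parameters, and the rate-dependent inflation factor are assumptions. It shows a 1D Kalman Filter that fuses a gyroscope yaw rate with the yaw reported by monocular VO while continuously estimating the gyroscope bias; the measurement noise grows non-linearly with the turn rate so that VO rotation errors during fast turns do not corrupt the bias estimate, which is the idea the abstract describes.

    import numpy as np

    # Minimal sketch (assumed parameters): fuse gyro yaw rate with VO yaw.
    # State x = [yaw (rad), gyro bias (rad/s)].
    class GyroVoYawKF:
        def __init__(self, yaw0=0.0, bias0=0.0):
            self.x = np.array([yaw0, bias0])
            self.P = np.diag([1e-2, 1e-4])   # state covariance (assumed)
            self.Q = np.diag([1e-4, 1e-6])   # process noise (assumed)
            self.H = np.array([[1.0, 0.0]])  # VO observes yaw only

        def predict(self, gyro_rate, dt):
            # Propagate yaw with the bias-corrected gyro rate.
            F = np.array([[1.0, -dt],
                          [0.0, 1.0]])
            self.x = np.array([self.x[0] + (gyro_rate - self.x[1]) * dt,
                               self.x[1]])
            self.P = F @ self.P @ F.T + self.Q

        def update(self, vo_yaw, gyro_rate):
            # Hypothetical non-linear noise variation: inflate the VO
            # measurement noise when the platform turns fast, so rotation
            # errors in VO barely move the bias estimate during turns.
            R = np.array([[1e-3 * (1.0 + (10.0 * gyro_rate) ** 2)]])
            y = vo_yaw - self.H @ self.x               # innovation
            S = self.H @ self.P @ self.H.T + R
            K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
            self.x = self.x + (K @ y).ravel()
            self.P = (np.eye(2) - K @ self.H) @ self.P

At each camera frame one would call predict() with the latest gyro reading and update() with the yaw extracted from the VO pose; between frames, predict() alone integrates the gyro.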

2019

A Version of Libviso2 for Central Dioptric Omnidirectional Cameras with a Laser-Based Scale Calculation

Authors
Aguiar, A; dos Santos, FN; Santos, L; Sousa, A

Publication
Advances in Intelligent Systems and Computing - Robot 2019: Fourth Iberian Robotics Conference

Abstract

2019

Monocular Visual Odometry Using Fisheye Lens Cameras

Authors
Aguiar, A; dos Santos, FN; Santos, L; Sousa, A

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract
Developing ground robots for crop monitoring and harvesting in steep-slope vineyards is a complex challenge due to two main reasons: the harsh condition of the terrain and the unstable localization accuracy obtained with the Global Navigation Satellite System (GNSS). In this context, a reliable localization system requires accurate information that is redundant with respect to GNSS and wheel-odometry-based systems. To pursue this goal and obtain a reliable localization system for our robotic platform, we aim to extract the best possible performance from a monocular Visual Odometry method. To do so, we present a benchmark of Libviso2 using both perspective and fisheye lens cameras, studying the method's motion-estimation performance with both camera topologies in an outdoor environment. We also analyze the quality of the method's feature extraction with the two camera systems, studying the impact of the field of view and of omnidirectional image rectification on VO. We propose a general methodology to incorporate a fisheye lens camera system into a VO method. Finally, we briefly describe the robot setup used to generate the presented results. © 2019, Springer Nature Switzerland AG.
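To make the kind of methodology this abstract mentions concrete, the short Python sketch below is a hypothetical example, not the paper's pipeline: the intrinsic matrix K and the distortion coefficients D are placeholder values, not a real calibration. It uses OpenCV's fisheye camera model to rectify a fisheye frame into a perspective view, which is the sort of pre-processing that lets a perspective-only VO method such as Libviso2 consume fisheye imagery.

    import numpy as np
    import cv2

    # Placeholder calibration (assumed values, not from the paper).
    K = np.array([[285.0,   0.0, 320.0],    # fisheye intrinsics
                  [  0.0, 285.0, 240.0],
                  [  0.0,   0.0,   1.0]])
    D = np.array([0.05, -0.01, 0.001, 0.0])  # equidistant distortion coeffs

    def rectify_fisheye(img, balance=0.0):
        """Remap a fisheye image to a perspective view usable by a VO method."""
        h, w = img.shape[:2]
        # New perspective camera matrix; `balance` trades retained field of
        # view against residual distortion at the image borders.
        P = cv2.fisheye.estimateNewCameraMatrixForUndistortRectify(
            K, D, (w, h), np.eye(3), balance=balance)
        map1, map2 = cv2.fisheye.initUndistortRectifyMap(
            K, D, np.eye(3), P, (w, h), cv2.CV_16SC2)
        rectified = cv2.remap(img, map1, map2, interpolation=cv2.INTER_LINEAR)
        return rectified, P

After rectification, the VO method is fed the rectified frames together with the new camera matrix P in place of the original fisheye intrinsics.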