Details

  • Name

    José Maria Sarmento
  • Role

    Research Assistant
  • Since

    28th September 2021
Publications

2024

Fusion of Time-of-Flight Based Sensors with Monocular Cameras for a Robotic Person Follower

Authors
Sarmento, J; dos Santos, FN; Aguiar, AS; Filipe, V; Valente, A;

Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS

Abstract
Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industry and agriculture. This type of collaboration can contribute to increased productivity by reducing physical strain on humans, which can lead to fewer injuries and improved morale. One crucial aspect of HRC is the ability of the robot to safely follow a specific human operator. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the relative position of a human target with respect to the robot. UWB transceivers are capable of tracking humans but exhibit a significant angular error. To reduce this error, monocular cameras with Deep Learning object detection are used to detect humans. The reduction in angular error is achieved through sensor fusion, combining the outputs of both sensors using a histogram-based filter that projects and intersects the measurements from both sources onto a 2D grid. By combining UWB and monocular vision, a 66.67% reduction in angular error compared to UWB localization alone is achieved. The approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.
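As a rough illustration of the histogram-based fusion described above, the Python sketch below projects a UWB range/bearing likelihood and a camera bearing likelihood onto a shared 2D grid and takes their intersection. The grid resolution, sensor noise values and sample measurements are all assumptions for illustration, not values from the paper.

import numpy as np

# 2D grid over candidate (range, bearing) cells; each sensor deposits a
# likelihood on the grid and the fused belief is their intersection.
ranges = np.linspace(0.5, 10.0, 100)              # candidate distances [m]
bearings = np.radians(np.linspace(-60, 60, 121))  # candidate angles [rad]
R, B = np.meshgrid(ranges, bearings, indexing="ij")

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# UWB: good range resolution, poor angular resolution (sigmas assumed).
p_uwb = gaussian(R, 4.2, 0.10) * gaussian(B, np.radians(15.0), np.radians(25.0))

# Camera detection: good bearing, no direct range (uniform over range).
p_cam = gaussian(B, np.radians(8.0), np.radians(3.0))

# Intersect the two measurement grids and normalise.
p_fused = p_uwb * p_cam
p_fused /= p_fused.sum()

i, j = np.unravel_index(np.argmax(p_fused), p_fused.shape)
print(f"fused estimate: range {ranges[i]:.2f} m, "
      f"bearing {np.degrees(bearings[j]):.1f} deg")

Because the camera constrains bearing tightly while the UWB constrains range tightly, the product of the two grids peaks at an estimate inheriting the best of each sensor, which is the intuition behind the reported angular-error reduction.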

2022

Collision Avoidance Considering Iterative Bezier Based Approach for Steep Slope Terrains

Authors
Santos, LC; Santos, FN; Valente, A; Sobreira, H; Sarmento, J; Petry, M;

Publication
IEEE ACCESS

Abstract
Agri-food production requires more efficient and autonomous processes, and robotics will play a significant role in this transformation. Deploying agricultural robots on the farm is still a challenging task, particularly in sloped terrain, where it is crucial to avoid obstacles and dangerously steep zones. Path planning solutions may fail under several circumstances, such as the appearance of a new obstacle. This work proposes a novel open-source solution called AgRobPP-CA to autonomously perform obstacle avoidance during robot navigation. AgRobPP-CA works in real-time for local obstacle avoidance, allowing small deviations that avoid unexpected obstacles or dangerously steep zones which could cause the robot to fall. Our results demonstrate that AgRobPP-CA is capable of avoiding obstacles and steep slopes in different vineyard scenarios, with low computational requirements. For example, in the last trial, AgRobPP-CA avoided a steep ramp that could have caused the robot to fall.
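The paper's iterative Bézier approach is not reproduced here, but a minimal one-shot sketch can show how a cubic Bézier segment bends a local path around an obstacle. The control-point placement rule and the clearance value are illustrative assumptions, not AgRobPP-CA's actual parameters.

import numpy as np

def cubic_bezier(p0, p1, p2, p3, n=50):
    """Sample a cubic Bezier curve from four 2D control points."""
    t = np.linspace(0.0, 1.0, n)[:, None]
    return ((1 - t) ** 3 * p0 + 3 * (1 - t) ** 2 * t * p1
            + 3 * (1 - t) * t ** 2 * p2 + t ** 3 * p3)

def deviate(start, goal, obstacle, clearance=1.0):
    """Bend the straight start-goal segment away from one obstacle point."""
    start, goal, obstacle = (np.asarray(p, float) for p in (start, goal, obstacle))
    d = goal - start
    normal = np.array([-d[1], d[0]]) / np.linalg.norm(d)  # unit left normal
    side = np.sign(np.dot(obstacle - start, normal)) or 1.0  # obstacle side
    # Push the two inner control points to the opposite side of the obstacle.
    offset = -side * clearance * normal
    return cubic_bezier(start, start + d / 3 + offset,
                        start + 2 * d / 3 + offset, goal)

path = deviate(start=(0.0, 0.0), goal=(6.0, 0.0), obstacle=(3.0, 0.2))
print(path[::10])  # a few sampled waypoints of the deviated path

An iterative variant, as the title suggests, would re-evaluate clearance along the sampled curve and adjust the control points until the path is collision-free.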

2022

FollowMe - A Pedestrian Following Algorithm for Agricultural Logistic Robots

Authors
Sarmento, J; Dos Santos, FN; Aguiar, AS; Sobreira, H; Regueiro, CV; Valente, A;

Publication
2022 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)

Abstract
In Industry 4.0 and Agriculture 4.0, there are logistics areas where robots can play an important role, for example by following a person at a certain distance. These robots can transport heavy tools or simply help collect certain items, such as harvested fruits. The use of Ultra-Wideband (UWB) transceivers as range sensors is becoming very common in the field of robotics, e.g. for localising goods and machines. Since UWB technology has very accurate time resolution, it is advantageous for techniques such as Time of Arrival (TOA), which can estimate distance by measuring the time between message frames. In this work, UWB transceivers are used as range sensors to track pedestrians/operators, and two algorithms are proposed for relative localisation between a person and a robot. Both algorithms use a similar 2-dimensional occupancy grid but differ in filtering. The first is based on an Extended Kalman Filter (EKF) that fuses the range sensor with odometry. The second is based on a Histogram Filter that calculates the pedestrian's position by discretising the state space into well-defined regions. Finally, a controller is implemented to autonomously command the robot. Both approaches are tested and compared on a real differential-drive robot. Both proposed solutions are able to follow a pedestrian at speeds of 0.1 m/s and are promising complements to solutions based on cameras and LiDAR.
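As a hedged sketch of the first (EKF-based) approach, the fragment below runs a range-only measurement update on a 2D relative-position state, with odometry driving the prediction step. The motion model, noise values and measurements are invented for illustration and are not the paper's.

import numpy as np

def ekf_range_update(x, P, z, R_meas):
    """x = [px, py]: pedestrian position in the robot frame; z: UWB range."""
    rng = np.linalg.norm(x)
    H = (x / rng).reshape(1, 2)          # Jacobian of h(x) = ||x||
    y = z - rng                          # innovation
    S = H @ P @ H.T + R_meas
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + (K * y).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

x = np.array([2.0, 0.5])                 # initial relative-position guess [m]
P = np.eye(2) * 0.5
for z in (2.2, 2.15, 2.1):               # simulated UWB range readings [m]
    # Prediction: the robot moves 0.05 m forward (from odometry), so the
    # pedestrian appears 0.05 m closer in the robot frame.
    x = x - np.array([0.05, 0.0])
    P = P + np.eye(2) * 0.01             # process noise
    x, P = ekf_range_update(x, P, z, R_meas=np.array([[0.05]]))
print("estimated relative position:", x)

A single range reading cannot fix the bearing on its own; the filter recovers it over time because the robot's motion changes the geometry between updates.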

2021

Autonomous Robot Visual-Only Guidance in Agriculture Using Vanishing Point Estimation

Authors
Sarmento, J; Aguiar, AS; dos Santos, FN; Sousa, AJ;

Publication
PROGRESS IN ARTIFICIAL INTELLIGENCE (EPIA 2021)

Abstract
Autonomous navigation in agriculture is very challenging, as it usually takes place outdoors, where there is rough terrain, uncontrolled natural lighting, constantly changing organic scenery and sometimes no Global Navigation Satellite System (GNSS) coverage. In this work, a setup with a single camera and a Google Coral Dev Board Edge Tensor Processing Unit (TPU) is proposed to navigate within a woody crop, more specifically a vineyard. Guidance is provided by estimating the vanishing point, observing its position with respect to the center of the image frame, and correcting the steering angle accordingly. The vanishing point is estimated by object detection using Deep Learning (DL) based Neural Networks (NN) to obtain the positions of the trunks in the image. The NNs were trained using Transfer Learning (TL), which requires a smaller dataset than conventional training methods. For this purpose, a dataset with 4221 images was created, considering image collection, annotation and augmentation procedures. Results show that our framework can detect the vanishing point with an average absolute error of 0.52° and can be considered for autonomous steering.
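A minimal sketch of the vanishing-point idea, assuming trunk bases have already been detected: fit one line per vine row through the detected base points, intersect the two lines, and use the horizontal offset of the intersection from the image center as the steering error. All pixel coordinates and the sign convention below are invented for illustration.

import numpy as np

def fit_line(points):
    """Least-squares line y = m*x + b through (x, y) pixel points."""
    x, y = np.asarray(points, float).T
    m, b = np.polyfit(x, y, 1)
    return m, b

def vanishing_point(left_bases, right_bases):
    m1, b1 = fit_line(left_bases)
    m2, b2 = fit_line(right_bases)
    xv = (b2 - b1) / (m1 - m2)           # intersection of the two row lines
    return xv, m1 * xv + b1

left = [(40, 470), (120, 420), (200, 370)]    # trunk bases, left row (px)
right = [(600, 470), (520, 420), (440, 370)]  # trunk bases, right row (px)
xv, yv = vanishing_point(left, right)
image_width = 640
steering_error = xv - image_width / 2         # >0: steer right (assumed)
print(f"vanishing point at ({xv:.0f}, {yv:.0f}) px, error {steering_error:+.0f} px")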

2021

Robot navigation in vineyards based on the visual vanish point concept

Authors
Sarmento, J; Aguiar, AS; Santos, FND; Sousa, AJ;

Publication
2021 International Symposium of Asian Control Association on Intelligent Robotics and Industrial Automation, IRIA 2021

Abstract
Autonomous navigation in agriculture is very challenging, as it usually takes place outdoors, where there is rough terrain, uncontrolled natural lighting, constantly changing organic scenery and sometimes no Global Navigation Satellite System (GNSS) signal. In this work, a monocular visual system is proposed to estimate angular orientation and navigate between woody crops, more specifically a vineyard, using a Proportional Integral Derivative (PID)-based controller. Guidance is provided by combining two ways of finding the center of the vineyard: first, by estimating the vanishing point, and second, by averaging the positions of the two closest trunk-base detections. The angular error is then determined through monocular angle perception. To obtain the trunk positions in the image, object detection using Deep Learning (DL) based Neural Networks (NN) is used. To evaluate the proposed controller, a visual vineyard simulation is created using Gazebo. The proposed joint controller is able to travel along a simulated straight vineyard with an RMS error of 1.17 cm. Moreover, a simulated curved vineyard modeled after the Douro region is tested in this work, where the robot was able to steer with an RMS error of 7.28 cm.
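As a hedged sketch of the PID steering loop described above, the fragment below blends the two angular cues named in the abstract (vanishing point and trunk-base midpoint) into one error and feeds it to a textbook PID controller. The gains, time step, blending weight and sample errors are illustrative assumptions, not the paper's tuned values.

class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def step(self, error):
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def angular_error(vp_angle, trunk_mid_angle, w_vp=0.5):
    """Blend the two center estimates into one angular error [rad]."""
    return w_vp * vp_angle + (1.0 - w_vp) * trunk_mid_angle

pid = PID(kp=1.2, ki=0.05, kd=0.1, dt=0.1)
for vp, mid in [(0.10, 0.14), (0.06, 0.08), (0.02, 0.03)]:  # cues in radians
    cmd = pid.step(angular_error(vp, mid))
    print(f"angular velocity command: {cmd:+.3f} rad/s")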