Details

  • Name

    José Maria Sarmento
  • Since

    28 September 2021
Publications

2025

From Competition to Classroom: A Hands-on Approach to Robotics Learning

Authors
Lopes, MS; Ribeiro, JD; Moreira, AP; Rocha, CD; Martins, JG; Sarmento, JM; Carvalho, JP; Costa, PG; Sousa, RB;

Publication
2025 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC

Abstract
Robotics education plays a crucial role in developing STEM skills. However, university-level courses often emphasize theoretical learning, which can lead to decreased student engagement and motivation. In this paper, we tackle the challenge of providing hands-on robotics experience in higher education by adapting a mobile robot originally designed for competitions for use in laboratory classes. Our approach integrates real-world robot operation into coursework, bridging the gap between simulation and physical implementation while maintaining accessibility. The robot's software is developed using ROS, and its effectiveness is assessed through student surveys. The results indicate that the platform increases student engagement and interest in robotics topics. Furthermore, feedback collected from teachers confirms that the platform boosts students' confidence and understanding of robotics.

2024

Fusion of Time-of-Flight Based Sensors with Monocular Cameras for a Robotic Person Follower

Authors
Sarmento, J; dos Santos, FN; Aguiar, AS; Filipe, V; Valente, A;

Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS

Abstract
Human-robot collaboration (HRC) is becoming increasingly important in advanced production systems, such as those used in industry and agriculture. This type of collaboration can increase productivity by reducing physical strain on humans, which can lead to fewer injuries and improved morale. One crucial aspect of HRC is the ability of the robot to follow a specific human operator safely. To address this challenge, a novel methodology is proposed that employs monocular vision and ultra-wideband (UWB) transceivers to determine the relative position of a human target with respect to the robot. UWB transceivers can track humans carrying a transceiver but exhibit a significant angular error. To reduce this error, monocular cameras with deep learning object detection are used to detect humans. The reduction in angular error is achieved through sensor fusion, combining the outputs of both sensors using a histogram-based filter that projects and intersects the measurements from both sources onto a 2D grid. By combining UWB and monocular vision, a 66.67% reduction in angular error compared to UWB localization alone is achieved. The approach demonstrates an average processing time of 0.0183 s and an average localization error of 0.14 m when tracking a person walking at an average speed of 0.21 m/s. This algorithm holds promise for enabling efficient and safe human-robot collaboration, providing a valuable contribution to the field of robotics.

2024

A Robotic Framework for the Robot@Factory 4.0 Competition

Authors
Sousa, RB; Rocha, CD; Martins, JG; Costa, JP; Padrao, JT; Sarmento, JM; Carvalho, JP; Lopes, MS; Costa, PG; Moreira, AP;

Publication
2024 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC

Abstract
Robotic competitions are platforms that propel robotics research while nurturing STEM education, serving as hubs of both applied research and scientific innovation. In Portugal, the Portuguese Robotics Open (FNR) is an event comprising several robotic competitions, including the Robot@Factory 4.0 competition, which presents an example of deploying autonomous robots on a factory shop floor. Although the literature includes frameworks proposed for the original version of the Robot@Factory competition, none of them proposes a system framework for the Robot@Factory 4.0 version covering the hardware, firmware, and software needed to complete the competition and achieve autonomous navigation. This paper proposes a complete robotic framework for the Robot@Factory 4.0 competition that is modular and open-access, enabling future participants to use and improve it in future editions. This work is the culmination of the knowledge acquired by winning the 2022 and 2023 editions of the competition.

2024

Mission Supervisor for Food Factories Robots

Authors
Moreira, T; Santos, FN; Santos, L; Sarmento, J; Terra, F; Sousa, A;

Publication
ROBOT 2023: SIXTH IBERIAN ROBOTICS CONFERENCE, VOL 2

Abstract
Climate change, limited natural resources, and the growth of the world's population compel society to produce food more sustainably, with lower energy and water consumption. The use of robots in agriculture is one of the most promising ways to change the paradigm of agricultural practices. Agricultural robots should be seen as a way to make jobs easier and lighter, and also as a way for people without agricultural skills to produce their own food. The PixelCropRobot is a low-cost, open-source robot that can monitor and water plants in small gardens. This work proposes a mission supervisor for the PixelCropRobot, and for agricultural robots in general, and presents a prototype user interface for this mission supervision. Communication between the mission supervisor and the other components of the system is done using ROS2 and MQTT, with a standardized mission file. The mission supervisor receives a prescription map with information about the respective mission and decomposes it into simple tasks. An A* algorithm then defines the priority of each task, which depends on factors such as water requirements and distance travelled. This mission supervisor concept was deployed on the PixelCropRobot and validated in real conditions, showing great potential to be extended to other agricultural robots.
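The task-prioritisation step described above can be sketched as a weighted score over water need and travel cost. Note the paper derives the travel cost from an A* search over the garden map; the sketch below substitutes straight-line distance for that path cost, and the `Task` fields, weights, and function names are hypothetical:

```python
from dataclasses import dataclass

@dataclass
class Task:
    plant_id: str      # hypothetical identifier for the plant to serve
    x: float           # position on the garden grid (m)
    y: float
    water_need: float  # water requested by the prescription map (L)

def prioritise(tasks, robot_xy, w_water=1.0, w_dist=0.5):
    """Order tasks so that thirsty plants come first and far-away
    plants are penalised. Straight-line distance stands in for the
    A* path cost used in the paper; the weights are illustrative."""
    rx, ry = robot_xy
    def score(t):
        dist = ((t.x - rx) ** 2 + (t.y - ry) ** 2) ** 0.5
        return w_water * t.water_need - w_dist * dist
    return sorted(tasks, key=score, reverse=True)
```

For example, a plant requesting 5 L a few metres away would outrank a nearby plant requesting only 0.2 L under these weights.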

2022

Collision Avoidance Considering Iterative Bezier Based Approach for Steep Slope Terrains

Authors
Santos, LC; Santos, FN; Valente, A; Sobreira, H; Sarmento, J; Petry, M;

Publication
IEEE ACCESS

Abstract
Agri-food production requires more efficient and autonomous processes, and robotics will play a significant role in this transition. Deploying agricultural robots on the farm is still a challenging task, particularly in sloped terrain, where it is crucial to avoid obstacles and dangerously steep zones. Path-planning solutions may fail under several circumstances, such as the appearance of a new obstacle. This work proposes a novel open-source solution called AgRobPP-CA to perform obstacle avoidance autonomously during robot navigation. AgRobPP-CA works in real time for local obstacle avoidance, allowing small deviations to avoid unexpected obstacles or dangerously steep zones that could cause the robot to fall. Our results demonstrate that AgRobPP-CA is capable of avoiding obstacles and steep slopes in different vineyard scenarios, with low computational requirements. For example, in the last trial, AgRobPP-CA avoided a steep ramp that could have caused the robot to fall.
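The iterative Bezier idea behind the local deviation can be sketched as follows: a cubic Bezier segment replaces the blocked stretch of path, and its interior control points are pushed away from the obstacle until the sampled curve keeps a clearance. This is a simplified single-obstacle sketch under assumed parameters, not the AgRobPP-CA implementation:

```python
import math

def bezier(p0, p1, p2, p3, t):
    """Evaluate a cubic Bezier at parameter t in [0, 1]."""
    u = 1.0 - t
    return tuple(u**3 * a + 3 * u**2 * t * b + 3 * u * t**2 * c + t**3 * d
                 for a, b, c, d in zip(p0, p1, p2, p3))

def deviate(p0, p3, obstacle, clearance=0.5, step=0.2, max_iter=50):
    """Iteratively push the two interior control points away from the
    obstacle until every sampled curve point keeps the clearance.
    Endpoints p0 and p3 stay fixed, so the deviation rejoins the
    original path. Parameter values are illustrative."""
    # start with interior control points on the straight segment p0-p3
    p1 = [(2 * p0[0] + p3[0]) / 3, (2 * p0[1] + p3[1]) / 3]
    p2 = [(p0[0] + 2 * p3[0]) / 3, (p0[1] + 2 * p3[1]) / 3]
    for _ in range(max_iter):
        pts = [bezier(p0, tuple(p1), tuple(p2), p3, t / 20) for t in range(21)]
        d = min(math.hypot(x - obstacle[0], y - obstacle[1]) for x, y in pts)
        if d >= clearance:
            return pts            # curve now clears the obstacle
        # push both interior control points directly away from the obstacle
        for p in (p1, p2):
            dx, dy = p[0] - obstacle[0], p[1] - obstacle[1]
            n = math.hypot(dx, dy) or 1.0
            p[0] += step * dx / n
            p[1] += step * dy / n
    return pts                    # best effort if the loop budget runs out
```

Because only the interior control points move, the curve's endpoints (and hence continuity with the rest of the planned path) are preserved by construction, which is what makes small local deviations possible without replanning the whole route.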