2023
Authors
Klein, LC; Braun, J; Mendes, J; Pinto, VH; Martins, FN; de Oliveira, AS; Wortche, H; Costa, P; Lima, J;
Publication
SENSORS
Abstract
Localization is a crucial skill in mobile robotics because the robot needs to make reasonable navigation decisions to complete its mission. Many approaches exist to implement localization, but artificial intelligence can be an interesting alternative to traditional localization techniques based on model calculations. This work proposes a machine learning approach to solve the localization problem in the RobotAtFactory 4.0 competition. The idea is to obtain the relative pose of an onboard camera with respect to fiducial markers (ArUcos) and then estimate the robot pose with machine learning. The approaches were validated in simulation. Several algorithms were tested, and the best results were obtained with a Random Forest Regressor, with errors on the millimeter scale. The proposed solution achieves accuracy on par with the analytical approach to the localization problem in the RobotAtFactory 4.0 scenario, with the advantage of not requiring explicit knowledge of the exact positions of the fiducial markers, which the analytical approach does.
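The abstract's core idea, regressing the robot pose from camera-to-marker relative poses, can be sketched as below. This is an illustrative toy, not the paper's pipeline: the synthetic data, the fixed marker pose, the simplified relative-pose model, and the hyperparameters are all assumptions.

```python
# Toy sketch: learn the map from (camera-to-marker relative pose) to
# (absolute robot pose) with a Random Forest, as the abstract describes.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

n = 2000
marker_pose = np.array([1.0, 0.5, 0.0])           # assumed fixed marker pose (x, y, yaw)
robot_pose = rng.uniform(-1.0, 1.0, size=(n, 3))  # synthetic ground-truth poses
rel_pose = robot_pose - marker_pose               # simplified relative-pose observation
rel_pose += rng.normal(0.0, 0.001, size=rel_pose.shape)  # simulated sensor noise

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(rel_pose[:1500], robot_pose[:1500])     # multi-output regression

pred = model.predict(rel_pose[1500:])
mae = np.abs(pred - robot_pose[1500:]).mean()
print(f"mean absolute error: {mae:.4f}")
```

In the real setting the relative pose would come from ArUco detection rather than this synthetic subtraction, and the regressor learns the marker placement implicitly from the training data, which is why the marker positions need not be known explicitly.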
2023
Authors
Balbín, AM; Caetano, NS; Conde Á, M; Costa, P; Felgueiras, C; Fidalgo Blanco Á; Fonseca, D; Gamazo, A; García Holgado, A; García Peñalvo, FJ; Gonçalves, J; Hernández García Á; Lima, J; Nistor, N; O’Hara, J; Olmos Migueláñez, S; Piñeiro Naval, V; Ramírez Montoya, MS; Sánchez Holgado, P; Sein Echaluce, ML;
Publication
Lecture Notes in Educational Technology
Abstract
The 10th edition of the Technological Ecosystems for Enhancing Multiculturality conference (TEEM 2022) brings together researchers and postgraduate students interested in combining different aspects of technology applied to knowledge society development, with particular attention to educational and learning issues. This volume includes contributions related to communication, educational assessment, sustainable development, educational innovation, mechatronics, and learning analytics. In addition, the doctoral consortium papers close the proceedings from a transversal perspective. © 2023, The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
2023
Authors
Chellal, AA; Braun, J; Lima, J; Goncalves, J; Costa, P;
Publication
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
Energy constraints and battery behavior have been largely neglected by most available robotic simulation software, which instead offers unlimited energy. This omission does not reflect the importance of batteries in robots, where the battery is one of the most crucial elements. With an adequate battery simulation, these simulators can be used to study a robot's energy requirements. Thus, this paper describes a Lithium-ion battery model implemented in the SimTwo robotic simulator, in which physical parameters such as internal resistance and capacity are modeled to mimic real-world battery behavior. Experiments and comparisons with a real robot assessed the viability of the model. This battery simulation is intended as an additional tool for roboticists, the scientific community, researchers, and engineers to account for energy constraints in the early stages of robot design, architecture, or control.
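A battery model of the kind the abstract names can be illustrated with a minimal sketch built around the two parameters it mentions, internal resistance and capacity. This is not the SimTwo implementation; the linear open-circuit-voltage curve and all numeric values are assumptions for illustration.

```python
# Illustrative Li-ion cell model: coulomb-counting state of charge plus an
# internal-resistance voltage drop under load.
def step_battery(soc, current_a, dt_s, capacity_ah=2.0, r_int_ohm=0.05):
    """Advance state of charge by one time step and return (soc, terminal voltage)."""
    soc = max(0.0, soc - current_a * dt_s / (capacity_ah * 3600.0))
    ocv = 3.2 + 1.0 * soc              # crude linear open-circuit voltage model
    return soc, ocv - current_a * r_int_ohm

soc, v = 1.0, 0.0
for _ in range(3600):                  # one hour at a constant 1 A discharge
    soc, v = step_battery(soc, 1.0, 1.0)
print(f"SoC={soc:.2f}, V={v:.3f}")
```

Even this crude model lets a simulated robot run out of charge, which is enough to expose energy constraints during early design studies; a realistic model would replace the linear OCV curve with a measured one.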
2023
Authors
Pinto, VH; Ribeiro, FM; Brito, T; Pereira, AI; Lima, J; Costa, P;
Publication
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
The robot presented in this paper was developed with a main focus on participating in robotic competitions. Therefore, the subsystems presented here were developed with performance criteria in mind rather than simplicity. Nonetheless, this paper also presents background on basic concepts of robot localization, navigation, color identification, and control, all of which are key to a more competitive robot.
2023
Authors
Klein, LC; Braun, J; Martins, FN; Wortche, H; de Oliveira, AS; Mendes, J; Pinto, VH; Costa, P; Lima, J;
Publication
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
The use of machine learning in embedded systems is an interesting topic, especially with the growing popularity of the Internet of Things (IoT). The capacity of a system, such as a robot, to self-localize is a fundamental skill for its navigation and decision-making processes. This work focuses on the feasibility of using machine learning on a Raspberry Pi 4 Model B, solving the localization problem using images and fiducial markers (ArUco markers) in the context of the RobotAtFactory 4.0 competition. The approaches were validated in a realistically simulated scenario. Three algorithms were tested, and all proved to be good solutions for a limited amount of data. Results also show that as the amount of data grows, only the Multi-Layer Perceptron (MLP) remains feasible for the embedded application, due to the required training time and the resulting model size.
2023
Authors
Lima, J; Pinto, AF; Ribeiro, F; Pinto, M; Pereira, AI; Pinto, VH; Costa, P;
Publication
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
Self-localization is one of the most important requirements in mobile robotics. Several approaches exist to provide localization data. Ultra-Wideband (UWB) Time-of-Flight measurements provide position information but not orientation. Odometry can be combined with these measurements through a data fusion algorithm. This paper addresses the fusion of odometry and UWB Time-of-Flight positioning using a Kalman filter, which outputs both the position and orientation of the robot. The proposed solution, validated on a real platform, can be applied in service and industrial robots.
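The fusion scheme the abstract describes can be sketched as below: odometry drives the prediction step and a UWB position fix drives the update step, so orientation is recovered through the correlation the motion model builds between heading and position. This is a generic extended Kalman filter sketch under assumed noise values, not the paper's implementation (the abstract says only "Kalman filter"; an EKF is assumed here because the unicycle motion model is nonlinear).

```python
# EKF sketch fusing odometry (prediction) with UWB position fixes (update).
# State: [x, y, theta]; UWB measures only [x, y].
import numpy as np

def predict(x, P, v, w, dt, Q):
    """Propagate the state with unicycle odometry (v: linear, w: angular speed)."""
    theta = x[2]
    x_new = x + np.array([v * np.cos(theta) * dt,
                          v * np.sin(theta) * dt,
                          w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    return x_new, F @ P @ F.T + Q

def update(x, P, z, R):
    """Correct the state with a UWB position measurement z = [x, y]."""
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_new = x + K @ (z - H @ x)
    return x_new, (np.eye(3) - K @ H) @ P

# Toy usage: drive straight for one step, then correct with a noisy UWB fix.
x = np.zeros(3)
P = np.eye(3) * 0.1                       # assumed initial uncertainty
Q = np.eye(3) * 1e-4                      # assumed odometry noise
R = np.eye(2) * 0.05                      # assumed UWB measurement noise
x, P = predict(x, P, v=1.0, w=0.0, dt=0.1, Q=Q)
x, P = update(x, P, z=np.array([0.11, 0.01]), R=R)
print(x)
```

The estimate lands between the odometry prediction and the UWB fix, weighted by the two noise covariances, which is exactly the trade-off a tuned filter exploits on the real platform.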