2022
Authors
Magalhães, SC; Moreira, AP; Costa, P;
Publication
CoRR
Abstract
2023
Authors
Klein, LC; Braun, J; Martins, FN; Wortche, H; de Oliveira, AS; Mendes, J; Pinto, VH; Costa, P; Lima, J;
Publication
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
The use of machine learning in embedded systems is an interesting topic, especially with the growing popularity of the Internet of Things (IoT). The capacity of a system, such as a robot, to self-localize is a fundamental skill for its navigation and decision-making processes. This work focuses on the feasibility of using machine learning on a Raspberry Pi 4 Model B to solve the localization problem using images and fiducial markers (ArUco markers) in the context of the RobotAtFactory 4.0 competition. The approaches were validated in a realistically simulated scenario. Three algorithms were tested, and all proved to be good solutions for a limited amount of data. The results also show that when the amount of data grows, only the Multi-Layer Perceptron (MLP) is feasible for the embedded application, due to the required training time and the resulting model size.
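A minimal sketch of the learning step described above, assuming the features are the pixel coordinates of detected ArUco marker corners and using scikit-learn's MLPRegressor on synthetic placeholder data; the feature layout, layer sizes, and data are illustrative assumptions, not the paper's implementation.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Placeholder training data: each row would hold the pixel coordinates of the
# corners of an ArUco marker detected in one camera frame (e.g. via cv2.aruco),
# and each target row would be the robot pose (x, y, theta). All values here
# are synthetic stand-ins for illustration only.
rng = np.random.default_rng(0)
X = rng.uniform(0, 640, size=(500, 8))   # 1 marker x 4 corners x (u, v) -- assumed layout
y = rng.uniform(-1, 1, size=(500, 3))    # poses in metres / radians (synthetic)

# A small MLP keeps training time and model size compatible with a
# Raspberry Pi 4; the layer sizes are illustrative, not the paper's.
mlp = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500, random_state=0)
mlp.fit(X, y)

# At run time, the features extracted from a new frame yield an estimated pose.
print("estimated pose (x, y, theta):", mlp.predict(X[:1]))
```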
2023
Authors
Lima, J; Pinto, AF; Ribeiro, F; Pinto, M; Pereira, AI; Pinto, VH; Costa, P;
Publication
2023 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC
Abstract
Self-localization of a robot is one of the most important requirements in mobile robotics. There are several approaches to providing localization data. Ultra-Wideband Time-of-Flight provides position information but lacks the orientation angle. Odometry data can be combined with it through a data fusion algorithm. This paper addresses the application of data fusion algorithms based on odometry and Ultra-Wideband Time-of-Flight positioning, using a Kalman filter to perform the fusion task and output both the position and orientation of the robot. The proposed solution, validated on a real platform developed for this purpose, can be applied to service and industrial robots.
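A minimal sketch of one common formulation of the fusion described above: an extended-Kalman-style filter in which odometry increments drive the prediction and a UWB position fix (position only, no angle) drives the update. The motion model and noise values are assumptions for illustration, not the paper's parameters.

```python
import numpy as np

# Pose state [x, y, theta]; the UWB measurement observes only (x, y).
x = np.zeros(3)                   # state estimate
P = np.eye(3)                     # state covariance
Q = np.diag([0.01, 0.01, 0.005])  # odometry (process) noise -- assumed values
R = np.diag([0.05, 0.05])         # UWB measurement noise -- assumed values
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0]])   # UWB observes position, not orientation

def predict(x, P, d, dtheta):
    """Propagate the pose with odometry: travelled distance d, rotation dtheta."""
    theta = x[2]
    x = x + np.array([d * np.cos(theta), d * np.sin(theta), dtheta])
    F = np.array([[1.0, 0.0, -d * np.sin(theta)],
                  [0.0, 1.0,  d * np.cos(theta)],
                  [0.0, 0.0, 1.0]])  # Jacobian of the motion model
    return x, F @ P @ F.T + Q

def update(x, P, z_uwb):
    """Correct the pose with a UWB position measurement z_uwb = [x, y]."""
    residual = z_uwb - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    return x + K @ residual, (np.eye(3) - K @ H) @ P

x, P = predict(x, P, d=0.10, dtheta=0.02)
x, P = update(x, P, np.array([0.11, 0.01]))
print("fused pose (x, y, theta):", x)
```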
2023
Authors
Lima, J; Brito, T; Ferreira, O; Afonso, J; Pinto, H; Carvalho, A; Costa, P;
Publication
International Conference on Electrical, Computer, Communications and Mechatronics Engineering, ICECCME 2023
Abstract
This paper presents the development of an acquisition system and data logger for an existing set of three continuous stirred-tank reactors in series. The reactors are currently used in chemical engineering educational laboratories to perform kinetic and tracer experiments. To store the data, the volumetric flow rate and the concentration of tracer, reactants, and/or products of the reaction must be acquired as a function of time. In the original experimental setup, only the signal conditioning system was operational, while the acquisition, visualization, and control systems were obsolete and damaged. Thus, a new system composed of an interface and real-time data acquisition is proposed, while preserving the main reactor structure. A graphical user interface and the automation of the various actuators were developed with widespread availability and low cost in mind, respectively. The system, based on a common 8-bit microcontroller and an application developed in Lazarus, stores the acquired data in a time-series database, so students can analyze the results later or in real time. Moreover, remote access allows controlling the reactor and retrieving data through Internet of Things (IoT) resources. Additionally, the proposed IoT-based system allows the data to be shared with the community as a dataset.
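A minimal sketch of the acquisition-and-store loop described above, assuming a serial link to the microcontroller and a CSV file standing in for the time-series database; the port name, baud rate, and message format are hypothetical, and the actual system uses a Lazarus application rather than this script.

```python
import csv
import time
import serial  # pyserial

# Assumed serial settings and message format: one "flow;concentration" line
# per sample sent by the 8-bit microcontroller.
ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=2)

with open("reactor_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        line = ser.readline().decode(errors="ignore").strip()
        if line.count(";") != 1:
            continue  # skip empty or malformed samples
        flow, conc = line.split(";")
        # Timestamped rows give the time series needed for kinetic and
        # tracer analysis, whether consulted later or in real time.
        writer.writerow([time.time(), flow, conc])
        f.flush()
```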
2023
Authors
Alvarez, M; Brancalião, L; Carneiro, J; Costa, P; Coelho, JP; Gonçalves, J;
Publication
28th IEEE International Conference on Emerging Technologies and Factory Automation, ETFA 2023, Sinaia, Romania, September 12-15, 2023
Abstract
This paper presents the most recent results of the ongoing work carried out within the scope of the STC 4.0 HP project, which aims to automate the finishing process of ceramic tableware at GRESTEL S.A., focusing on non-circular plates. A collaborative robot handles the tableware and moves its entire perimeter across a sponge to perform the finishing. An array with the distances from the center to the different points of the plate contour is used to trace the path the robot must follow. The final goal of this prototype is to obtain an even finish while maintaining a constant force along the entire perimeter of the ceramic tableware. A series of tests showed that the current approach is able to manipulate 3D-printed tableware made for testing and traverse its perimeter to carry out the finishing.
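A minimal sketch of how an array of center-to-contour distances could be turned into Cartesian waypoints for the finishing path; the number of points, the example radii, and the radial approximation of the contact direction are illustrative assumptions, not the project's implementation.

```python
import numpy as np

# Assumed input: evenly spaced distances from the plate centre to its contour.
radii = np.full(36, 0.12)        # nominal radius [m] -- placeholder values
radii[8:14] = 0.15               # a non-circular bulge on one side
angles = np.linspace(0.0, 2.0 * np.pi, len(radii), endpoint=False)

# Polar-to-Cartesian conversion gives the waypoints the robot should visit
# while pressing the plate edge against the sponge.
waypoints = np.column_stack((radii * np.cos(angles),
                             radii * np.sin(angles)))

# Radial directions can serve as a rough contact direction for a constant-force
# setpoint; for strongly non-circular contours the true normal deviates from this.
radial_dirs = np.column_stack((np.cos(angles), np.sin(angles)))
print(waypoints[:3], radial_dirs[:3])
```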
2024
Authors
Brito, T; Pereira, AI; Costa, P; Lima, J;
Publication
OPTIMIZATION, LEARNING ALGORITHMS AND APPLICATIONS, PT II, OL2A 2023
Abstract
Worldwide, forests have been ravaged by fire in recent years. Whether through human intervention or other causes, the burned area has been increasing considerably, harming fauna and flora. It is essential to detect ignitions early so that fire-fighting authorities can act quickly, reducing the damage to forests. The proposed system aims to improve nature monitoring and existing surveillance systems through satellite image recognition. Soil recognition from satellite images can determine the best positions for the sensor modules and provide crucial input information for artificial intelligence-based systems. To this end, satellite images from the Sentinel-2 program are used to generate forest density maps that are as up to date as possible. Four classification algorithms produce the Tree Cover Density (TCD) map: the Gaussian Mixture Model (GMM), Random Forest (RF), Support Vector Machine (SVM), and K-Nearest Neighbors (K-NN), which identify zones after being trained on known regions. The results compare the algorithms' performance in recognizing forest, grass, pavement, and water areas in Sentinel-2 images.
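A minimal sketch of the classifier comparison described above using scikit-learn, with random placeholder pixels standing in for labeled Sentinel-2 band values; the per-class generative use of the GMM and all parameters are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.neighbors import KNeighborsClassifier
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Placeholder pixel samples: rows would be Sentinel-2 band values, labels
# 0=forest, 1=grass, 2=pavement, 3=water (random stand-in data).
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10))
y = rng.integers(0, 4, size=2000)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Supervised classifiers trained on labeled (known) regions.
models = {
    "RF": RandomForestClassifier(n_estimators=100, random_state=0),
    "SVM": SVC(kernel="rbf"),
    "K-NN": KNeighborsClassifier(n_neighbors=5),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    print(name, accuracy_score(y_te, model.predict(X_te)))

# GMM used generatively: one mixture per class, classify by highest likelihood.
gmms = {c: GaussianMixture(n_components=2, random_state=0).fit(X_tr[y_tr == c])
        for c in np.unique(y_tr)}
scores = np.column_stack([gmms[c].score_samples(X_te) for c in sorted(gmms)])
print("GMM", accuracy_score(y_te, np.argmax(scores, axis=1)))
```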