
Publications by Ricardo Barbosa Sousa

2025

Indoor Benchmark of 3-D LiDAR SLAM at iilab - Industry and Innovation Laboratory

Authors
Ribeiro, JD; Sousa, RB; Martins, JG; Aguiar, AS; Santos, FN; Sobreira, HM;

Publication
IEEE ACCESS

Abstract
This paper presents an indoor benchmarking study of state-of-the-art 3D LiDAR-based Simultaneous Localization and Mapping (SLAM) algorithms using the newly developed IILABS 3D - iilab Indoor LiDAR-based SLAM 3D dataset. Existing SLAM datasets often focus on outdoor environments, rely on a single type of LiDAR sensor, or lack additional sensor data such as wheel odometry in ground-based robotic platforms. Consequently, existing datasets lack the data diversity required to comprehensively evaluate performance under diverse indoor conditions. The IILABS 3D dataset fills this gap by providing a sensor-rich, indoor-exclusive dataset recorded in a controlled laboratory environment using a wheeled mobile robot platform. It includes four heterogeneous 3D LiDAR sensors - Velodyne VLP-16, Ouster OS1-64, RoboSense RS-Helios-5515, and Livox Mid-360 - featuring both mechanical spinning and non-repetitive scanning patterns, as well as an IMU and wheel odometry for sensor fusion. The dataset also contains calibration sequences, challenging benchmark trajectories, and high-precision ground-truth poses captured with a motion capture system. Using this dataset, we benchmark nine representative LiDAR-based SLAM algorithms across multiple sequences, analyzing their performance in terms of accuracy and consistency under varying sensor configurations. The results provide a comprehensive performance comparison and valuable insights into the strengths and limitations of current SLAM algorithms in indoor environments. The dataset, benchmark results, and related tools are publicly available at https://jorgedfr.github.io/3d_lidar_slam_benchmark_at_iilab/
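Accuracy in SLAM benchmarks of this kind is typically reported as absolute trajectory error (ATE) against the motion-capture ground truth. As a minimal sketch of that metric (not the authors' evaluation code; the function name and translation-plus-rotation alignment are our assumptions), the core computation can look like this:

```python
import numpy as np

def ate_rmse(gt, est):
    """RMS absolute trajectory error after rigidly aligning the
    estimated positions to ground truth (Kabsch alignment, no scale)."""
    gt = np.asarray(gt, dtype=float)
    est = np.asarray(est, dtype=float)
    gt_c = gt - gt.mean(axis=0)       # centre both trajectories
    est_c = est - est.mean(axis=0)
    H = est_c.T @ gt_c                # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (gt.shape[1] - 1) + [d])  # guard against reflections
    R = Vt.T @ D @ U.T                # optimal rotation: est -> gt
    errors = np.linalg.norm(est_c @ R.T - gt_c, axis=1)
    return float(np.sqrt(np.mean(errors ** 2)))

# An estimate that is a shifted copy of ground truth aligns perfectly:
gt = [[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 1.0, 0.0]]
est = [[5.0, 0.0, 0.0], [6.0, 0.0, 0.0], [7.0, 1.0, 0.0]]
print(ate_rmse(gt, est))  # ~0, since the translation is removed by centring
```

Rigid alignment before computing the error keeps the metric from penalising the arbitrary starting pose of the estimated trajectory.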

2020

Odometry and Extrinsic Sensor Calibration on Mobile Robots

Authors
Sousa, Ricardo B.;

Publication

Abstract

2024

High-level teleoperation system for autonomous forklifts using VR over the 5G public network

Authors
Couto, Manuel B.; Petry, Marcelo; Levin, Thiago; Oliveira, João; Sousa, Ricardo B.; Rebelo, Paulo; Sobreira, Heber; Silva, Manuel F.; Mendonça, Hélio; Silva, Pedro Matos; Parreira, Bruno;

Publication

Abstract

2024

PRODUTECH R3 Project Overview - From AMRs to AI and the Digital Twin

Authors
Rebelo, Paulo; Sousa, Ricardo B.; Sobreira, Heber; Caldana, Daniele; Couto, Manuel; Petry, Marcelo; Silva, Manuel F.; Ramos, Daniel; Silva, Gustavo; Duarte, Miguel; Beça, José Alberto; Silva, Pedro Matos; Ribeiro, Fillipe; Mendes, Abel;

Publication

Abstract

2025

Integrated RFID System for Intralogistics Operations with Industrial Mobile Robots

Authors
Pacheco, FD; Rebelo, PM; Sousa, RB; Silva, MF; Mendonça, HS;

Publication
2025 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC

Abstract
Radio-Frequency IDentification (RFID) technologies automate the identification of objects and persons, with several applications in the retail, manufacturing, and intralogistics sectors. Several works explore the application of RFID systems in robotics and intralogistics, focusing on locating robots and tags and on inventory management. This paper addresses the challenge of intralogistics cargo trolleys communicating their characteristics to an autonomous mobile robot through an RFID system. The robot must know the trolley's relative pose to avoid collisions with the surroundings. To this end, the passive tag on the cargo communicates information to the robot, including the base footprint of the trolley. The proposed RFID system includes the development of a controller board to interact with the frontend integrated circuit of an external antenna onboard the industrial mobile robot. Experimental results assess the system's readability distance in two distinct environments and with two different antenna modules. All the code and documentation are available in a public repository.
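The abstract describes a passive tag communicating the trolley's base footprint to the robot. As a purely hypothetical illustration of what such a payload could look like (the field layout, function names, and float32 encoding below are our assumptions, not the paper's actual tag memory map), a footprint polygon could be serialised into tag user memory like this:

```python
import struct

def pack_footprint(vertices):
    """Serialise a footprint polygon as: uint8 vertex count,
    then (x, y) float32 pairs in metres (little-endian)."""
    data = struct.pack("<B", len(vertices))
    for x, y in vertices:
        data += struct.pack("<ff", x, y)
    return data

def unpack_footprint(data):
    """Inverse of pack_footprint: recover the list of (x, y) vertices."""
    (n,) = struct.unpack_from("<B", data, 0)
    return [struct.unpack_from("<ff", data, 1 + 8 * i) for i in range(n)]

# Example trolley footprint (values chosen to be exact in float32):
footprint = [(0.0, 0.0), (1.25, 0.0), (1.25, 0.75), (0.0, 0.75)]
payload = pack_footprint(footprint)
assert unpack_footprint(payload) == footprint  # lossless round trip
```

A compact fixed layout like this matters because passive UHF tags typically offer only a few hundred bytes of user memory.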

2025

Integrating Multimodal Perception into Ground Mobile Robots

Authors
Sousa, RB; Sobreira, HM; Martins, JG; Costa, PG; Silva, MF; Moreira, AP;

Publication
2025 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS, ICARSC

Abstract
Multimodal perception systems enhance the robustness and adaptability of autonomous mobile robots by integrating heterogeneous sensor modalities, improving long-term localisation and mapping in dynamic environments and human-robot interaction. Current mobile platforms often focus on specific sensor configurations and prioritise cost-effectiveness, which can limit the user's flexibility to extend the original robot further. This paper presents a methodology to integrate multimodal perception into a ground mobile platform, incorporating wheel odometry, 2D laser scanners, 3D Light Detection and Ranging (LiDAR), and RGBD cameras. The methodology describes the electronics design to power devices, the firmware, computation, and networking architecture, and the mechanical mounting for the sensory system based on 3D printing, laser cutting, and metal sheet bending processes. Experiments demonstrate the use of the revised platform in 2D and 3D localisation and mapping and in pallet pocket estimation applications. All the documentation and designs are accessible in a public repository.
