2013
Authors
Neves, R; Matos, AC;
Publication
2013 OCEANS - SAN DIEGO
Abstract
This paper presents an approach to stereovision applied to small water vehicles. By using a small low-cost computer and inexpensive off-the-shelf components, we were able to develop an autonomous driving system capable of following another vehicle and moving along paths delimited by coloured buoys. A pair of webcams was used and, together with an ultrasound sensor, we were also able to implement a basic frontal obstacle avoidance system. With the help of the stereoscopic system, we inferred the position of specific objects that serve as references for ASV guidance. The final system is capable of identifying and following targets at distances of over 5 meters. This system was integrated with the existing framework shared by all the vehicles of the OceanSys research group at INESC - DEEC/FEUP.
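As a rough illustration of the triangulation idea behind such a stereoscopic system, the sketch below computes a target position from a rectified webcam pair; it is not the paper's implementation, and the focal length, baseline and pixel coordinates are assumed example values.

# Minimal sketch of stereo triangulation for a calibrated, rectified pair.
# focal_px, baseline_m and the principal point (cx, cy) are illustrative
# assumptions, not values reported in the paper.
def triangulate(u_left, u_right, v, focal_px, baseline_m, cx, cy):
    """Return (X, Y, Z) of a target in metres, in the left camera frame."""
    disparity = u_left - u_right
    if disparity <= 0:
        return None  # target at infinity or mismatched detections
    z = focal_px * baseline_m / disparity
    x = (u_left - cx) * z / focal_px
    y = (v - cy) * z / focal_px
    return x, y, z

# Example: a buoy seen at column 350 (left) and 320 (right), row 260, with a
# 700 px focal length, 12 cm baseline and principal point at (320, 240).
print(triangulate(350, 320, 260, 700.0, 0.12, 320.0, 240.0))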
2013
Authors
Correia, M; Matos, A;
Publication
2013 OCEANS - SAN DIEGO
Abstract
The majority of Autonomous Underwater Vehicles (AUVs) spend most of their energy on propulsion. Therefore, a good path planning technique can improve both their autonomy and range, and thus their performance. This paper proposes an optimized trajectory planning methodology able to find the best possible path from a starting point to a target position, taking advantage of the water currents. In addition, the possibility of water currents changing throughout the path is taken into account: both the optimal path and the currents field are updated based on the deviations detected at a predefined number of checkpoints along the path. Finally, an estimate of the vehicle's real path is computed.
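One common way to exploit currents in path planning is to make edge costs equal to traversal time given the local current, and then run a graph search (e.g. Dijkstra or A*) over those costs. The sketch below shows such a current-aware edge time under the assumption of a locally constant current and a fixed speed through the water; it is an illustration of the general idea, not the paper's formulation.

import math

# Hypothetical helper: time to traverse segment p -> q while holding the
# track, at speed v_water relative to the water, in a constant current.
def edge_time(p, q, current, v_water=1.0):
    dx, dy = q[0] - p[0], q[1] - p[1]
    dist = math.hypot(dx, dy)
    if dist == 0.0:
        return 0.0
    hx, hy = dx / dist, dy / dist                   # unit vector along the track
    along = current[0] * hx + current[1] * hy       # current along the track
    cross = abs(current[0] * hy - current[1] * hx)  # current across the track
    if cross >= v_water:
        return math.inf                             # cannot hold the track
    ground_speed = math.sqrt(v_water**2 - cross**2) + along
    return dist / ground_speed if ground_speed > 0 else math.inf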
2013
Authors
Vilas Boas, ER; Honório, LM; Marcato, ALM; Oliveira, EJ; Barbosa, PG; Barbosa, DA; Vilas Boas, ASCA; Cruz, NA; Matos, A; Ferreira, BM; Abreu, N; P. Moreira, A; Rocco, A; Micerino, FJ; Costa, EB; Machado, LCN;
Publication
Abstract
2013
Authors
Logghe, J; Dias, A; Almeida, J; Martins, A; Silva, E;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
The trend towards more cooperative play and the increase of game dynamics in the RoboCup Middle Size League (MSL) motivate the improvement of skills for ball passing and reception. Currently, the majority of MSL teams use ball handling devices with rollers to achieve more precise kicks, but this limits the capability to kick a moving ball without stopping and grabbing it. This paper addresses the problem of receiving and kicking a fast-moving ball without having to grab it with a roller-based ball handling device. Here, the main difficulty is the high latency and low rate of the measurements of the ball sensing systems, which are based on vision or laser scanner sensors. Our robots use a geared leg coupled to a motor that acts simultaneously as the kicking device and as a low-level ball sensor. This paper proposes a new method to improve the ball sensing capability of the kicker by combining high-rate measurements of the torque and energy in the motor with the angular position of the kicker leg. The developed method endows the kicker device with an effective ball detection ability, validated in several game situations such as intercepting a fast pass or chasing the ball, where the relative speed between robot and ball is low. This can be used to optimize the kick instant or by the embedded kicker control system to absorb the ball's energy. © 2013 Springer-Verlag.
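The gist of using the motor as a ball sensor is that contact shows up as a torque spike (visible in the motor current) together with a change in the leg's motion. The fragment below is only a schematic illustration of that detection logic; the signal names, torque constant and thresholds are assumptions, not the paper's tuned values.

# Illustrative contact detector from high-rate motor measurements.
# torque_gain approximates the motor torque constant Kt; all thresholds
# are hypothetical placeholders.
def ball_contact(motor_current, leg_velocity,
                 torque_gain=0.05, torque_thresh=0.8, vel_thresh=0.5):
    """Return True when a torque spike and leg motion suggest ball contact."""
    torque = torque_gain * motor_current      # simple tau = Kt * i model
    return torque > torque_thresh and abs(leg_velocity) > vel_thresh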
2013
Authors
Silva, H; Silva, E; Bernardino, A;
Publication
PROCEEDINGS OF THE 2013 13TH INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS (ROBOTICA)
Abstract
Visual Odometry is one of the most powerful, yet challenging, means of estimating robot ego-motion. By grounding perception to the static features in the environment, vision is able, in principle, to prevent the estimation bias rather common in other sensory modalities such as inertial measurement units or wheel odometers. We present a novel approach to ego-motion estimation of a mobile robot using a 6D Visual Odometry Probabilistic Approach. Our approach exploits the complementarity of dense optical flow methods and sparse feature-based methods to achieve 6D estimation of vehicle motion. A dense probabilistic method is used to robustly estimate the epipolar geometry between two consecutive stereo pairs; a sparse feature stereo approach to estimate feature depth; and an Absolute Orientation method such as Procrustes to estimate the global scale factor. We tested the proposed method on a known dataset and compared our 6D Visual Odometry Probabilistic Approach, without filtering techniques, against an implementation that uses the well-known 5-point RANSAC algorithm. Moreover, a comparison with an Inertial Measurement Unit (RTK-GPS) is also performed, providing a more detailed evaluation of the method against ground-truth information.
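For reference, the standard Absolute Orientation (Procrustes/Umeyama) solution aligns two 3D point sets with a similarity transform, which is how the global scale factor can be recovered. The sketch below is the textbook SVD-based solution, shown as context for the abstract rather than as the paper's exact code.

import numpy as np

def absolute_orientation(src, dst):
    """Least-squares similarity transform (scale s, rotation R, translation t)
    mapping src -> dst, both (N, 3) arrays (Procrustes/Umeyama)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    cov = dst_c.T @ src_c / len(src)
    U, D, Vt = np.linalg.svd(cov)
    S = np.eye(3)
    if np.linalg.det(U) * np.linalg.det(Vt) < 0:
        S[2, 2] = -1.0                       # guard against reflections
    R = U @ S @ Vt
    var_src = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(D) @ S) / var_src   # global scale factor
    t = mu_d - s * R @ mu_s
    return s, R, t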
2013
Authors
Almeida, J; Dias, A; Martins, A; Sequeira, J; Silva, E;
Publication
INTERNATIONAL JOURNAL OF ADVANCED ROBOTIC SYSTEMS
Abstract
This work addresses the problem of traction control in mobile wheeled robots, in the particular case of the RoboCup Middle Size League (MSL). The slip control problem is formulated using simple friction models for the ISePorto Team robots, which have a differential wheel configuration. Traction was also characterized experimentally in the MSL scenario for relevant game events. This work proposes a hierarchical traction control architecture which relies on local slip detection and control at each wheel, with relevant information being relayed to a higher level responsible for global robot motion control. A dedicated one-axis embedded control hardware subsystem was developed, allowing complex local control, high-frequency current sensing and odometric information processing. This local axis control board is integrated in a distributed system using CAN bus communications. The slip observer was implemented in the axis control hardware nodes integrated in the ISePorto robots and was used to detect and control loss of traction. An external vision system was used to perform a qualitative analysis of the slip detection, and observer performance results are presented.
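A minimal way to picture local slip detection at a wheel is to compare the wheel's encoder-derived linear speed against the expected speed at that wheel from the robot body motion estimate, and flag slip when the mismatch is large. The sketch below illustrates only this comparison; the paper's observer runs on the axis control board and also uses current sensing, so the names and threshold here are assumptions.

# Hypothetical per-wheel slip check (illustrative only).
def wheel_slip(encoder_omega, wheel_radius, body_velocity_at_wheel, thresh=0.2):
    """Flag slip when the encoder-derived wheel speed deviates from the
    expected speed at the wheel contact point by more than `thresh` (relative)."""
    v_wheel = encoder_omega * wheel_radius
    denom = max(abs(v_wheel), abs(body_velocity_at_wheel), 1e-3)
    return abs(v_wheel - body_velocity_at_wheel) / denom > thresh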