2017
Authors
Costa, CM; Veiga, G; Sousa, A; Nunes, S;
Publication
2017 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
Teaching industrial robots by demonstration can significantly decrease the repurposing costs of assembly lines worldwide. To achieve this goal, the robot needs to detect and track each component with high accuracy. To speed up the initial object recognition phase, the learning system can gather information from assembly manuals in order to identify which parts and tools are required for assembling a new product (avoiding an exhaustive search in a large model database) and, if possible, also extract the assembly order and the spatial relations between them. This paper presents a detailed analysis of the fine-tuning of the Stanford Named Entity Recognizer for this text tagging task. Starting from the recommended configuration, 91 tests were performed targeting the main features / parameters. Each test changed only a single parameter relative to the recommended configuration, and its goal was to assess the impact of the new configuration on the precision, recall, and F1 metrics. This analysis allowed the Stanford NER system to be fine-tuned, achieving a precision of 89.91%, a recall of 83.51%, and an F1 of 84.69%. These results were obtained with our new manually annotated dataset containing text with assembly operations for alternators, gearboxes, and engines, written in a language discourse that ranges from professional to informal. The dataset can also be used to evaluate other information extraction and computer vision systems, since most assembly operations have pictures and diagrams showing the necessary product parts, their assembly order, and their relative spatial disposition. © 2017 IEEE.
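The precision/recall/F1 metrics the study sweeps over can be computed at the entity level, as sketched below. This is an illustrative toy, not the paper's evaluation code: the span offsets, labels, and example sentence are made up.

```python
# Hypothetical sketch of the evaluation behind a single configuration of
# the parameter study: entity-level precision, recall, and F1.
# The example data below is illustrative, not from the paper's dataset.

def prf1(gold, pred):
    """Entity-level precision/recall/F1 over sets of (start, end, label) spans."""
    gold, pred = set(gold), set(pred)
    tp = len(gold & pred)                       # exact span + label matches
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(gold) if gold else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Toy annotated sentence: "attach the alternator pulley with a torque wrench"
gold = [(11, 28, "PART"), (36, 49, "TOOL")]
pred = [(11, 28, "PART"), (30, 49, "TOOL")]    # tool span boundary is wrong

p, r, f = prf1(gold, pred)
print(f"P={p:.2f} R={r:.2f} F1={f:.2f}")       # P=0.50 R=0.50 F1=0.50
```

Comparing this triple across runs that differ in exactly one parameter isolates that parameter's effect, which is the experimental design the abstract describes.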
2017
Authors
Costa, V; Rossetti, R; Sousa, A;
Publication
INTERNATIONAL JOURNAL OF TECHNOLOGY AND HUMAN INTERACTION
Abstract
Interest in robotics as a teaching tool to promote the STEM areas has grown in recent years. Finding solutions that promote robotics is a major challenge, and the use of real robots always increases costs. An alternative is the use of a simulator. This paper presents the construction of a simulator for the Portuguese Autonomous Driving Competition, using Gazebo as the 3D simulator and ROS as the middleware connection, to promote, attract, and enthuse university students about mobile robotics challenges. It is intended to take advantage of a competitive mindset to overcome some of the obstacles students face when designing a real system. The proposed simulator focuses on the autonomous driving competition tasks, such as semaphore recognition, localization, and motion control. An evaluation of the simulator is also performed, yielding an absolute error of 5.11% and a relative error of 2.76% in the best-case odometry tests, an accuracy of 99.37% in the semaphore recognition tests, and an average error of 1.8 pixels in the FOV tests.
2017
Authors
Costa, V; Resende, J; Sousa, P; Sousa, A; Lau, N; Reis, L;
Publication
10TH INTERNATIONAL CONFERENCE OF EDUCATION, RESEARCH AND INNOVATION (ICERI2017)
Abstract
Autonomous vehicles are an important research topic, visually appealing to the public and attractive to educators and researchers. The autonomous driving competition in the Portuguese Robotics Open tries to take advantage of this context, but concerns arise from the lack of participants. Participants mention the complexity of the issues related to the challenge, the space occupied by the track, and the budget needed for participation. This paper takes advantage of a realistic simulator under Gazebo/ROS, studies a new track design, and proposes a change to the track. The analysis presented tries to ascertain whether the new design facilitates the learning process intended for participants while keeping the visual appeal for both the general public and the participants. The proposed setup for the rules and simulator is expected to address the mentioned concerns. The rule modifications and the simulator are evaluated and tested, hinting that the expected learning outcomes are encouraged and the area occupied by the track is reduced. Learning includes mobile robotics (discrete event systems and continuous control), real-time artificial vision systems (2D flat image recognition and processing of real-world imagery seen in 3D perspective), general real-world robotics such as mechanics, control, programming, batteries, and systems thinking, as well as transversal skills such as team cooperation, soft skills, etc. The results shown hint that the new track and realistic simulation are promising for fostering learning and will hopefully attract more competing teams.
2017
Authors
Moreira, AP; Costa, P; Gonçalves, J; Faria, BM;
Publication
Lecture Notes in Electrical Engineering
Abstract
This paper describes a laboratory control theory experiment supported by the use of a DC Motor Educational Kit. The impact, as a teaching aid, of the proposed laboratory control experiment is evaluated, taking into account the students' feedback. The DC motor used in the developed educational kit is the EMG30, a low-cost 12 V motor equipped with encoders and a 30:1 reduction gearbox. The experiment is based on real hardware and on simulation, using the SimTwo realistic simulation software. In order to implement the realistic simulation, the EMG30 model was obtained. Students' feedback was acquired using a questionnaire, and the results confirmed the importance given to these practical experiments. © Springer International Publishing Switzerland 2017.
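The abstract says an EMG30 model was identified for the SimTwo simulation but does not give its form. A common first approximation for a geared DC motor's speed response is a first-order model, sketched below; the gain and time constant are made-up values, not the identified EMG30 parameters.

```python
# Illustrative sketch only: a first-order speed model
#   omega' = (K*u - omega) / tau
# integrated with forward Euler, as a stand-in for the kind of motor model
# used in a realistic simulator. K and tau below are assumed, not measured.

def step_response(K, tau, u, dt, t_end):
    """Euler-integrate the first-order speed model for a constant voltage u."""
    omega, t, out = 0.0, 0.0, []
    while t <= t_end:
        out.append((t, omega))
        omega += dt * (K * u - omega) / tau
        t += dt
    return out

trace = step_response(K=18.0, tau=0.15, u=12.0, dt=0.001, t_end=1.0)
print(f"final speed ~ {trace[-1][1]:.1f} rad/s")  # approaches K*u = 216
```

Fitting K and tau to encoder data from a voltage step is the usual identification route for this kind of kit, and the resulting model is what a realistic simulator replays.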
2017
Authors
Lima, J; Pereira, AI; Costa, P; Pinto, A; Costa, P;
Publication
PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON NUMERICAL ANALYSIS AND APPLIED MATHEMATICS 2016 (ICNAAM-2016)
Abstract
This paper describes an optimization procedure for a robot with 12 degrees of freedom that avoids the inverse kinematics problem, which is a hard task for this type of robot manipulator. This robot can be used for pick-and-place tasks in complex designs. By combining an accurate and fast direct kinematics model with optimization strategies, it is possible to find the joint angles for a desired end-effector position and orientation. The optimization methods used were the Stretched Simulated Annealing algorithm and a genetic algorithm. The solutions found were validated using data from a real robot and from a simulated robot formed by 12 servomotors with a gripper.
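The core idea, optimizing over joint angles and scoring each candidate with the cheap direct kinematics model instead of solving inverse kinematics, can be sketched on a stand-in problem. A planar 2-link arm and plain simulated annealing replace the paper's 12-DOF robot and Stretched Simulated Annealing; every number here is made up.

```python
# Minimal sketch, under stated assumptions: plain simulated annealing over
# joint angles, with the direct kinematics of a 2-link planar arm as the
# cost oracle. Not the paper's robot or its Stretched Simulated Annealing.
import math
import random

L1, L2 = 1.0, 1.0  # assumed link lengths

def fk(q):
    """Direct kinematics: joint angles -> end-effector (x, y)."""
    x = L1 * math.cos(q[0]) + L2 * math.cos(q[0] + q[1])
    y = L1 * math.sin(q[0]) + L2 * math.sin(q[0] + q[1])
    return x, y

def cost(q, target):
    x, y = fk(q)
    return math.hypot(x - target[0], y - target[1])

def anneal(target, iters=20000, seed=0):
    rng = random.Random(seed)
    q = [rng.uniform(-math.pi, math.pi) for _ in range(2)]
    cur, temp = cost(q, target), 1.0
    for i in range(iters):
        cand = [a + rng.gauss(0.0, 0.3 * temp) for a in q]
        c = cost(cand, target)
        # accept improvements always; worse moves with decaying probability
        if c < cur or rng.random() < math.exp((cur - c) / max(temp, 1e-9)):
            q, cur = cand, c
        temp = 1.0 - i / iters  # linear cooling schedule
    return q, cur

q, err = anneal(target=(1.2, 0.8))
print(f"position error = {err:.4f}")
```

Because only forward kinematics is evaluated, the same scheme scales to high-DOF chains where closed-form inverse kinematics is impractical, which is the situation the abstract describes.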
2017
Authors
Pinto, AM; Costa, PG; Correia, MV; Matos, AC; Moreira, AP;
Publication
ROBOTICS AND AUTONOMOUS SYSTEMS
Abstract
Recent advances in visual motion detection and interpretation have enabled the rise of new robotic systems for autonomous and active surveillance. In this line of research, the current work discusses motion perception by proposing a novel technique that analyzes dense flow fields and distinguishes several regions with distinct motion models. The method is called Wise Optical Flow Clustering (WOFC) and extracts the moving objects by performing two consecutive operations: evaluating and resetting. Motion properties of the flow field are retrieved and described in the evaluation phase, which provides high-level information about the spatial segmentation of the flow field. During the resetting operation, these properties are combined and used to feed a guided segmentation approach. The WOFC requires information about the number of motion models and, therefore, this paper introduces a model selection method based on a Bayesian approach that balances the model's fitness and complexity. It combines the correlation of a histogram-based analysis with the decay ratio of the normalized entropy criterion. This approach interprets the flow field and gives an estimate of the number of moving objects. The experiments conducted in a realistic environment have shown that the WOFC presents several advantages that meet the requirements of common robotic and surveillance applications: it is computationally efficient and provides a pixel-wise segmentation, in contrast to other state-of-the-art methods.
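The underlying task, grouping the vectors of a dense flow field into a known number of motion models, can be illustrated with a toy. Plain k-means over (u, v) flow vectors stands in for WOFC's evaluation/resetting pipeline, whose details the abstract does not give; the synthetic field and k below are made up.

```python
# Hypothetical toy, not WOFC itself: cluster 2-D flow vectors into k motion
# models with plain k-means, given k (which WOFC's Bayesian model selection
# would estimate from the field).
import math
import random

def kmeans(flows, k, iters=50, seed=1):
    """Cluster 2-D flow vectors; returns (centroids, labels)."""
    rng = random.Random(seed)
    centroids = rng.sample(flows, k)
    labels = [0] * len(flows)
    for _ in range(iters):
        for i, (u, v) in enumerate(flows):           # assignment step
            labels[i] = min(range(k),
                            key=lambda j: math.hypot(u - centroids[j][0],
                                                     v - centroids[j][1]))
        for j in range(k):                           # update step
            members = [flows[i] for i in range(len(flows)) if labels[i] == j]
            if members:
                centroids[j] = (sum(u for u, _ in members) / len(members),
                                sum(v for _, v in members) / len(members))
    return centroids, labels

# Synthetic field: static background (~0, 0) plus one object moving right.
data = random.Random(7)
field = [(data.gauss(0, 0.05), data.gauss(0, 0.05)) for _ in range(50)] \
      + [(data.gauss(3, 0.05), data.gauss(0, 0.05)) for _ in range(50)]
centroids, labels = kmeans(field, k=2)
print(sorted(round(c[0]) for c in centroids))  # ~[0, 3]
```

Each recovered centroid plays the role of one motion model (here a pure translation); assigning every pixel's flow vector to its nearest model yields the pixel-wise segmentation the abstract highlights.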