2014
Authors
dos Santos, FN; Costa, P; Moreira, AP;
Publication
2014 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
Recognizing a place at a glance is the first capacity humans use to understand where they are. Making this capacity available to robots increases the redundancy of the localization systems available on a robot and improves semantic localization. However, achieving this capacity requires a robust visual place recognition procedure that can be used by an indoor robot. This paper presents an approach that estimates the robot's location in the semantic space from a single image. The approach extracts a global descriptor from each camera image, which is the input of a Support Vector Machine classifier. In order to improve the classifier accuracy, a Markov chain formalism was considered to constrain the probability flow according to the place connections. The approach was tested, with and without the Markov chain filter, using videos acquired from three robots in three different indoor scenarios. The use of the Markov chain filter showed a significant improvement in the accuracy of the approach.
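A minimal sketch of the filtering idea, assuming a discrete Bayes-filter reading of the Markov chain step; the place topology, transition matrix and SVM posteriors below are illustrative, not the paper's formulation:

```python
import numpy as np

# Hypothetical topology of three places: 0=corridor, 1=office, 2=kitchen.
# The office and the kitchen are only reachable through the corridor.
adjacency = np.array([[1, 1, 1],
                      [1, 1, 0],
                      [1, 0, 1]], dtype=float)

# Row-normalise the adjacency (which already includes self-loops) into a transition matrix.
transition = adjacency / adjacency.sum(axis=1, keepdims=True)

def markov_filter(svm_probs, transition, prior=None):
    """Fuse per-frame SVM class probabilities with the place-connectivity prior.

    svm_probs: (T, N) array, one row of SVM posteriors per camera frame.
    Returns the filtered (T, N) beliefs.
    """
    n = transition.shape[0]
    belief = np.full(n, 1.0 / n) if prior is None else prior
    filtered = []
    for obs in svm_probs:
        belief = transition.T @ belief      # predict: spread mass along place connections
        belief = belief * obs               # update: weight by the SVM evidence
        belief /= belief.sum()              # renormalise
        filtered.append(belief.copy())
    return np.vstack(filtered)

# Example: the third frame's SVM output favours the kitchen while the robot is
# believed to be in the office; it is suppressed because the kitchen is not
# directly connected to the office in the topology above.
frames = np.array([[0.1, 0.8, 0.1],
                   [0.1, 0.7, 0.2],
                   [0.1, 0.2, 0.7],
                   [0.1, 0.8, 0.1]])
print(markov_filter(frames, transition).argmax(axis=1))
```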
2014
Authors
Pinto, AM; Costa, PG; Moreira, AP;
Publication
2014 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
This research presents an innovative mobile robotic system designed for active surveillance operations. The mobile robot moves along a rail and is equipped with a monocular camera, which enhances the surveillance capability when compared to conventional systems (mainly composed of multiple static cameras). In addition, the paper proposes a technique for multi-object tracking called MTMP (Multi-Tracking of Motion Profiles). The MTMP resorts to a formulation based on the Kalman filter and tracks several moving objects using motion profiles. A motion profile is characterized by the dominant flow vector and is computed from the optical flow signature after removal of outliers. A similarity measure based on the Mahalanobis distance is used by the MTMP to associate the moving objects across frames. The experiments conducted in realistic environments have shown that the static perception mode of the proposed robot is able to detect and track multiple moving objects in a short period of time and without using specialized computers. In addition, the MTMP exhibits good computational performance, taking less than 5 milliseconds to compute. The results therefore show that the estimation of motion profiles is suitable for analyzing motion in image sequences.
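A minimal sketch of the association step, assuming a constant-velocity Kalman model and a chi-squared gate on the Mahalanobis distance; the state layout, noise values and greedy matching below are illustrative, not the MTMP implementation:

```python
import numpy as np

class Track:
    """Constant-velocity Kalman track over the 2-D position of a motion profile."""
    def __init__(self, xy):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])              # state: x, y, vx, vy
        self.P = np.eye(4) * 10.0                                 # state covariance
        self.F = np.eye(4); self.F[0, 2] = self.F[1, 3] = 1.0     # motion model (dt = 1 frame)
        self.H = np.zeros((2, 4)); self.H[0, 0] = self.H[1, 1] = 1.0
        self.Q = np.eye(4) * 0.1                                  # process noise
        self.R = np.eye(2) * 1.0                                  # measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def mahalanobis(self, z):
        S = self.H @ self.P @ self.H.T + self.R                   # innovation covariance
        nu = z - self.H @ self.x                                  # innovation
        return float(nu @ np.linalg.inv(S) @ nu)                  # squared Mahalanobis distance

    def update(self, z):
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)                  # Kalman gain
        self.x = self.x + K @ (z - self.H @ self.x)
        self.P = (np.eye(4) - K @ self.H) @ self.P

def associate(tracks, detections, gate=9.21):                     # chi-squared gate, 2 dof, 99%
    """Greedy nearest-neighbour association of detected motion profiles to tracks."""
    for t in tracks:
        t.predict()
    unmatched = list(range(len(detections)))
    for t in tracks:
        if not unmatched:
            break
        d2 = [t.mahalanobis(detections[j]) for j in unmatched]
        j = int(np.argmin(d2))
        if d2[j] < gate:
            t.update(detections[unmatched.pop(j)])
    # any detection still unmatched spawns a new track
    return [Track(detections[j]) for j in unmatched]
```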
2014
Authors
Goncalves, J; Batista, J; Costa, P;
Publication
2014 IEEE EMERGING TECHNOLOGY AND FACTORY AUTOMATION (ETFA)
Abstract
This paper describes the prototyping of an instrumented chair that fully automates the assessment of the "Timed Up and Go", "30-Second Chair Stand" and "Hand-Force" tests. The presented functional chair prototype is a low-cost approach that uses inexpensive sensors and the Arduino platform as the data acquisition board, with its software developed in LabVIEW. The "Timed Up and Go" test consists of measuring the time taken to stand up from a chair, walk three meters as fast as possible without running, turn around a cone and return to the initial position. The "30-Second Chair Stand" test consists of counting the number of completed chair stands in 30 seconds. These are agility, strength and endurance tests that are easy to set up and execute, although they lack repeatability whenever the measurements are taken manually, due to the gross errors that are introduced. The "Hand-Force" test consists of measuring hand strength; the relevant data are the peak and average values over several trials. These data are important for evaluating the results of hand rehabilitation treatment.
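A minimal sketch of how the automated measures could be derived from the chair's sensor streams; the seat-occupancy signal, sampling rate and function names below are hypothetical, not the LabVIEW application:

```python
import numpy as np

def count_chair_stands(seat_occupied, window_s=30.0, rate_hz=50.0):
    """Count completed stand-ups in a 30-second boolean seat-occupancy trace."""
    n = int(window_s * rate_hz)
    s = np.asarray(seat_occupied[:n], dtype=int)
    # a completed stand is a transition from occupied (1) to empty (0)
    return int(np.sum((s[:-1] == 1) & (s[1:] == 0)))

def hand_force_summary(force_samples):
    """Peak and average force over one hand-squeeze trial."""
    f = np.asarray(force_samples, dtype=float)
    return float(f.max()), float(f.mean())
```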
2014
Authors
Novais, S; Pinho, D; Bento, D; Pinto, E; Yaginuma, T; Fernandes, CS; Garcia, V; Pereira, AI; Lima, J; Mujika, M; Dias, R; Arana, S; Lima, R;
Publication
Lecture Notes in Computational Vision and Biomechanics
Abstract
In this chapter we discuss the cell-free layer (CFL) that develops adjacent to the wall of microgeometries containing complex features representative of the microcirculation, such as contractions, expansions, bifurcations and confluences. The microchannels with the different geometries were made of polydimethylsiloxane (PDMS), and optical techniques were used to evaluate the cell-free layer for red blood cell (RBC) suspensions with different hematocrits (Hct). The images were captured using a high-speed video microscopy system, and the thickness of the cell-free layer was measured using both manual and automatic image analysis techniques. The results show that, in the in vitro microcirculation, the hematocrit and the geometrical configuration have a major impact on the CFL thickness. In particular, the thickness of the cell-free layer increases as the fluid flows through a contraction–expansion sequence, and this increase is enhanced at lower hematocrits. In contrast, the flow rates tested in these studies did not show a clear influence on the CFL thickness. © Springer Science+Business Media Dordrecht 2014.
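A minimal sketch of an automatic CFL measurement, assuming a binary mask of RBC pixels with the channel wall along the top image row; the function and its parameters are illustrative, not the analysis tool used in the study:

```python
import numpy as np

def cfl_thickness_um(rbc_mask, pixel_size_um):
    """Mean distance (in micrometres) from the wall (row 0) to the first RBC pixel."""
    mask = np.asarray(rbc_mask, dtype=bool)
    rows, cols = mask.shape
    thickness_px = np.full(cols, rows, dtype=float)   # columns with no RBCs span the whole frame
    for c in range(cols):
        hit = np.flatnonzero(mask[:, c])
        if hit.size:
            thickness_px[c] = hit[0]
    return float(thickness_px.mean() * pixel_size_um)
```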
2014
Authors
Rocha, LF; Veiga, G; Ferreira, M; Paulo Moreira, AP; Santos, V;
Publication
2014 IEEE INTERNATIONAL CONFERENCE ON AUTONOMOUS ROBOT SYSTEMS AND COMPETITIONS (ICARSC)
Abstract
Nowadays, entering the highly competitive international market has become a key strategy for the survival and sustained growth of enterprises in the Portuguese textile and footwear industrial sector. Therefore, to face new requirements, companies need to understand that technological innovation is a key issue. In this scenario, the research presented in this paper focuses on the development of a robot-based conveyor-line pick-and-place solution for the automatic collection of shoe lasts. The solution developed consists of extracting the 3D model of the shoe last support transported on the conveyor line and aligning it, using the Iterative Closest Point (ICP) algorithm, with a previously recorded template model. A camera-laser triangulation system was the approach selected to extract the 3D model. With the correct estimation of the position and orientation of the footwear on the conveyor, the pick-and-place task can be executed using an industrial manipulator. The practical implication of this work is that it contributes to improving the efficiency of footwear production lines, so that demands can be met in shorter periods of time and with high quality standards. This work was developed in partnership with the Portuguese company CEI by ZIPOR.
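A minimal sketch of the alignment step, using Open3D's point-to-point ICP as a stand-in for the paper's pipeline; the point clouds, voxel size and correspondence distance below are assumptions:

```python
import numpy as np
import open3d as o3d

def estimate_pose(scan_xyz, template_xyz, voxel=2.0, max_corr=10.0):
    """Return the 4x4 transform mapping the template onto the scan (in the scan's units)."""
    scan = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(scan_xyz))
    tmpl = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(template_xyz))
    scan, tmpl = scan.voxel_down_sample(voxel), tmpl.voxel_down_sample(voxel)

    # crude initial guess: align the centroids before running ICP
    init = np.eye(4)
    init[:3, 3] = scan.get_center() - tmpl.get_center()

    result = o3d.pipelines.registration.registration_icp(
        tmpl, scan, max_corr, init,
        o3d.pipelines.registration.TransformationEstimationPointToPoint())
    return result.transformation, result.fitness

# The returned transform gives the pose correction the manipulator would need
# to pick the shoe last from the conveyor.
```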
2014
Authors
Rocha, LF; Ferreira, M; Santos, V; Moreira, AP;
Publication
ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING
Abstract
The research work presented in this paper focuses on the development of a 3D object localization and recognition system to be used in robotic conveyor coating lines. The requirements were specified together with enterprises with small production series seeking full robotic automation of production lines characterized by a wide range of products being manufactured simultaneously. Their production processes (for example heat or coating/painting treatments) limit the use of conventional identification systems attached to the object at hand. Furthermore, the mechanical structure of the conveyor introduces geometric inaccuracy in the object positioning. With the correct classification and localization of the object, the robot is able to autonomously select the right program to execute and to perform coordinate system corrections. A cascade system combining a Support Vector Machine and the Perfect Match (point-cloud geometric template matching) algorithm was developed for this purpose, achieving 99.5% accuracy. The entire recognition and pose estimation procedure is performed within a maximum of 3 s on standard off-the-shelf hardware. It is expected that this work contributes to the integration of industrial robots in highly dynamic and specialized production lines.
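A minimal sketch of the cascade structure, with a toy global descriptor and a centroid-offset placeholder in place of the Perfect Match pose refinement; the names and features below are illustrative, not the paper's implementation:

```python
import numpy as np
from sklearn.svm import SVC

def shape_descriptor(cloud_xyz, bins=16):
    """Toy global descriptor: histogram of point distances to the cloud centroid."""
    d = np.linalg.norm(cloud_xyz - cloud_xyz.mean(axis=0), axis=1)
    hist, _ = np.histogram(d, bins=bins, range=(0.0, d.max() + 1e-9), density=True)
    return hist

class RecognizeAndLocate:
    def __init__(self, templates):
        self.templates = templates                      # {label: template point cloud}
        self.svm = SVC(probability=True)

    def fit(self, clouds, labels):
        # stage 1: train the SVM on global descriptors of labelled example clouds
        self.svm.fit([shape_descriptor(c) for c in clouds], labels)

    def __call__(self, cloud):
        # stage 1: classify which object is on the conveyor
        label = self.svm.predict([shape_descriptor(cloud)])[0]
        # stage 2: match the scan against that class's template
        # (placeholder: centroid offset instead of a full geometric template matcher)
        offset = cloud.mean(axis=0) - self.templates[label].mean(axis=0)
        return label, offset
```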