Publications

Publications by CRAS

2014

Simulation Environment for Multi-robot Cooperative 3D Target Perception

Authors
Dias, A; Almeida, J; Dias, N; Lima, P; Silva, E;

Publication
SIMULATION, MODELING, AND PROGRAMMING FOR AUTONOMOUS ROBOTS (SIMPAR 2014)

Abstract
Field experiments with a team of heterogeneous robots require human and hardware resources that cannot be mobilised in a straightforward manner. Therefore, simulation environments are viewed by the robotics community as a powerful tool that can be used as an intermediate step to evaluate and validate developments prior to their integration in real robots. This paper evaluates a novel multi-robot heterogeneous cooperative perception framework based on monocular measurements under the MORSE robotic simulation environment. The simulations are performed in an outdoor environment using a team of Micro Aerial Vehicles (MAVs) and an Unmanned Ground Vehicle (UGV) performing distributed cooperative perception based on monocular measurements. The goal is to estimate the 3D target position.
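In its simplest form, the cooperative perception problem above reduces to fusing bearing-only (monocular) observations from several robots into a single 3D target estimate. The following is a minimal least-squares triangulation sketch, not the paper's method; the robot poses and target position are hypothetical numbers chosen for illustration:

```python
import numpy as np

def triangulate_target(origins, directions):
    """Least-squares estimate of the 3D point closest to a set of
    bearing rays (origin + direction), e.g. monocular bearings
    from several robots observing the same target."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for o, d in zip(origins, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projector orthogonal to the ray
        A += P
        b += P @ o
    return np.linalg.solve(A, b)

# Hypothetical scenario: a MAV above and a UGV on the ground both
# observe a target at (5, 5, 0).
target = np.array([5.0, 5.0, 0.0])
mav = np.array([0.0, 0.0, 10.0])
ugv = np.array([10.0, 0.0, 0.0])
est = triangulate_target([mav, ugv], [target - mav, target - ugv])
print(np.round(est, 6))  # -> [5. 5. 0.]
```

With noise-free bearings the rays intersect exactly at the target; with noisy bearings the same formulation returns the point minimising the sum of squared distances to all rays, which is why a multi-robot (multi-ray) setup improves the 3D estimate.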

2014

TURTLE - Systems and technologies for Deep Ocean long term presence

Authors
Ferreira, H; Martins, A; Almeida, JM; Valente, A; Figueiredo, A; da Cruz, B; Camilo, M; Lobo, V; Pinho, C; Olivier, A; Silva, E;

Publication
2014 OCEANS - ST. JOHN'S

Abstract
This paper describes the TURTLE project, which aims to develop sub-systems with the capability of deep-sea long-term presence. Our motivation is to produce new energy-efficient robotic ascent and descent technologies to be incorporated in robotic vehicles used by civil and military stakeholders for underwater operations. TURTLE contributes to sustainable presence and operations on the sea bottom. Long-term presence on the sea bottom, together with increased awareness and operation capabilities underwater, and in particular in benthic deeps, can only be achieved through advanced technologies that automate operations, reduce operational costs, and increase the efficiency of human activity.

2014

Probabilistic Stereo Egomotion Transform

Authors
Silva, H; Silva, E; Bernardino, A;

Publication
2014 IEEE INTERNATIONAL CONFERENCE ON ROBOTICS AND AUTOMATION (ICRA)

Abstract
In this paper we propose a novel, fully probabilistic solution to the stereo egomotion estimation problem. We extend the notion of probabilistic correspondence to the stereo case, which allows us to compute the full 6D motion information in a probabilistic way. We compare the developed approach against other state-of-the-art methods for stereo egomotion estimation, and the results compare favorably for both linear and angular velocity estimation.

2014

A Flow-based Motion Perception Technique for an Autonomous Robot System

Authors
Pinto, AM; Moreira, AP; Correia, MV; Costa, PG;

Publication
JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS

Abstract
Visual motion perception from a moving observer is the case most often encountered in real-life situations. It is a complex and challenging problem that can nevertheless enable new applications. This article presents an innovative autonomous robotic system designed for active surveillance, together with a dense optical flow technique. Several optical flow techniques have been proposed for motion perception; however, most of them are too computationally demanding for autonomous mobile systems. The proposed HybridTree method is able to identify the intrinsic nature of the motion by performing two consecutive operations: expectation and sensing. During the expectation phase, descriptive properties of the image are retrieved using a tree-based scheme. In the sensing operation, the properties of image regions are used by a hybrid and hierarchical optical flow structure to estimate the flow field. The experiments show that the proposed method extracts reliable visual motion information in a short period of time and is well suited to applications that do not have specialized computing devices. The HybridTree thus differs from other techniques in that it introduces a new perspective on motion perception: high-level information about the image sequence is integrated into the estimation of the optical flow. In addition, it meets most robotic and surveillance demands, and computing the resulting flow field is less demanding than with other state-of-the-art methods.

2014

Enhancing dynamic videos for surveillance and robotic applications: The robust bilateral and temporal filter

Authors
Pinto, AM; Costa, PG; Correia, MV; Moreira, AP;

Publication
SIGNAL PROCESSING-IMAGE COMMUNICATION

Abstract
Over the last few decades, surveillance applications have been an extremely useful tool to prevent dangerous situations and to identify abnormal activities. However, the majority of surveillance videos are subjected to different kinds of noise that corrupt structured patterns and fine edges. This makes image processing tasks such as object detection, motion segmentation, tracking, and the identification and recognition of humans even more difficult. This paper proposes a novel filtering technique named robust bilateral and temporal (RBLT), which resorts to the spatial and temporal evolution of sequences to conduct the filtering process while preserving relevant image information. A pixel value is estimated using a robust combination of the spatial characteristics of the pixel's neighborhood and its own temporal evolution. Thus, robust statistics concepts and the temporal correlation between consecutive images are incorporated together, resulting in a reliable and configurable filter formulation that makes it possible to reconstruct highly dynamic and degraded image sequences. The filtering is evaluated using qualitative judgments and several assessment metrics, for different Gaussian and salt-and-pepper noise conditions. Extensive experiments considering videos obtained by stationary and non-stationary cameras show that the proposed technique achieves good perceptual quality when filtering sequences corrupted with a strong noise component.
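The core idea of estimating a pixel from a robust combination of its spatial neighborhood and its temporal evolution can be illustrated with a toy filter. This is an illustrative analogue under simplifying assumptions, not the authors' RBLT formulation: a range-weighted 3x3 spatial average (bilateral-style) blended with the previous frame only where the two frames agree, so the temporal term is suppressed where change is detected:

```python
import numpy as np

def spatio_temporal_filter(curr, prev, sigma_r=20.0, alpha=0.5):
    """Toy spatio-temporal filter: bilateral-style spatial term plus a
    change-gated temporal term. Illustrative only, not the RBLT filter."""
    h, w = curr.shape
    pad = np.pad(curr, 1, mode='edge')
    num = np.zeros((h, w))
    den = np.zeros((h, w))
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            nb = pad[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
            # range weight: neighbors similar in intensity count more
            wgt = np.exp(-((nb - curr) ** 2) / (2 * sigma_r ** 2))
            num += wgt * nb
            den += wgt
    spatial = num / den
    # temporal weight: trust the previous frame only where it agrees
    # with the current one (i.e. where no motion/change is detected)
    wt = alpha * np.exp(-((prev - curr) ** 2) / (2 * sigma_r ** 2))
    return (1 - wt) * spatial + wt * prev

# Denoising a constant image corrupted by Gaussian noise:
rng = np.random.default_rng(0)
clean = np.full((32, 32), 100.0)
noisy = clean + rng.normal(0, 10, clean.shape)
prev = clean + rng.normal(0, 10, clean.shape)
out = spatio_temporal_filter(noisy, prev)
```

Because the spatial and temporal noise realisations are independent, combining both terms reduces noise more than either alone, which is the intuition behind exploiting temporal correlation between consecutive frames.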

2014

Unsupervised flow-based motion analysis for an autonomous moving system

Authors
Pinto, AM; Correia, MV; Paulo Moreira, AP; Costa, PG;

Publication
IMAGE AND VISION COMPUTING

Abstract
This article discusses motion analysis based on dense optical flow fields for a new generation of robotic moving systems with real-time constraints. It focuses on a surveillance scenario where a specially designed autonomous mobile robot uses a monocular camera to perceive motion in the environment. Computational resources and processing time are two of the most critical aspects in robotics; therefore, two non-parametric techniques are proposed, namely the Hybrid Hierarchical Optical Flow Segmentation and the Hybrid Density-Based Optical Flow Segmentation. Both methods are able to extract the moving objects by performing two consecutive operations: refining and collecting. During the refining phase, the flow field is decomposed into a set of clusters based on descriptive motion properties. These properties are used in the collecting stage by a hierarchical or density-based scheme to merge the clusters that represent different motion models. In addition, a model selection method is introduced. This novel method analyzes the flow field and estimates the number of distinct moving objects using a Bayesian formulation. The research evaluates the performance achieved by the methods in a realistic surveillance situation. The experiments conducted show that the proposed methods extract reliable motion information in real time and without using specialized computers. Moreover, the resulting segmentation is less computationally demanding compared to other recent methods, making them suitable for most robotic and surveillance applications.
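The two-stage refine-and-collect structure described above can be sketched as a toy segmentation. The paper's hierarchical/density-based merging and Bayesian model selection are far richer; this minimal version, with made-up block sizes and thresholds, only shows the two-stage idea of splitting the flow field and then merging regions that share a motion model:

```python
import numpy as np

def refine_collect(flow, block=8, tol=0.5):
    """Toy refine-then-collect segmentation of a dense flow field.
    'Refine': split the field into blocks, each summarised by its mean
    motion vector. 'Collect': greedily attach each block to an existing
    motion model whose mean motion is within `tol`, or start a new one.
    Illustrative only, not the paper's algorithm."""
    h, w = flow.shape[:2]
    labels = -np.ones((h // block, w // block), dtype=int)
    means = []  # one mean motion vector per discovered motion model
    for by in range(h // block):
        for bx in range(w // block):
            m = flow[by * block:(by + 1) * block,
                     bx * block:(bx + 1) * block].reshape(-1, 2).mean(0)
            for k, mk in enumerate(means):      # collect step
                if np.linalg.norm(m - mk) < tol:
                    labels[by, bx] = k
                    break
            else:                               # no model fits: new one
                labels[by, bx] = len(means)
                means.append(m)
    return labels, means

# One moving object over a static background:
flow = np.zeros((32, 32, 2))
flow[8:16, 8:16] = (3.0, 0.0)
labels, means = refine_collect(flow, block=8, tol=0.5)
print(len(means))  # -> 2 motion models: background + moving object
```

The number of discovered motion models plays the role that the Bayesian model selection step plays in the paper: estimating how many distinct moving objects the flow field contains.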
