
Publications by Rui Silva Nóbrega

2015

Balance Assessment in Fall-Prevention Oriented Exergames

Authors
Brito, M; Jacob, J; Nobrega, R; Santos, A;

Publication
ASSETS'15: PROCEEDINGS OF THE 17TH INTERNATIONAL ACM SIGACCESS CONFERENCE ON COMPUTERS & ACCESSIBILITY

Abstract
To assess the success of fall-prevention-oriented exergames, two digital games were developed taking advantage of the Wii Balance Board (WBB) capabilities. The objective was to evaluate the exergames' potential to address elderly adults' declining motivation to exercise regularly despite the benefits related to fall prevention. The system uses the WBB to keep track of the player's center of pressure and computes balance assessment measures from it, eventually providing a means to monitor patients. The presented demo will feature the two exergames.
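
As an illustration of how such measures can be derived from the board's center-of-pressure (COP) stream, the sketch below computes a few common balance indicators (sway path length, mean velocity and RMS displacement). It is a minimal example written for this page, not the authors' implementation; the 60 Hz sample rate and the choice of measures are assumptions.

```python
# Minimal sketch (not the paper's implementation): common balance measures
# computed from a sequence of (x, y) center-of-pressure samples, such as
# those streamed by a Wii Balance Board. Sample rate is assumed to be 60 Hz.
import math

def balance_measures(cop_samples, sample_rate_hz=60.0):
    """cop_samples: list of (x, y) COP positions in centimetres."""
    xs = [p[0] for p in cop_samples]
    ys = [p[1] for p in cop_samples]
    mean_x = sum(xs) / len(xs)
    mean_y = sum(ys) / len(ys)

    # Sway path length: total distance travelled by the COP during the trial.
    path = sum(math.dist(cop_samples[i - 1], cop_samples[i])
               for i in range(1, len(cop_samples)))
    duration_s = len(cop_samples) / sample_rate_hz

    return {
        "path_length_cm": path,
        "mean_velocity_cm_s": path / duration_s,
        # RMS displacement around the mean COP, per axis.
        "rms_x_cm": math.sqrt(sum((x - mean_x) ** 2 for x in xs) / len(xs)),
        "rms_y_cm": math.sqrt(sum((y - mean_y) ** 2 for y in ys) / len(ys)),
    }
```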

2017

Adaptivity and Safety in Location-Based Games

Authors
Jacob, J; Nobrega, R; Coelho, A; Rodrigues, R;

Publication
2017 9TH INTERNATIONAL CONFERENCE ON VIRTUAL WORLDS AND GAMES FOR SERIOUS APPLICATIONS (VS-GAMES)

Abstract
Location-based games require, among other things, physical activity and real-world context. Additionally, ensuring that players are assigned challenges that are adequate and safe for the current context (both physical and spatial) is also important, as it can improve both the gaming experience and the outcomes of the exercise. However, the impact adaptivity has in the specific case of location-based exergames has still not been researched in depth. In this paper, we present a location-based exergame capable of adapting its mechanics to the current context.
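
As a rough illustration of what such adaptation can look like, the sketch below filters a pool of challenges by the player's physical state and spatial safety context. The challenge pool, context fields and thresholds are hypothetical; the paper's actual adaptation rules are not reproduced here.

```python
# Hypothetical sketch of context-based challenge selection in a
# location-based exergame; fields and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class Context:
    heart_rate: int     # beats per minute, e.g. from a wearable sensor
    near_traffic: bool  # spatial safety flag derived from map data
    open_space: bool    # whether the surroundings allow fast movement

CHALLENGES = [
    {"name": "sprint_to_checkpoint", "intensity": "high",   "needs_open_space": True},
    {"name": "brisk_walk_collect",   "intensity": "medium", "needs_open_space": False},
    {"name": "balance_hold",         "intensity": "low",    "needs_open_space": False},
]

def pick_challenge(ctx: Context):
    """Return the first challenge that is both safe and adequate for the context."""
    for challenge in CHALLENGES:
        if challenge["needs_open_space"] and (ctx.near_traffic or not ctx.open_space):
            continue  # unsafe: fast movement near traffic or in a cramped space
        if challenge["intensity"] == "high" and ctx.heart_rate > 150:
            continue  # too intense for the player's current physical state
        return challenge
    return CHALLENGES[-1]  # fall back to the lowest-intensity challenge
```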

2016

Video Annotation for Immersive Journalism using Masking Techniques

Authors
Meira, J; Marques, J; Jacob, J; Nobrega, R; Rodrigues, R; Coelho, A; Augusto de Sousa, AA;

Publication
2016 23RD PORTUGUESE MEETING ON COMPUTER GRAPHICS AND INTERACTION (EPCGI)

Abstract
This paper proposes an interactive annotation technique for 360-degree videos that allows the use of traditional video editing techniques to add content to immersive videos. Using the case study of immersive journalism, the main objective is to lower the entry barrier for annotating 360-degree video pieces by providing a different annotation paradigm and a set of tools for annotation. The spread of virtual reality systems and immersive content has been growing substantially due to technological progress and cost reductions in equipment and software. Of all the technologies employed in virtual reality systems, 360-degree video is one that currently presents unique conditions to be widely used by various industries, especially for communication purposes. Of the various areas that can benefit from the usage of virtual reality systems, the communication field is one that requires innovation in the way narratives are built, especially in virtual reality systems. In the case of immersive journalism, 360-degree video technology is currently one of the mediums most used by media outlets. This kind of news content, whose innovative role should be highlighted, is still being studied in the field of journalism and needs a clearly defined set of rules and good practices. In order to improve the introduction of virtual elements in 360-degree videos, this paper proposes a set of annotation paradigms for 1) media information display and 2) narrative and attention focusing. We present a list of possible techniques that solve the problem of immersive annotation, as well as a description of a prototype that was developed to test these concepts. The prototype implements an annotation technique based on masked videos and the extension of standard subtitle file formats. Finally, a fast-track user study was conducted to evaluate the acceptance of the visualisation techniques and to refine the set of tools.
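
The abstract does not specify how the standard subtitle formats are extended, but a WebVTT-style cue carrying extra spherical placement attributes gives the flavour of such an approach. The sketch below parses one such cue; the yaw/pitch attribute names are assumptions made for this example, not the authors' actual format.

```python
# Illustrative sketch: a WebVTT-like cue extended with yaw/pitch attributes
# (in degrees) so a flat annotation can be anchored on the 360-degree sphere.
# The attribute names are assumptions, not the paper's actual format.
import re

CUE_RE = re.compile(
    r"(?P<start>\d{2}:\d{2}:\d{2}\.\d{3}) --> (?P<end>\d{2}:\d{2}:\d{2}\.\d{3})"
    r"(?P<attrs>[^\n]*)\n(?P<text>.+)",
    re.S)

def parse_cue(block: str):
    match = CUE_RE.match(block.strip())
    if match is None:
        return None
    attrs = dict(re.findall(r"(\w+):(-?\d+(?:\.\d+)?)", match.group("attrs")))
    return {
        "start": match.group("start"),
        "end": match.group("end"),
        "yaw": float(attrs.get("yaw", 0.0)),      # horizontal angle on the sphere
        "pitch": float(attrs.get("pitch", 0.0)),  # vertical angle on the sphere
        "text": match.group("text").strip(),
    }

cue = """00:00:05.000 --> 00:00:09.000 yaw:45 pitch:-10
Reporter introduces the scene."""
print(parse_cue(cue))
```

Keeping the timing syntax of an existing subtitle format means conventional editing tools can still author the cues, with the spherical placement handled only at playback time.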

2016

Visual-Inertial Based Autonomous Navigation

Authors
Martins, FD; Teixeira, LF; Nobrega, R;

Publication
ROBOT 2015: SECOND IBERIAN ROBOTICS CONFERENCE: ADVANCES IN ROBOTICS, VOL 2

Abstract
This paper presents an autonomous navigation and position estimation framework which enables an Unmanned Aerial Vehicle (UAV) to safely navigate indoor environments. The system uses the on-board Inertial Measurement Unit (IMU) and the front camera of an AR.Drone platform, together with a laptop computer where all the data is processed. The system is composed of the following modules: navigation, door detection and position estimation. For navigation, the system relies on the detection of the vanishing point using the Hough transform for wall detection and avoidance. Door detection relies not only on the detection of the door contours but also on the recesses of each door, using the latter as the main detector and the former as an additional validation for higher precision. For position estimation, the system relies on pre-coded information about the floor on which the drone is navigating and on the velocity of the drone provided by its IMU. Several flight experiments show that the drone is able to safely navigate corridors while detecting evident doors and estimating its position. The developed navigation and door detection methods are reliable and enable a UAV to fly without the need for human intervention.
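
To make the vanishing-point idea concrete, the sketch below estimates a corridor vanishing point from a grayscale camera frame with OpenCV's Hough transform and turns its horizontal offset into a yaw correction. The thresholds, the median aggregation of line intersections and the proportional gain are assumptions; the paper's full pipeline is not reproduced here.

```python
# Illustrative sketch (not the paper's implementation): vanishing-point
# estimation with the Hough transform, used to derive a yaw correction
# that keeps the drone centred in a corridor.
import cv2
import numpy as np

def vanishing_point(gray):
    edges = cv2.Canny(gray, 80, 160)
    lines = cv2.HoughLines(edges, 1, np.pi / 180, threshold=120)
    if lines is None:
        return None
    points = []
    # Intersect every pair of detected lines given in (rho, theta) form:
    # x * cos(theta) + y * sin(theta) = rho.
    for i in range(len(lines)):
        for j in range(i + 1, len(lines)):
            (r1, t1), (r2, t2) = lines[i][0], lines[j][0]
            A = np.array([[np.cos(t1), np.sin(t1)],
                          [np.cos(t2), np.sin(t2)]])
            if abs(np.linalg.det(A)) < 1e-6:
                continue  # near-parallel lines have no stable intersection
            points.append(np.linalg.solve(A, np.array([r1, r2])))
    if not points:
        return None
    return np.median(np.array(points), axis=0)  # robust estimate of the VP

def yaw_correction(gray, gain=0.002):
    vp = vanishing_point(gray)
    if vp is None:
        return 0.0
    # Steer proportionally to the offset between the VP and the image centre.
    return gain * (vp[0] - gray.shape[1] / 2.0)
```

Taking the median of the pairwise intersections is one simple way to stay robust to spurious edge lines; a RANSAC-style vote would be another option.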

2016

User Redirection and Direct Haptics in Virtual Environments

Authors
Carvalheiro, C; Nobrega, R; da Silva, H; Rodrigues, R;

Publication
MM'16: PROCEEDINGS OF THE 2016 ACM MULTIMEDIA CONFERENCE

Abstract
This paper proposes a haptic interaction system for Virtual Reality (VR) based on a combination of tracking devices for hands and objects and a real-to-virtual mapping system for user redirection. In our solution the user receives haptic stimuli by manipulating real objects mapped to virtual objects. This solution departs from systems that rely on haptic devices (e.g., haptic gloves) as interfaces for the user to interact with objects in the Virtual Environment (VE). As such, the proposed solution makes use of direct haptics (touching) and redirection techniques to guide the user through the virtual environment. Using the mapping framework, when the user touches a virtual object in the VE, he will simultaneously be physically touching the equivalent real object. A relevant feature of the framework is the possibility to define a warped mapping between the real and virtual worlds, such that the relation between the user and the virtual space can be different from the one between the user and the real space. This is particularly useful when the application requires the emulation of large virtual spaces but the physical space available is more confined. To achieve this, both the user's hands and the objects are tracked. In the presented prototype we use a head-mounted depth sensor (i.e., Leap Motion) and a depth-sensing camera (i.e., Kinect). To assess the feasibility of this solution, a functional prototype and a room setup with core functionality were implemented. The test sessions with users evaluated the mapping accuracy, the user execution time and the awareness of the user regarding the warped space when performing tasks with redirection. The results gathered indicate that the solution can be used to provide direct haptic feedback in VR applications and for warping space perception within certain limits.
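
A very simple instance of such a warped mapping is a uniform scale plus offset between the two coordinate frames; the sketch below shows that case. The paper's mapping framework is more general, so the class, the names and the scale-only warp here are assumptions for illustration.

```python
# Minimal sketch of a warped real-to-virtual mapping (uniform scale + offset);
# the paper's mapping framework is more general than this assumption.
import numpy as np

class WarpedMapping:
    def __init__(self, real_origin, virtual_origin, scale):
        self.real_origin = np.asarray(real_origin, dtype=float)
        self.virtual_origin = np.asarray(virtual_origin, dtype=float)
        self.scale = float(scale)  # > 1 makes the virtual space feel larger

    def to_virtual(self, real_point):
        """Map a tracked real-world point (metres) into virtual coordinates."""
        offset = np.asarray(real_point, dtype=float) - self.real_origin
        return self.virtual_origin + self.scale * offset

    def to_real(self, virtual_point):
        """Inverse mapping: where a virtual target sits in the real room."""
        offset = np.asarray(virtual_point, dtype=float) - self.virtual_origin
        return self.real_origin + offset / self.scale

# Example: a 3 m x 3 m room emulating a 6 m x 6 m virtual space.
mapping = WarpedMapping(real_origin=(0, 0, 0), virtual_origin=(0, 0, 0), scale=2.0)
print(mapping.to_virtual((1.5, 0.0, 1.0)))  # -> [3. 0. 2.]
```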

2013

Dynamic Insertion of Virtual Objects in Photographs

Authors
Nóbrega, R; Correia, N;

Publication
IJCICG

Abstract
