About

Rui Nóbrega is a researcher at INESC TEC and an Invited Assistant Professor at the Faculty of Engineering of the University of Porto (DEI-FEUP). He holds a Ph.D. in Computer Science (Doutoramento em Informática) and has been involved in several software projects in collaboration with private companies, public institutions and academia. He studied at the Faculty of Science and Technology of NOVA University of Lisbon, obtaining degrees in Computer Science (Ph.D.) and Computer Software Engineering (M.Sc., B.Sc.). Before joining FEUP, he worked as a programmer at the IT company Novabase and as a researcher at CITI (NOVA). During his research at CITI he collaborated with several institutions (Fundação Gulbenkian, LNEC, Protecção Civil, Fundação Museu Berardo, FCSH-UNL, Hospital de S. João) and private companies (Duvideo, Inovagency, BIN, NextWay, Atelier Joana Vasconcelos), providing consulting in multimedia, augmented reality, computer vision, and interactive and mobile interfaces. He has published several scientific articles in international conferences and has international research experience, including a mid-term research visit to the ICG lab at the Technical University of Graz, Austria. He has also been a teaching assistant at his university. In general, he enjoys working with computer graphics, computer vision, augmented reality, multimedia, interfaces, intelligent algorithms and new interaction devices.

Areas of interest: Computer Graphics, HCI, Multimedia, Augmented Reality, Computer Vision

Publications

2018

Dynamic annotations on an interactive web-based 360° video player

Authors
Matos, T; Nóbrega, R; Rodrigues, R; Pinheiro, M;

Publication
Proceedings of the 23rd International ACM Conference on 3D Web Technology, Web3D 2018, Poznan, Poland, June 20-22, 2018

Abstract
The use of 360° videos has been increasing steadily in the 2010s, as content creators and users search for more immersive experiences. The freedom to choose where to look at during the video may hinder the overall experience instead of enhancing it, as there is no guarantee that the user will focus on relevant sections of the scene. Visual annotations superimposed on the video, such as text boxes or arrow icons, can help guide the user through the narrative of the video while maintaining freedom of movement. This paper presents a web-based immersive visualizer for 360° videos that contain dynamic media annotations, rendered in real time. A set of annotations was created with the purpose of providing information or guiding the user to points of interest. The visualizer can be used on a computer, using a keyboard and mouse or HTC Vive, and on mobile devices with Cardboard VR headsets, to experience the video in virtual reality, which is made possible with the WebVR API. The visualizer was evaluated through usability tests to analyze the impact of different annotation techniques on the users' experience. The obtained results demonstrate that annotations can assist in guiding the user during the video, and a careful design is imperative so that they are not intrusive or distracting for the viewers. © 2018 Copyright held by the owner/author(s).
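
For illustration only (this is not the authors' code), a minimal TypeScript sketch of this kind of setup with the three.js library might project the equirectangular video onto an inward-facing sphere and toggle an annotation sprite during a time window; the video file name, annotation position and timing below are invented for the example:

import * as THREE from 'three';

// Illustrative sketch: equirectangular 360° video projected onto the
// inside of a sphere, with one time-gated annotation sprite.
const video = document.createElement('video');
video.src = 'tour.mp4';        // hypothetical video file
video.loop = true;
video.muted = true;
video.play();

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(75, innerWidth / innerHeight, 0.1, 1000);
const renderer = new THREE.WebGLRenderer();
renderer.setSize(innerWidth, innerHeight);
document.body.appendChild(renderer.domElement);

const texture = new THREE.VideoTexture(video);
const sphere = new THREE.Mesh(
  new THREE.SphereGeometry(500, 60, 40),
  new THREE.MeshBasicMaterial({ map: texture, side: THREE.BackSide })
);
scene.add(sphere);

// Dynamic annotation: a sprite anchored to an assumed point of interest,
// shown only during a given interval of the video.
const annotation = new THREE.Sprite(new THREE.SpriteMaterial({ color: 0xffcc00 }));
annotation.position.set(50, 10, -100);
scene.add(annotation);

function animate() {
  requestAnimationFrame(animate);
  annotation.visible = video.currentTime > 5 && video.currentTime < 15;
  renderer.render(scene, camera);
}
animate();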

2018

Leveraging Pervasive Games for Tourism

Authors
Nóbrega, R; Jacob, J; Coelho, A; Ribeiro, J; Weber, J; Ferreira, S;

Publication
International Journal of Creative Interfaces and Computer Graphics

Abstract

2017

Adaptivity and safety in location-based games

Authors
Jacob, J; Nóbrega, R; Coelho, A; Rodrigues, R;

Publication
9th International Conference on Virtual Worlds and Games for Serious Applications, VS-Games 2017, Athens, Greece, September 6-8, 2017

Abstract
Location-based games require, among other things, obtaining or computing information regarding the players' physical activity and real-world context. Additionally, ensuring that the players are assigned challenges that are adequate and safe for the current context (both physical and spatial) is also important, as it can improve both the gaming experience and the outcomes of the exercise. However, the impact that adaptivity has in the specific case of location-based exergames has not yet been researched in depth. In this paper, we present a location-based exergame capable of adapting its mechanics to the current context. © 2017 IEEE.
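
Purely as an illustrative sketch of this kind of context-driven adaptation (not the game's actual logic; the context fields, thresholds and scaling are assumptions), in TypeScript:

// Illustrative only: adapting an exergame challenge to player context.
interface PlayerContext {
  heartRate: number;     // beats per minute from a wearable
  areaIsSafe: boolean;   // e.g. away from roads or water
  temperatureC: number;
}

interface Challenge {
  distanceMeters: number;
  pace: 'walk' | 'jog' | 'run';
}

function adaptChallenge(base: Challenge, ctx: PlayerContext): Challenge {
  // Never issue a movement challenge in an unsafe area.
  if (!ctx.areaIsSafe) {
    return { distanceMeters: 0, pace: 'walk' };
  }
  // Scale effort down when the player is already near their limit
  // or when conditions are very hot.
  const strained = ctx.heartRate > 160 || ctx.temperatureC > 32;
  return strained
    ? { distanceMeters: base.distanceMeters * 0.5, pace: 'walk' }
    : base;
}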

2017

Interactive 3D content insertion in images for multimedia applications

Authors
Nóbrega, R; Correia, N;

Publication
MULTIMEDIA TOOLS AND APPLICATIONS

Abstract
This article addresses the problem of creating interactive mixed reality applications where virtual objects interact within images of real-world scenarios. This is relevant for creating games and architectural or space-planning applications that interact with visual elements in the images, such as walls, floors and empty spaces. These scenarios are intended to be captured by the users with regular cameras or using previously taken photographs. Introducing virtual objects in photographs presents several challenges, such as pose estimation and the creation of a visually correct interaction between virtual objects and the boundaries of the scene. The two main research questions addressed in this article are the feasibility of creating interactive augmented reality (AR) applications where virtual objects interact within a real-world scenario using high-level features detected in the image, and whether untrained users are capable and motivated enough to perform the AR initialization steps. The proposed system detects the scene automatically from an image, with additional features obtained from basic annotations made by the user. This operation is kept simple to accommodate the needs of non-expert users. The system analyzes one or more photos captured by the user and detects high-level features such as vanishing points, floor and scene orientation. Using these features it is possible to create mixed and augmented reality applications where the user interactively introduces virtual objects that blend with the picture in real time and respond to the physical environment. To validate the solution, several system tests are described and compared using available external image datasets.
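
For illustration, one step of such a pipeline, estimating a vanishing point as the intersection of two user-annotated image lines, could look like the following TypeScript sketch (not the paper's implementation; the example coordinates are invented):

// Illustrative only: estimate a vanishing point as the intersection of
// two image lines (e.g. user-annotated floor edges). Each line is given
// by two points; returns null if the lines are (nearly) parallel.
type Pt = { x: number; y: number };

function vanishingPoint(a1: Pt, a2: Pt, b1: Pt, b2: Pt): Pt | null {
  const d1 = { x: a2.x - a1.x, y: a2.y - a1.y };
  const d2 = { x: b2.x - b1.x, y: b2.y - b1.y };
  const cross = d1.x * d2.y - d1.y * d2.x;
  if (Math.abs(cross) < 1e-9) return null;   // parallel lines
  const t = ((b1.x - a1.x) * d2.y - (b1.y - a1.y) * d2.x) / cross;
  return { x: a1.x + t * d1.x, y: a1.y + t * d1.y };
}

// Example: two floor edges converging toward the horizon at (320, 240).
console.log(vanishingPoint({ x: 0, y: 400 }, { x: 200, y: 300 },
                           { x: 640, y: 400 }, { x: 440, y: 300 }));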

2017

Player Adaptivity and Safety in Location-Based Games

Authors
Jacob, J; Lopes, A; Nóbrega, R; Rodrigues, R; Coelho, A;

Publication
Advances in Computer Entertainment Technology - Lecture Notes in Computer Science

Abstract