Matos, T; Nobrega, R; Rodrigues, R; Pinheiro, M;
WEB3D 2018: THE 23RD INTERNATIONAL ACM CONFERENCE ON 3D WEB TECHNOLOGY
The use of 360° videos has been increasing steadily in the 2010s, as content creators and users search for more immersive experiences. The freedom to choose where to look during the video may hinder the overall experience instead of enhancing it, as there is no guarantee that the user will focus on relevant sections of the scene. Visual annotations superimposed on the video, such as text boxes or arrow icons, can help guide the user through the narrative of the video while maintaining freedom of movement. This paper presents a web-based immersive visualizer for 360° videos that contain dynamic media annotations, rendered in real-time. A set of annotations was created with the purpose of providing information or guiding the user to points of interest. The visualizer can be used on a computer, with a keyboard and mouse or an HTC Vive, and on mobile devices with Cardboard VR headsets, to experience the video in virtual reality, which is made possible by the WebVR API. The visualizer was evaluated through usability tests to analyze the impact of different annotation techniques on the users' experience. The obtained results demonstrate that annotations can assist in guiding the user during the video, and that careful design is imperative so that they are not intrusive or distracting for viewers. © 2018 Copyright held by the owner/author(s).
Pereira, V; Matos, T; Rodrigues, R; Nobrega, R; Jacob, J;
PROCEEDINGS OF THE 2019 INTERNATIONAL CONFERENCE ON GRAPHICS AND INTERACTION (ICGI 2019)
This paper proposes the implementation of a framework for the development of collaborative extended reality (XR) applications. Using the framework, developers can focus on understanding which collaborative mechanisms they need to implement for the reality model of their application. In this paper we specifically study collaborative mechanisms around object manipulation in Virtual Reality (VR). As such, we planned a VR prototype using the proposed framework, which was used to validate the various interaction and collaboration features in VR. The data gathered from the user tests revealed that participants enjoyed the experience and that the collaborative mechanisms helped them work together. Furthermore, to understand whether the framework allowed for the development of XR applications, we implemented an augmented reality (AR) prototype as well. Afterwards, we ran an experiment with 4 VR and 3 AR users sharing the same virtual environment. The experiment was successful in allowing them to interact in real-time in the same shared environment. Therefore, the framework enables the development of XR applications that support different mixed-reality technologies.