2022
Authors
Paulino, N;
Publication
Abstract
2022
Authors
Aly, L; Bota, P; Godinho, L; Bernardes, G; Silva, H;
Publication
IMX 2022 - Proceedings of the 2022 ACM International Conference on Interactive Media Experiences
Abstract
Professional theatre actors are highly specialized in controlling their own expressive behaviour and non-verbal emotional expressiveness, making them of particular interest to fields of study such as affective computing. We present Acting Emotions, an experimental protocol to investigate the physiological correlates of emotional valence and arousal within professional theatre actors. Ultimately, our protocol examines the physiological agreement of valence and arousal amongst several actors. Our main contribution lies in the participants' open selection of the emotional set, based on four categorical emotions, which are self-assessed at the end of each experiment. The experimental protocol was validated by analyzing the inter-rater agreement (> 0.261 for arousal, > 0.560 for valence) and the continuous annotation trajectories, and by comparing box plots across emotion categories. Results show that the participants successfully induced the expected emotion set, yielding statistically distinct valence and arousal distributions.
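The abstract reports inter-rater agreement over continuous valence and arousal annotations without naming the statistic. A minimal sketch of one common choice, interval-level Krippendorff's alpha via the krippendorff package, applied to hypothetical annotation data (all values illustrative, not the study's):

import numpy as np
import krippendorff  # pip install krippendorff

# Hypothetical data: rows are raters, columns are time samples of a
# continuous arousal annotation scaled to [-1, 1].
arousal = np.array([
    [0.2, 0.4, 0.5, 0.7, 0.6],
    [0.1, 0.3, 0.6, 0.8, 0.5],
    [0.3, 0.4, 0.4, 0.6, 0.7],
])

# Interval level treats annotations as continuous quantities; use np.nan
# for samples a rater did not annotate.
alpha = krippendorff.alpha(reliability_data=arousal,
                           level_of_measurement="interval")
print(f"Krippendorff's alpha (arousal): {alpha:.3f}")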
2022
Authors
Clement, A; Bernardes, G;
Publication
MULTIMODAL TECHNOLOGIES AND INTERACTION
Abstract
Digital musical instruments have become increasingly prevalent in musical creation and production. Optimizing their usability and, particularly, their expressiveness has become essential to their study and practice. The absence of the multimodal feedback present in traditional acoustic instruments has been identified as an obstacle to complete performer-instrument interaction, in particular due to the lack of embodied control. Mobile-based digital musical instruments are a particular case, as they natively provide the possibility of enriching basic auditory feedback with additional multimodal feedback. In the experiment presented in this article, we focused on using visual and haptic feedback to support and enrich auditory content, evaluating the impact on basic musical tasks (i.e., note pitch tuning accuracy and time). The experiment followed a protocol in which participants were presented with several example notes and asked to reproduce them, with performance compared across different multimodal feedback combinations. Results show that additional visual feedback reduced user hesitation in pitch tuning, allowing users to reach the proximity of the desired notes in less time. Nonetheless, neither visual nor haptic feedback significantly impacted pitch tuning time or accuracy compared to auditory-only feedback.
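The abstract compares pitch-tuning time and accuracy across feedback conditions without naming the test. A minimal sketch of one plausible within-subjects analysis, a Friedman test over per-participant means, using hypothetical values:

import numpy as np
from scipy import stats

# Hypothetical per-participant mean pitch-tuning times (seconds) for three
# feedback conditions; values are illustrative, not the study's data.
auditory     = np.array([2.9, 3.4, 3.1, 2.7, 3.8, 3.0])
audio_visual = np.array([2.5, 3.0, 2.8, 2.6, 3.2, 2.7])
audio_haptic = np.array([2.8, 3.3, 3.0, 2.9, 3.5, 2.9])

# Friedman test: non-parametric repeated-measures comparison, appropriate
# when each participant performs under every feedback condition.
stat, p = stats.friedmanchisquare(auditory, audio_visual, audio_haptic)
print(f"Friedman chi-square = {stat:.2f}, p = {p:.3f}")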
2022
Authors
Forero, J; Bernardes, G; Mendes, M;
Publication
MM 2022 - Proceedings of the 30th ACM International Conference on Multimedia
Abstract
Emotional Machines is an interactive installation that builds affective virtual environments through spoken language. In response to the existing limitations of emotion recognition models based on computer vision and electrophysiological activity, whose signal sources are hindered by a head-mounted display, we propose the adoption of speech emotion recognition (from the audio signal) and semantic sentiment analysis. In detail, we use two machine learning models to predict three main emotional categories from high-level semantic and low-level speech features. Output emotions are mapped to an audiovisual representation through an end-to-end process. We use a generative model of chord progressions to transfer speech emotion into music, and an image is synthesized from the text transcribed from the user's speech. The generated image is used as the style source in a style-transfer process onto an equirectangular projection image target selected for each emotional category. The installation is an immersive virtual space encapsulating emotions in spheres arranged in a 3D environment. Thus, users can create new affective representations or interact with previously encoded instances using joysticks.
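The abstract describes mapping predicted emotion categories to generated chord progressions, but the mapping itself is not specified. A minimal sketch, assuming three hypothetical category labels and illustrative progression templates:

# The category names and chord progressions below are assumptions for
# illustration; the installation's actual generative model is not given
# in the abstract.
PROGRESSIONS = {
    "positive": ["Cmaj7", "Am7", "Dm7", "G7"],  # bright, resolving loop
    "negative": ["Cm", "Abmaj7", "Fm", "G7"],   # darker, minor-mode loop
    "neutral":  ["C", "F", "C", "G"],           # plain diatonic loop
}

def emotion_to_progression(label: str) -> list[str]:
    """Return a chord-progression template for a predicted emotion label,
    falling back to the neutral loop for unknown labels."""
    return PROGRESSIONS.get(label, PROGRESSIONS["neutral"])

print(emotion_to_progression("positive"))  # ['Cmaj7', 'Am7', 'Dm7', 'G7']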
2022
Authors
Bernardo, G; Bernardes, G;
Publication
Personal and Ubiquitous Computing
Abstract