
About

Hugo Choupina was born in 1991.

Hugo holds a Bachelor's degree in Bioengineering – Biomedical Engineering (Catholic University, 2012) and a Master's degree in Biomedical Engineering (FEUP, 2014).

He currently works as a Biomedical Engineer at the Epilepsy Center of Klinikum Großhadern, LMU Hospital, Munich, Germany, and as a Researcher at BRAIN@INESC TEC, Porto. He is a co-author of several scientific papers.

Hugo has been a BRAIN (Biomedical Research And INnovation) researcher since 2013.
He is a co-author of the first routine 3Dvideo-EEG system developed in the world.

Hugo has a strong passion for the healthcare industry.
He is focused on the development, optimization, and use of technology by healthcare professionals.

Top publication: Cunha JPS, Choupina HMP, Rocha AP, Fernandes JM, Achilles F, Loesch AM, et al. (2016) NeuroKinect: A Novel Low-Cost 3Dvideo-EEG System for Epileptic Seizure Motion Quantification. PLoS ONE 11(1): e0145669. doi:10.1371/journal.pone.0145669 [h5-index: 161, #4 in Life Sciences & Earth Sciences; ISI impact factor: 3.234]

Publications

2018

NeuroKinect 3.0: Multi-bed 3Dvideo-EEG system for epilepsy clinical motion monitoring

Authors
Choupina, HMP; Rocha, AP; Fernandes, JM; Vollmar, C; Noachtar, S; Cunha, JPS;

Publication
Studies in Health Technology and Informatics

Abstract
Epilepsy diagnosis is typically performed through 2Dvideo-EEG monitoring, relying on the viewer's subjective interpretation of the patient's movements of interest. Several attempts at quantifying seizure movements have been performed in the past using 2D marker-based approaches, which have several drawbacks for the clinical routine (e.g. occlusions, lack of precision, and discomfort for the patient). These drawbacks are overcome with a 3D markerless approach. Recently, we published the development of a single-bed 3Dvideo-EEG system using a single RGB-D camera (Kinect v1). In this contribution, we describe how we expanded the previous single-bed system to a multi-bed departmental one that has been managing 6.61 Terabytes per day since March 2016. Our unique dataset collected so far includes 2.13 Terabytes of multimedia data, corresponding to 278 3Dvideo-EEG seizures from 111 patients. To the best of the authors' knowledge, this system is unique and has the potential of being spread to multiple EMUs around the world for the benefit of a greater number of patients. © 2018 European Federation for Medical Informatics (EFMI) and IOS Press.

2018

Quantitative and qualitative analysis of ictal vocalization in focal epilepsy syndromes

Authors
Hartl, E; Knoche, T; Choupina, HMP; Remi, J; Vollmar, C; Cunha, JPS; Noachtar, S;

Publication
Seizure

Abstract
Purpose: To investigate the frequency, localizing significance, and intensity characteristics of ictal vocalization in different focal epilepsy syndromes. Methods: Up to four consecutive focal seizures were evaluated in 277 patients with lesional focal epilepsy, excluding isolated auras and subclinical EEG seizure patterns. Vocalization was considered to be present if observed in at least one of the analyzed seizures and not being of speech quality. Intensity features of ictal vocalization were analyzed in a subsample of 17 patients with temporal and 19 with extratemporal epilepsy syndrome. Results: Ictal vocalization was observed in 37% of the patients (102/277) with similar frequency amongst different focal epilepsy syndromes. Localizing significance was found for its co-occurrence with ictal automatisms, which identified patients with temporal seizure onset with a sensitivity of 92% and specificity of 70%. Quantitative analysis of vocalization intensity allowed us to distinguish seizures of frontal from temporal lobe origin based on the intensity range (p = 0.0003), intensity variation (p < 0.0001), as well as the intensity increase rate at the beginning of the vocalization (p = 0.003), which were significantly higher in frontal lobe seizures. No significant difference was found for mean intensity and mean vocalization duration. Conclusions: Although ictal vocalization is similarly common in different focal epilepsies, it shows localizing significance when the co-occurring seizure semiology is taken into account. It especially increases the localizing value of automatisms, predicting a temporal seizure onset with a sensitivity of 92% and specificity of 70%. Quantitative parameters of the intensity dynamic objectively distinguished frontal lobe seizures, establishing an observer-independent tool for semiological seizure evaluation. © 2018 British Epilepsy Association

2018

System for automatic gait analysis based on a single RGB-D camera

Authors
Rocha, AP; Pereira Choupina, HMP; Vilas Boas, MD; Fernandes, JM; Silva Cunha, JPS;

Publication
PLOS ONE

Abstract
Human gait analysis provides valuable information regarding the way of walking of a given subject. Low-cost RGB-D cameras, such as the Microsoft Kinect, are able to estimate the 3-D position of several body joints without requiring the use of markers. This 3-D information can be used to perform objective gait analysis in an affordable, portable, and non-intrusive way. In this contribution, we present a system for fully automatic gait analysis using a single RGB-D camera, namely the second version of the Kinect. Our system does not require any manual intervention (except for starting/stopping the data acquisition), since it firstly recognizes whether the subject is walking or not, and identifies the different gait cycles only when walking is detected. For each gait cycle, it then computes several gait parameters, which can provide useful information in various contexts, such as sports, healthcare, and biometric identification. The activity recognition is performed by a predictive model that distinguishes between three activities (walking, standing and marching), and between two postures of the subject (facing the sensor, and facing away from it). The model was built using a multilayer perceptron algorithm and several measures extracted from 3-D joint data, achieving an overall accuracy and F-1 score of 98%. For gait cycle detection, we implemented an algorithm that estimates the instants corresponding to left and right heel strikes, relying on the distance between ankles, and the velocity of the left and right ankles. The algorithm achieved errors for heel strike instant and stride duration estimation of 15 +/- 25 ms and 1 +/- 29 ms (walking towards the sensor), and 12 +/- 23 ms and 2 +/- 24 ms (walking away from the sensor). Our gait cycle detection solution can be used with any other RGB-D camera that provides the 3-D position of the main body joints.
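The heel-strike idea in this abstract can be illustrated with a short sketch. This is not the published algorithm: it is a minimal, simplified illustration that assumes heel strikes roughly coincide with local maxima of the horizontal inter-ankle distance, and that the striking foot is the leading one along an assumed walking direction (+z). The function name `detect_heel_strikes` and all parameters are hypothetical.

```python
import numpy as np

def detect_heel_strikes(left_ankle, right_ankle, fps=30.0):
    """Hypothetical sketch: estimate heel-strike instants from 3-D ankle
    trajectories (arrays of shape (N, 3), axes assumed x, y-up, z-forward).

    Heuristic (an assumption, not the paper's method): heel strikes occur
    near local maxima of the horizontal distance between the ankles, and
    the striking side is the ankle that is further forward (+z) at that
    instant. Returns a list of (time_in_seconds, side) tuples.
    """
    # Horizontal (x, z) distance between the two ankles at each frame.
    diff = (left_ankle - right_ankle)[:, [0, 2]]
    d = np.linalg.norm(diff, axis=1)

    strikes = []
    for i in range(1, len(d) - 1):
        # Local maximum of inter-ankle distance -> candidate heel strike.
        if d[i] > d[i - 1] and d[i] >= d[i + 1]:
            # The leading (more forward) foot is taken as the striking one.
            side = "left" if left_ankle[i, 2] > right_ankle[i, 2] else "right"
            strikes.append((i / fps, side))
    return strikes
```

A real implementation would also smooth the trajectories and use ankle velocities (as the abstract states) to reject spurious maxima; those refinements are omitted here for brevity.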

2018

On the Fly Reporting of Human Body Movement based on Kinect v2

Authors
Rodrigues, J; Maia, P; Choupina, HMP; Cunha, JPS;

Publication
Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS

Abstract
Human gait analysis is of utmost importance in understanding several aspects of human movement. In clinical practice, characterizing movement in order to obtain accurate and reliable information is a major challenge, and physicians usually rely on direct observation in order to evaluate a patient's motor abilities. In this contribution, a system that can objectively analyze the patient's gait and generate an on-the-fly, targeted and optimized gait analysis report is presented. It is an extension of an existing system that could be used without interfering with the healthcare environment, but which did not provide any on-the-fly feedback to physicians. Patient data are acquired using Kinect v2, followed by data processing and gait-specific feature extraction, ending with the generation of a quantitative on-the-fly report. To the best of our knowledge, the complete system fills the gap as a proper gait analysis system, i.e., a low-cost tool that can be applied without interfering with the healthcare environment and that provides quantitative gait information and on-the-fly feedback to physicians through a motion quantification report that can be useful in multiple areas. © 2018 IEEE.

2017

The first Transthyretin Familial Amyloid Polyneuropathy gait quantification study - Preliminary results

Authors
Vilas Boas, MD; Rocha, AP; Pereira Choupina, HMP; Fernandes, JM; Coelho, T; Silva Cunha, JPS;

Publication
Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS

Abstract
Transthyretin Familial Amyloid Polyneuropathy (TTR-FAP) is a rare neurological disease caused by a genetic mutation with a variable presentation and consequent challenging diagnosis, complex follow-up and treatment. At this moment, this condition has no cure and treatment options are under development. One of the disease's implications is a definite and progressive motor impairment that from the early stages compromises walking ability and daily life activities. The detection of this impairment is key for the disease onset diagnosis. With the goal of improving diagnosis of the symptoms and patients' quality of life, the authors have assessed the gait characteristics of subjects suffering from this condition. This contribution shows the results of a preliminary study, using a non-intrusive, markerless vision-based gait analysis tool. To the best of our knowledge, the reported results constitute the first gait analysis data of TTR-FAP mutation carriers. © 2017 IEEE.