
Publications by Hugo Miguel Choupina

2017

Ictal vocalization in focal epilepsy

Authors
Hartl, E; Knoche, T; Remi, J; Choupina, H; Cunha, J; Noachtar, S;

Publication
EPILEPSIA

Abstract

2018

NeuroKinect 3.0: Multi-bed 3Dvideo-EEG system for epilepsy clinical motion monitoring

Authors
Choupina, HMP; Rocha, AP; Fernandes, JM; Vollmar, C; Noachtar, S; Cunha, JPS;

Publication
Studies in Health Technology and Informatics

Abstract
Epilepsy diagnosis is typically performed through 2Dvideo-EEG monitoring, relying on the viewer's subjective interpretation of the patient's movements of interest. Several attempts at quantifying seizure movements have been performed in the past using 2D marker-based approaches, which have several drawbacks for the clinical routine (e.g. occlusions, lack of precision, and discomfort for the patient). These drawbacks are overcome with a 3D markerless approach. Recently, we published the development of a single-bed 3Dvideo-EEG system using a single RGB-D camera (Kinect v1). In this contribution, we describe how we expanded the previous single-bed system to a multi-bed departmental one that has been managing 6.61 Terabytes per day since March 2016. Our unique dataset collected so far includes 2.13 Terabytes of multimedia data, corresponding to 278 3Dvideo-EEG seizures from 111 patients. To the best of the authors' knowledge, this system is unique and has the potential of being spread to multiple EMUs around the world for the benefit of a greater number of patients. © 2018 European Federation for Medical Informatics (EFMI) and IOS Press.

2018

Quantitative and qualitative analysis of ictal vocalization in focal epilepsy syndromes

Authors
Hartl, E; Knoche, T; Choupina, HMP; Remi, J; Vollmar, C; Cunha, JPS; Noachtar, S;

Publication
Seizure

Abstract
Purpose: To investigate the frequency, localizing significance, and intensity characteristics of ictal vocalization in different focal epilepsy syndromes. Methods: Up to four consecutive focal seizures were evaluated in 277 patients with lesional focal epilepsy, excluding isolated auras and subclinical EEG seizure patterns. Vocalization was considered present if it was observed in at least one of the analyzed seizures and was not of speech quality. Intensity features of ictal vocalization were analyzed in a subsample of 17 patients with a temporal and 19 with an extratemporal epilepsy syndrome. Results: Ictal vocalization was observed in 37% of the patients (102/277), with similar frequency amongst different focal epilepsy syndromes. Localizing significance was found for its co-occurrence with ictal automatisms, which identified patients with temporal seizure onset with a sensitivity of 92% and specificity of 70%. Quantitative analysis of vocalization intensity allowed seizures of frontal and temporal lobe origin to be distinguished based on the intensity range (p = 0.0003), intensity variation (p < 0.0001), and the intensity increase rate at the beginning of the vocalization (p = 0.003), all of which were significantly higher in frontal lobe seizures. No significant difference was found for mean intensity or mean vocalization duration. Conclusions: Although ictal vocalization is similarly common across different focal epilepsies, it shows localizing significance when the co-occurring seizure semiology is taken into account. It especially increases the localizing value of automatisms, predicting a temporal seizure onset with a sensitivity of 92% and specificity of 70%. Quantitative parameters of the intensity dynamics objectively distinguished frontal lobe seizures, establishing an observer-independent tool for semiological seizure evaluation. © 2018 British Epilepsy Association
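The intensity parameters the abstract names (intensity range, intensity variation, and the increase rate at vocalization onset) can be sketched as simple computations over a sampled loudness contour. The function name, the onset window length, and the dB-per-second units below are illustrative assumptions, not the authors' published implementation:

```python
# Hypothetical sketch of the intensity features described in the abstract,
# computed from a vocalization's sampled intensity contour (dB over time).
# Window length and naming are assumptions for illustration only.

def vocalization_features(intensity_db, sample_rate_hz, onset_window_s=0.5):
    """Return (intensity range, intensity variation as std. dev.,
    initial intensity increase rate in dB/s)."""
    n = len(intensity_db)
    mean = sum(intensity_db) / n
    variation = (sum((x - mean) ** 2 for x in intensity_db) / n) ** 0.5
    intensity_range = max(intensity_db) - min(intensity_db)
    # Increase rate over the first onset_window_s seconds of the vocalization.
    k = min(n, max(2, int(onset_window_s * sample_rate_hz)))
    increase_rate = (intensity_db[k - 1] - intensity_db[0]) / ((k - 1) / sample_rate_hz)
    return intensity_range, variation, increase_rate
```

Per the abstract, frontal lobe seizures showed significantly higher values for all three of these parameters than temporal lobe seizures.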

2018

System for automatic gait analysis based on a single RGB-D camera

Authors
Rocha, AP; Pereira Choupina, HMP; Vilas Boas, MD; Fernandes, JM; Silva Cunha, JPS;

Publication
PLOS ONE

Abstract
Human gait analysis provides valuable information regarding the way of walking of a given subject. Low-cost RGB-D cameras, such as the Microsoft Kinect, are able to estimate the 3-D position of several body joints without requiring the use of markers. This 3-D information can be used to perform objective gait analysis in an affordable, portable, and non-intrusive way. In this contribution, we present a system for fully automatic gait analysis using a single RGB-D camera, namely the second version of the Kinect. Our system does not require any manual intervention (except for starting/stopping the data acquisition), since it first recognizes whether the subject is walking or not, and identifies the different gait cycles only when walking is detected. For each gait cycle, it then computes several gait parameters, which can provide useful information in various contexts, such as sports, healthcare, and biometric identification. The activity recognition is performed by a predictive model that distinguishes between three activities (walking, standing, and marching), and between two postures of the subject (facing the sensor, and facing away from it). The model was built using a multilayer perceptron algorithm and several measures extracted from 3-D joint data, achieving an overall accuracy and F1 score of 98%. For gait cycle detection, we implemented an algorithm that estimates the instants corresponding to left and right heel strikes, relying on the distance between the ankles and the velocity of the left and right ankles. The algorithm achieved errors for heel strike instant and stride duration estimation of 15 +/- 25 ms and 1 +/- 29 ms (walking towards the sensor), and 12 +/- 23 ms and 2 +/- 24 ms (walking away from the sensor). Our gait cycle detection solution can be used with any other RGB-D camera that provides the 3-D position of the main body joints.
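The gait cycle detection idea described above (heel-strike instants found from the inter-ankle distance and the ankle velocities) can be sketched roughly as follows. This is an illustrative reconstruction under assumptions, not the published algorithm: heel strikes are taken at local maxima of the inter-ankle distance, and the striking side is guessed as the ankle with the lower instantaneous speed at that frame.

```python
# Illustrative sketch (not the authors' implementation): detect heel-strike
# events from per-frame 3-D ankle positions, using the cues named in the
# abstract -- inter-ankle distance and left/right ankle velocities.
import math

def heel_strikes(left_ankle, right_ankle, fps):
    """left_ankle, right_ankle: lists of (x, y, z) tuples, one per frame.
    Returns a list of (time_s, 'left' | 'right') heel-strike events."""
    dist = [math.dist(l, r) for l, r in zip(left_ankle, right_ankle)]
    events = []
    for i in range(1, len(dist) - 1):
        # Heel strike assumed near a local maximum of inter-ankle distance.
        if dist[i] >= dist[i - 1] and dist[i] > dist[i + 1]:
            # Central-difference speed of each ankle at frame i.
            vl = math.dist(left_ankle[i + 1], left_ankle[i - 1]) * fps / 2
            vr = math.dist(right_ankle[i + 1], right_ankle[i - 1]) * fps / 2
            # Assumption: the striking foot is the (near-)stationary one.
            events.append((i / fps, 'left' if vl < vr else 'right'))
    return events
```

A real pipeline would additionally smooth the joint trajectories and reject spurious maxima; stride duration then follows as the time between consecutive same-side events.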

2018

On the Fly Reporting of Human Body Movement based on Kinect v2

Authors
Rodrigues, J; Maia, P; Choupina, HMP; Cunha, JPS;

Publication
Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS

Abstract
Human gait analysis is of utmost importance in understanding several aspects of human movement. In clinical practice, characterizing movement in order to obtain accurate and reliable information is a major challenge, and physicians usually rely on direct observation to evaluate a patient's motor abilities. In this contribution, a system is presented that can objectively analyze a patient's gait and generate an on-the-fly, targeted, and optimized gait analysis report. It is an extension of an existing system that could be used without interfering with the healthcare environment, but which did not provide any on-the-fly feedback to physicians. Patient data are acquired using the Kinect v2, followed by data processing and gait-specific feature extraction, ending with the generation of a quantitative on-the-fly report. To the best of our knowledge, the complete system fills the gap for a proper gait analysis system, i.e., a low-cost tool that can be applied without interfering with the healthcare environment and that provides quantitative gait information and on-the-fly feedback to physicians through a motion quantification report that can be useful in multiple areas. © 2018 IEEE.
