Details
Name
Tamas Karacsony
Position
Research Assistant
Since
01 May 2019
Nationality
Hungary
Contacts
+351222094000
tamas.karacsony@inesctec.pt
2024
Authors
Karácsony, T; Jeni, LA; de la Torre, F; Cunha, JPS;
Publication
IMAGE AND VISION COMPUTING
Abstract
Many clinical applications involve in-bed patient activity monitoring, from intensive care and neuro-critical wards to semiology-based epileptic seizure diagnosis support or sleep monitoring at home, all of which require accurate recognition of in-bed movement actions from video streams. The major challenges of clinical application arise from the domain gap between common in-the-lab and clinical scenery (e.g. viewpoint, occlusions, out-of-domain actions), the requirement that monitoring be minimally intrusive to existing clinical practices (e.g. non-contact monitoring), and the significantly limited amount of labeled clinical action data available. Focusing on one of the most demanding in-bed clinical scenarios, semiology-based epileptic seizure classification, this review explores the challenges of video-based clinical in-bed monitoring and reviews video-based action recognition trends, monocular 3D MoCap, and semiology-based automated seizure classification approaches. Moreover, it provides a guideline for taking full advantage of transfer learning for in-bed action recognition for quantified, evidence-based clinical diagnosis support. The review suggests that an approach based on 3D MoCap and skeleton-based action recognition, relying strongly on transfer learning, could be advantageous for these clinical in-bed action recognition problems. However, such approaches still face several challenges, such as spatio-temporal stability, occlusion handling, and robustness, before the full potential of this technology can be realized for routine clinical usage.
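To make the suggested direction more concrete, the following minimal PyTorch sketch illustrates transfer learning for skeleton-based in-bed action recognition: a backbone pretrained on a large in-the-lab dataset is frozen and only a new classification head is trained on the small clinical dataset. The placeholder backbone, feature size, and number of classes are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

NUM_CLINICAL_CLASSES = 5   # assumed number of in-bed action labels
FEAT_DIM = 256             # assumed feature size of the pretrained backbone

class PlaceholderBackbone(nn.Module):
    """Stands in for a skeleton backbone (e.g. an ST-GCN-style network)
    pretrained on a large lab dataset; here it simply pools the joint sequence."""
    def __init__(self, in_channels=3, feat_dim=FEAT_DIM):
        super().__init__()
        self.proj = nn.Linear(in_channels, feat_dim)

    def forward(self, skeletons):
        # skeletons: (batch, time, joints, channels) 3D keypoints from MoCap
        return self.proj(skeletons).mean(dim=(1, 2))   # (batch, feat_dim)

class ClinicalActionClassifier(nn.Module):
    def __init__(self, backbone, feat_dim=FEAT_DIM, num_classes=NUM_CLINICAL_CLASSES):
        super().__init__()
        self.backbone = backbone                       # pretrained backbone
        self.head = nn.Linear(feat_dim, num_classes)   # new clinical head

    def forward(self, skeletons):
        return self.head(self.backbone(skeletons))

def fine_tune(model, loader, epochs=10, lr=1e-4):
    # Freeze the pretrained backbone; only the new head is trained, which
    # suits the very limited amount of labeled clinical action data.
    for p in model.backbone.parameters():
        p.requires_grad = False
    opt = torch.optim.Adam(model.head.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for skeletons, labels in loader:
            opt.zero_grad()
            loss = loss_fn(model(skeletons), labels)
            loss.backward()
            opt.step()
```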
2024
Authors
Lopes, EM; Pimentel, M; Karácsony, T; Rego, R; Cunha, JPS;
Publication
2024 IEEE 22ND MEDITERRANEAN ELECTROTECHNICAL CONFERENCE, MELECON 2024
Abstract
Deep Brain Stimulation of the Anterior Nucleus of the Thalamus (ANT-DBS) is an effective treatment for refractory epilepsy. In order to assess the involvement of the ANT during voluntary repetitive hand movements similar to some seizure-induced ones, we simultaneously collected video-electroencephalogram (vEEG) and ANT local field potential (LFP) signals from two epilepsy patients implanted with the Percept PC neurostimulator, who stayed at an Epilepsy Monitoring Unit (EMU) for a 5-day period. For this purpose, a repetitive voluntary movement execution protocol was designed and an event-related desynchronisation/synchronisation (ERD/ERS) analysis was performed. We found a power increase in the alpha and theta frequency bands during movement execution for both patients. The same pattern was not found when patients were at rest. Furthermore, a similar increase in relative power was found in LFPs from other neighboring basal ganglia. This suggests that the ERS pattern may be associated with upper limb automatisms, indicating that the ANT and other basal ganglia may be involved in the execution of these repetitive movements. These findings may open a new window for the study of seizure-induced movements (semiology) as biomarkers of seizure onset, which may be helpful for future adaptive DBS techniques to better control epileptic seizures in these patients.
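The quantity underlying an ERD/ERS analysis is the relative band power of a signal segment during movement compared with rest. The sketch below, assuming SciPy's Welch PSD estimate and an illustrative sampling rate, shows that comparison for the theta and alpha bands; it is only a schematic of the general method, not the authors' pipeline.

```python
import numpy as np
from scipy.signal import welch

FS = 250  # assumed LFP sampling rate in Hz

def relative_band_power(segment, band):
    """Power in `band` (Hz) divided by total power, from a Welch PSD estimate."""
    freqs, psd = welch(segment, fs=FS, nperseg=FS)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return psd[in_band].sum() / psd.sum()

theta, alpha = (4.0, 8.0), (8.0, 13.0)
movement_lfp = np.random.randn(5 * FS)   # placeholder for a movement epoch
rest_lfp = np.random.randn(5 * FS)       # placeholder for a rest epoch

# An ERS pattern would show as a ratio greater than 1 (more relative power
# during movement than at rest); ERD would show as a ratio below 1.
print("theta movement/rest ratio:",
      relative_band_power(movement_lfp, theta) / relative_band_power(rest_lfp, theta))
print("alpha movement/rest ratio:",
      relative_band_power(movement_lfp, alpha) / relative_band_power(rest_lfp, alpha))
```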
2024
Authors
Karácsony T.; Fearns N.; Vollmar C.; Birk D.; Rémi J.; Noachtar S.; Silva Cunha J.P.;
Publication
Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology Society, EMBS
Abstract
Epileptic seizures are clearly characterized by their displayed behavior, the semiology, which is used in diagnosis and classification as a basis for therapy. This article presents a novel 4K 3D video recording and reviewing system for epilepsy monitoring, introducing a novel perspective and allowing continuous recording and review of 3D videos in the epilepsy monitoring unit (EMU). It provides significantly more detail than current clinical systems, which can lead to the recognition of more Movements of Interest (MOIs) and may reduce inter-rater variability. To put the system to an initial test in clinical practice, the article presents three real-world examples of subtle MOIs that could only be appreciated on the 4K video, but not on the VGA video recorded as part of the clinical routine. In conclusion, a 4K-RGB recording, 3D cropping, and 3D video playing system was developed, implemented, and tested for real-world clinical scenarios, considering the specific requirements of clinical monitoring in EMUs. The new data acquisition setup can support clinical diagnosis, which may lead to new insights in the field of epilepsy and the development of AI approaches in the future.
2023
Authors
Carmona, J; Karacsony, T; Cunha, JPS;
Publication
2023 IEEE 7TH PORTUGUESE MEETING ON BIOENGINEERING, ENBENG
Abstract
Clinical in-bed video-based human motion analysis is a highly relevant computer vision topic for several biomedical applications. Nevertheless, the main large public datasets (e.g. ImageNet or 3DPW) used for deep learning approaches lack annotated examples for these clinical scenarios. To address this issue, we introduce BlanketSet, an RGB-IRD action recognition dataset of sequences performed in a hospital bed. This dataset has the potential to help bridge the improvements attained in more general large datasets to these clinical scenarios. Information on how to access the dataset is available at rdm.inesctec.pt/dataset/nis-2022-004.
2023
Authors
Carmona, J; Karacsony, T; Cunha, JPS;
Publication
2023 IEEE 7TH PORTUGUESE MEETING ON BIOENGINEERING, ENBENG
Abstract
Human motion analysis has seen drastic improvements recently; however, due to the lack of representative datasets, it still lags behind for clinical in-bed scenarios. To address this issue, we implemented BlanketGen, a pipeline that augments videos with synthetic blanket occlusions. With this pipeline, we generated an augmented version of the pose estimation dataset 3DPW, called BlanketGen3DPW. We then used this new dataset to fine-tune a Deep Learning model to improve its performance in these scenarios, with promising results. Code and further information are available at https://gitlab.inesctec.pt/brain-lab/brainlab-public/blanket-gen-releases.
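As a rough illustration of the augmentation idea (not the BlanketGen pipeline itself, which renders simulated blankets), the sketch below alpha-blends a pre-rendered RGBA occlusion layer over a video frame with NumPy; all shapes, colours, and opacities are placeholders.

```python
import numpy as np

def composite_occlusion(frame, blanket_rgba):
    """Alpha-blend an RGBA occlusion layer over an RGB frame of the same size."""
    alpha = blanket_rgba[..., 3:4].astype(np.float32) / 255.0
    blanket_rgb = blanket_rgba[..., :3].astype(np.float32)
    blended = alpha * blanket_rgb + (1.0 - alpha) * frame.astype(np.float32)
    return blended.astype(np.uint8)

# Toy usage: a grey frame with a half-transparent "blanket" over its lower half.
frame = np.full((480, 640, 3), 128, dtype=np.uint8)
blanket = np.zeros((480, 640, 4), dtype=np.uint8)
blanket[240:, :, :3] = (200, 180, 160)   # blanket colour in the occluded region
blanket[240:, :, 3] = 128                # 50% opacity in the occluded region
occluded = composite_occlusion(frame, blanket)
```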