2011
Authors
Campos, MJ; Rodrigues, PP;
Publication
HEALTHINF 2011: PROCEEDINGS OF THE INTERNATIONAL CONFERENCE ON HEALTH INFORMATICS
Abstract
Identity management is an essential component for the identification, authentication and authorization of patients, professionals, stakeholders and organizations in eHealth, combining information technologies and organizational procedures to provide security and privacy for health information. A literature search was conducted to identify relevant articles, which were then grouped into themes according to their main subject. From the selected articles, plus their references, the main findings, issues and future perspectives were systematized. A total of 31 articles were obtained; after applying the selection methodology, 13 articles were included and grouped into four themes: identity pseudonymisation and anonymization for secondary use; privacy-preserving identity; identification, authentication and authorization identity in eHealth; and identity and standardization. Through the references cited in the articles, research programs and working areas were also identified. Very few implementations could be found in the literature, showing that this problem is even more complex than it seems and that future adoption requires further research on new models and architectures. Furthermore, a standard methodology is needed for the interoperability of identity attributes between different stakeholders. Although there is a large research effort in the context of identity in the information society in general, very few studies and experiences were found in the eHealth context.
2012
Authors
Bacelar Silva, GM; Rodrigues, PP;
Publication
HEALTHINF 2012 - Proceedings of the International Conference on Health Informatics
Abstract
Health care systems around the world are under pressure: costs are high and rising, and the population is growing and ageing. Health information technology is expected to help improve the capacity of health care processes. The aim of this work is to analyze the benefits of implementing Theory of Constraints (TOC) buffer management in the health care environment, concerning the improvement of patient flow and its management. A literature review was conducted, with an automated search on four databases to identify relevant articles, written in English between 2000 and 2010, about TOC buffer management applied to health care patient flow. Only three relevant articles were included. The analysis was based on measurements from implementations carried out in seven different hospitals and for three different purposes: the Accident & Emergency department (A&E), admissions and discharge. A statistical analysis of the A&E and admissions post-implementation results demonstrated that a significant improvement was achieved. Improvements were also obtained in four management control functions: prioritize, expedite, escalate and improve. Although few papers were available, TOC buffer management appears to be a good solution to improve performance and management in health care.
2011
Authors
Rodrigues, PP; Sebastiao, R; Santos, CC;
Publication
CEUR Workshop Proceedings
Abstract
Cardiotocography is widely used, all over the world, for monitoring fetal heart rate and uterine contractions before (antepartum) and during (intrapartum) labor, to detect fetuses in danger of death or permanent damage. However, the analysis of cardiotocogram tracings remains a large and unsolved issue. State-of-the-art monitoring systems provide quantitative parameters that are difficult to assess by the human eye. These systems also trigger alerts for changes in the behavior of the signals; however, they usually take up to 10 min to detect these changes. Previous work using machine learning for concept drift detection has achieved faster detection of such events. Our aim is to extend the monitoring system with memory-less fading statistics, which have been successfully applied in drift detection and statistical tests, to improve the detection of alarming events.
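Memory-less fading statistics of the kind mentioned in the abstract can be sketched as an exponentially faded mean: each new observation is added to a faded sum and a faded count, so older values are gradually forgotten without storing any history. This is an illustrative sketch only; the class name `FadingMean` and the fading factor value are assumptions, not the paper's implementation.

```python
class FadingMean:
    """Memory-less fading average: recent observations weigh more.

    Hypothetical sketch. The fading factor alpha (close to 1)
    controls how quickly old observations are forgotten; only the
    faded sum and faded count are kept, so memory use is constant.
    """

    def __init__(self, alpha=0.99):
        self.alpha = alpha
        self.s = 0.0  # faded sum of observations
        self.n = 0.0  # faded count of observations

    def update(self, x):
        # Fade the previous statistics, then fold in the new value.
        self.s = x + self.alpha * self.s
        self.n = 1.0 + self.alpha * self.n
        return self.s / self.n  # current faded mean
```

A drift detector could compare such a faded mean against a long-term reference statistic and raise an alert when they diverge, which is what allows faster reaction than fixed 10-minute windows.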
2011
Authors
Rodrigues, PP; Dias, C; Cruz Correia, R;
Publication
CEUR Workshop Proceedings
Abstract
Clinical record integration and visualization is one of the most important capabilities of modern health information systems (HIS). Its use in clinical encounters plays a relevant role in the efficacy and efficiency of healthcare. However, the integrated HIS of a central hospital may gather millions of clinical reports (e.g. radiology, lab results, etc.). Hence, the clinical record must manage a stream of reports being produced in the entire hospital. Moreover, not all documents of a patient are relevant for a given encounter, and are therefore not visualized during that encounter. Thus, the HIS must also manage a stream of report-visualization events, which runs in parallel to the stream of document production. The aim of our project is to provide physicians with a recommendation of clinical reports to consider when they log in to the computer. Our approach is to model relevance as the probability that a given document will be accessed in the current time frame. For that, we design a data stream management system to process the two streams, and Bayesian networks to learn those probabilities based on document, patient, department and user information. One of the biggest challenges to the learning problem, so far, is that no negative examples are produced by the stream (i.e. there is no record of documents not being visualized), leading to a one-class classification problem. The aim of this paper is to clearly present the setting and rationale for the approach. Current work is focused on both the stream processing mechanism and the Bayesian probability estimation.
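The relevance-as-access-probability idea above can be illustrated with a toy estimator that consumes the two parallel streams (reports produced, reports visualized) and keeps smoothed frequency counts. This is a deliberate simplification, an assumption for illustration: the paper uses Bayesian networks over document, patient, department and user information, whereas this sketch conditions on a single hypothetical feature, the report type.

```python
from collections import Counter


class RelevanceEstimator:
    """Toy sketch: estimate P(report is accessed | report type).

    Hypothetical simplification of the paper's approach. Laplace
    smoothing keeps estimates defined for report types that have
    been produced but never (or always) accessed.
    """

    def __init__(self, smoothing=1.0):
        self.produced = Counter()  # stream of report production events
        self.accessed = Counter()  # stream of report visualization events
        self.smoothing = smoothing

    def observe_produced(self, report_type):
        self.produced[report_type] += 1

    def observe_accessed(self, report_type):
        self.accessed[report_type] += 1

    def p_access(self, report_type):
        # Laplace-smoothed estimate of the access probability.
        a = self.accessed[report_type] + self.smoothing
        p = self.produced[report_type] + 2 * self.smoothing
        return a / p
```

Note that this sketch sidesteps the one-class problem described in the abstract by treating "produced but not accessed" counts as implicit negatives; handling the absence of true negative labels properly is precisely the open challenge the paper identifies.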
2011
Authors
Pires, T; Rodrigues, P;
Publication
BASIC & CLINICAL PHARMACOLOGY & TOXICOLOGY
Abstract
2009
Authors
Gama, J; Rodrigues, PP; Sebastião, R;
Publication
Proceedings of the 2009 ACM Symposium on Applied Computing (SAC), Honolulu, Hawaii, USA, March 9-12, 2009
Abstract
Learning from data streams is a research area of increasing importance, and several stream learning algorithms have been developed. Most of them learn decision models that continuously evolve over time, run in resource-aware environments, and detect and react to changes in the environment generating the data. One important issue, not yet adequately addressed, is the design of experimental work to evaluate and compare decision models that evolve over time. In this paper we propose a general framework for assessing the quality of streaming learning algorithms. We defend the use of predictive sequential (prequential) error estimates over a sliding window to assess the performance of learning algorithms that learn from open-ended data streams in non-stationary environments. This paper studies convergence properties and methods to comparatively assess algorithm performance.
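The prequential protocol defended in the abstract can be sketched in a few lines: each arriving example is first used to test the current model, then to train it, and the error is averaged over a sliding window of the most recent predictions. The function and the `predict`/`learn` interface below are illustrative assumptions, not the paper's framework or any specific library API.

```python
from collections import deque


def prequential_error(stream, model, window=1000):
    """Sliding-window prequential evaluation (illustrative sketch).

    `stream` yields (x, y) pairs; `model` is assumed to expose
    predict(x) and learn(x, y) -- hypothetical method names.
    Returns the windowed error rate after each example.
    """
    errors = deque(maxlen=window)  # only the last `window` outcomes count
    history = []
    for x, y in stream:
        # Test first: the model has never seen this example.
        errors.append(0.0 if model.predict(x) == y else 1.0)
        # Then train on it, so the model evolves over time.
        model.learn(x, y)
        history.append(sum(errors) / len(errors))
    return history
```

Because the window discards old outcomes, the estimate can recover after a concept drift, whereas an error averaged over the entire stream would be dominated by the past; that is the core argument for windowed prequential estimates in non-stationary environments.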