
Publications by Raquel Sebastião

2014

Comparing Data Distribution Using Fading Histograms

Authors
Sebastião, R; Gama, J; Mendonça, T;

Publication
21ST EUROPEAN CONFERENCE ON ARTIFICIAL INTELLIGENCE (ECAI 2014)

Abstract
The emergence of real temporal applications under non-stationary scenarios has drastically altered the ability to generate and gather information. Nowadays, under dynamic scenarios, potentially unbounded and massive amounts of information are generated at a high rate, known as data streams. Dealing with evolving data streams requires the online monitoring of data in order to detect changes. The contribution of this paper is to present the advantage of using fading histograms to compare data distributions for change detection purposes. In a windowing scheme, the data distributions provided by the fading histograms are compared using the Kullback-Leibler divergence. The experimental results show that the detection delay time is smaller when using fading histograms to represent data instead of standard histograms.
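A minimal sketch of the mechanism this abstract describes (function names, bin layout, and the fading factor value are illustrative, not taken from the paper): a fading histogram decays every bin count by a factor alpha before crediting the bin of the newest sample, and two histograms are compared with the Kullback-Leibler divergence.

```python
import math

def update_fading_histogram(counts, bin_index, alpha=0.997):
    """Decay all bin counts by alpha, then credit the new sample's bin,
    so older samples contribute exponentially less to the histogram."""
    counts = [alpha * c for c in counts]
    counts[bin_index] += 1.0
    return counts

def kl_divergence(p_counts, q_counts, eps=1e-9):
    """KL(P || Q) between two histograms given as (possibly faded) counts.
    eps smooths empty bins to keep the logarithm finite."""
    p_total = sum(p_counts) or 1.0
    q_total = sum(q_counts) or 1.0
    kl = 0.0
    for pc, qc in zip(p_counts, q_counts):
        p = pc / p_total + eps
        q = qc / q_total + eps
        kl += p * math.log(p / q)
    return kl
```

In a windowing scheme, a reference histogram (built over an earlier window) would be compared against the current one; a divergence above some threshold signals a distribution change.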

2014

Constructing fading histograms from data streams

Authors
Sebastião, R; Gama, J; Mendonça, T;

Publication
Progress in AI

Abstract
The ability to collect data is changing drastically. Nowadays, data are gathered in the form of transient and finite data streams. Memory restrictions preclude keeping all received data in memory. When dealing with massive data streams, it is mandatory to create compact representations of data, also known as synopsis structures or summaries. Reducing memory occupancy is of utmost importance when handling a huge amount of data. This paper addresses the problem of constructing histograms from data streams under error constraints. When constructing online histograms from data streams, there are two main characteristics to embrace: the ease of updating and the error of the histogram. Moreover, in dynamic environments, besides the need for compact summaries that capture the most important properties of data, it is also essential to forget old data. Therefore, this paper presents sliding histograms and fading histograms, an abrupt and a smooth strategy, respectively, for forgetting outdated data. © 2014 Springer-Verlag Berlin Heidelberg.
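The two forgetting strategies contrasted above can be sketched side by side (class names, the window size, and the fading factor are illustrative assumptions, not the paper's notation): a sliding histogram forgets abruptly by evicting the contribution of the sample that leaves the window, while a fading histogram forgets smoothly by exponential decay.

```python
from collections import deque

class SlidingHistogram:
    """Abrupt forgetting: only the last `window` samples contribute."""
    def __init__(self, n_bins, window):
        self.counts = [0] * n_bins
        self.recent = deque(maxlen=window)

    def update(self, bin_index):
        if len(self.recent) == self.recent.maxlen:
            self.counts[self.recent[0]] -= 1  # evict the oldest sample's bin
        self.recent.append(bin_index)
        self.counts[bin_index] += 1

class FadingHistogram:
    """Smooth forgetting: old samples decay exponentially with factor alpha."""
    def __init__(self, n_bins, alpha=0.997):
        self.counts = [0.0] * n_bins
        self.alpha = alpha

    def update(self, bin_index):
        self.counts = [self.alpha * c for c in self.counts]
        self.counts[bin_index] += 1.0
```

The sliding variant needs O(window) memory for the eviction queue; the fading variant is memoryless beyond the bin counts themselves, which is one reason fading summaries suit streaming settings.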

2013

On evaluating stream learning algorithms

Authors
Gama, J; Sebastião, R; Rodrigues, PP;

Publication
MACHINE LEARNING

Abstract
Most streaming decision models evolve continuously over time, run in resource-aware environments, and detect and react to changes in the environment generating data. One important issue, not yet convincingly addressed, is the design of experimental work to evaluate and compare decision models that evolve over time. This paper proposes a general framework for assessing predictive stream learning algorithms. We defend the use of prequential error with forgetting mechanisms to provide reliable error estimators. We prove that, on stationary data and for consistent learning algorithms, the holdout estimator, the prequential error, and the prequential error estimated over a sliding window or using fading factors all converge to the Bayes error. The use of prequential error with forgetting mechanisms proves to be advantageous in assessing performance and in comparing stream learning algorithms. It is also worthwhile to use the proposed methods for hypothesis testing and for change detection. In a set of experiments in drift scenarios, we evaluate the ability of a standard change detection algorithm to detect change using three prequential error estimators. These experiments point out that the use of forgetting mechanisms (sliding windows or fading factors) is required for fast and efficient change detection. In comparison to sliding windows, fading factors are faster and memoryless, both important requirements for streaming applications. Overall, this paper is a contribution to a discussion on best practice for performance assessment when learning is a continuous process and the decision models are dynamic and evolve over time.
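The fading-factor prequential estimator mentioned above admits a very compact recursive form. The sketch below assumes the usual test-then-train losses are already available; the function name and the default alpha are illustrative: the estimate at step i is S_i / B_i with S_i = L_i + alpha * S_{i-1} and B_i = 1 + alpha * B_{i-1}.

```python
def prequential_error_fading(losses, alpha=0.995):
    """Prequential (test-then-train) error with fading factors.
    With alpha = 1 this reduces to the plain running mean; with
    alpha < 1 recent losses dominate, so the estimator tracks
    the current performance of an evolving model."""
    s, b, estimates = 0.0, 0.0, []
    for loss in losses:
        s = loss + alpha * s     # faded sum of losses
        b = 1.0 + alpha * b      # faded count of examples
        estimates.append(s / b)
    return estimates
```

Unlike a sliding-window estimate, this needs no buffer of past losses: two accumulators suffice, which is the memoryless property the abstract highlights.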

2013

Real-time algorithm for changes detection in depth of anesthesia signals

Authors
Sebastião, R; Silva, MM; Rabico, R; Gama, J; Mendonça, T;

Publication
Evolving Systems

Abstract
This paper presents a real-time algorithm for changes detection in depth of anesthesia signals. A Page-Hinkley test (PHT) with a forgetting mechanism (PHT-FM) was developed. The samples are weighted according to their "age" so that more importance is given to recent samples. This enables the detection of changes with less time delay than if no forgetting factor were used. The performance of the PHT-FM was evaluated in a two-fold approach. First, the algorithm was run offline on depth of anesthesia (DoA) signals previously collected during general anesthesia, allowing the adjustment of the forgetting mechanism. Second, the PHT-FM was embedded in real-time software and its performance was validated online in the surgery room. This was performed by asking the clinician to classify the changes in real time as true positives, false positives, or false negatives. The results show that 69 % of the changes were classified as true positives, 26 % as false positives, and 5 % as false negatives. The true positives were also synchronized with changes in the hypnotic or analgesic rates made by the clinician. The contribution of this work has a high impact on clinical practice, since the PHT-FM alerts the clinician to changes in the anesthetic state of the patient, allowing more prompt action. The results encourage the inclusion of the proposed PHT-FM in a real-time decision support system for routine use in clinical practice. © 2012 Springer-Verlag.
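A sketch of a Page-Hinkley test with a forgetting mechanism, in the spirit of the PHT-FM described above (the class name, parameter defaults, and the exact way the forgetting factor weights the cumulative statistic are assumptions; the paper's weighting may differ): the cumulative deviation from the running mean is decayed by a factor alpha at each step, so old evidence fades and recent deviations dominate.

```python
class PageHinkleyFM:
    """Page-Hinkley test with a forgetting factor, for detecting an
    increase in the mean of a signal. A sketch, not the paper's exact
    formulation."""
    def __init__(self, delta=0.005, lam=50.0, alpha=0.999):
        self.delta = delta    # allowed magnitude of change
        self.lam = lam        # detection threshold
        self.alpha = alpha    # forgetting factor: down-weights old samples
        self.mean = 0.0
        self.n = 0
        self.cum = 0.0        # faded cumulative deviation
        self.min_cum = 0.0    # minimum of the cumulative deviation so far

    def update(self, x):
        self.n += 1
        self.mean += (x - self.mean) / self.n
        self.cum = self.alpha * self.cum + (x - self.mean - self.delta)
        self.min_cum = min(self.min_cum, self.cum)
        return self.cum - self.min_cum > self.lam  # True signals a change
```

Setting alpha = 1 recovers the classical PHT; alpha < 1 shortens the detection delay after a change, at the cost of some sensitivity to noise.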

2017

Fading histograms in detecting distribution and concept changes

Authors
Sebastião, R; Gama, J; Mendonça, T;

Publication
I. J. Data Science and Analytics

Abstract

2017

Supporting the Page-Hinkley test with empirical mode decomposition for change detection

Authors
Sebastião, R; Fernandes, JM;

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract
In the dynamic scenarios faced nowadays, when handling non-stationary data streams it is of utmost importance to perform change detection tests. In this work, we propose the Intrinsic Page-Hinkley Test (iPHT), which enhances the Page-Hinkley Test (PHT) by eliminating its user-defined parameter (the allowed magnitude of change in the data that is not considered a real distribution change of the data stream), replacing it with the second-order intrinsic mode function (IMF), a data-dependent value reflecting the intrinsic variation of the data. In this way, the PHT change detection method is expected to be more robust and to require less tuning. Furthermore, we extend the proposed iPHT to a blockwise approach, computing the IMF over sliding windows, which is shown to be more responsive to changes and suitable for online settings. The iPHT is evaluated using artificial and real data, outperforming the PHT. © Springer International Publishing AG 2017.
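The core idea, a PHT whose allowed-change parameter is derived from the data itself, can be sketched as follows. Note the substitution: the paper derives the data-dependent value from the second intrinsic mode function of an empirical mode decomposition, which is out of scope here; this sketch stands in a sliding-window mean absolute successive difference as a crude proxy for intrinsic variation. Function name, window size, and threshold are likewise illustrative.

```python
from collections import deque

def pht_adaptive_delta(stream, lam=30.0, window=50):
    """Page-Hinkley test whose delta is estimated from the data instead of
    being user-defined, in the spirit of the iPHT. The intrinsic-variation
    estimate here (windowed mean absolute successive difference) is a
    stand-in, NOT the paper's EMD-based method."""
    recent = deque(maxlen=window)  # recent successive differences
    mean, n, cum, min_cum, prev = 0.0, 0, 0.0, 0.0, None
    alarms = []
    for t, x in enumerate(stream):
        if prev is not None:
            recent.append(abs(x - prev))
        prev = x
        delta = sum(recent) / len(recent) if recent else 0.0
        n += 1
        mean += (x - mean) / n
        cum += x - mean - delta        # deviation beyond intrinsic variation
        min_cum = min(min_cum, cum)
        if cum - min_cum > lam:
            alarms.append(t)
            cum, min_cum = 0.0, 0.0    # reset the test after a detection
    return alarms
```

Because delta tracks the stream's own variability, a noisy but stationary signal raises the tolerance automatically, which is the robustness-with-less-tuning property the abstract claims for the iPHT.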
