Publications

Publications by LIAAD

2005

Partition incremental discretization

Authors
Pinto, C; Gama, J;

Publication
2005 Portuguese Conference on Artificial Intelligence, Proceedings

Abstract
In this paper we propose a new method to perform incremental discretization. The approach splits the task into two layers. The first layer receives the sequence of input data and stores statistics about it, using a larger number of intervals than is usually required. The second layer generates the final discretization from the statistics stored by the first. The proposed architecture processes streaming examples in a single scan, in constant time and space, even for infinite sequences of examples. We demonstrate with examples that incremental discretization achieves better results than batch discretization while maintaining the performance of learning algorithms. The proposed method is well suited to evaluating incremental algorithms and to problems where data flows continuously, as in most recent data mining applications.
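The two-layer idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the class name, the equal-width layer-1 histogram, the equal-frequency layer-2 merge, and all interval counts are our own assumptions.

```python
class TwoLayerDiscretizer:
    """Sketch of a two-layer incremental discretizer: layer 1 maintains
    counts over many fine-grained intervals in a single pass; layer 2
    merges them on demand into the requested number of final bins."""

    def __init__(self, lo, hi, fine_bins=200, final_bins=10):
        self.lo, self.hi = lo, hi
        self.fine_bins = fine_bins      # layer 1: more intervals than needed
        self.final_bins = final_bins    # layer 2: the discretization asked for
        self.counts = [0] * fine_bins   # layer-1 statistics

    def update(self, x):
        # Layer 1: constant-time, constant-space update per streaming example.
        if self.lo <= x < self.hi:
            width = (self.hi - self.lo) / self.fine_bins
            self.counts[int((x - self.lo) / width)] += 1

    def cut_points(self):
        # Layer 2: equal-frequency merge of the layer-1 counts.
        total = sum(self.counts)
        per_bin = total / self.final_bins
        width = (self.hi - self.lo) / self.fine_bins
        cuts, acc = [], 0
        for i, c in enumerate(self.counts):
            acc += c
            if acc >= per_bin and len(cuts) < self.final_bins - 1:
                cuts.append(self.lo + (i + 1) * width)
                acc = 0
        return cuts
```

Because layer 1 only keeps a fixed-size array of counts, memory and per-example time stay constant regardless of stream length, matching the single-scan property described in the abstract.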

2005

EKDB&W'05: Workshop on extraction of knowledge from databases and warehouses

Authors
Gama, J; Pires, JM; Cardoso, M; Marques, NC; Cavique, L;

Publication
2005 Portuguese Conference on Artificial Intelligence, Proceedings

Abstract

2005

Bias management of Bayesian network classifiers

Authors
Castillo, G; Gama, J;

Publication
DISCOVERY SCIENCE, PROCEEDINGS

Abstract
The purpose of this paper is to describe an adaptive algorithm for improving the performance of Bayesian Network Classifiers (BNCs) in an on-line learning framework. Instead of choosing a particular class of BNCs a priori, our adaptive algorithm scales up the model's complexity by gradually increasing the number of allowable dependencies among features. Starting with the simple Naive Bayes structure, it uses simple decision rules based on qualitative information about the performance dynamics to decide when it makes sense to take the next step in the spectrum of feature dependencies and to start searching for a more complex classifier. Results of experiments conducted with the class of Dependence Bayesian Classifiers on three large datasets show that our algorithm is able to select a model with the appropriate complexity for the current amount of training data, thus balancing the computational cost of updating a model against the benefits of increased accuracy.
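The "decide when to move to a more complex classifier" step could be sketched as a simple plateau test on on-line accuracy. Everything here is a hypothetical illustration: the function name, the window size, the gain threshold, and the plateau rule are our own assumptions, not the paper's decision rules.

```python
def should_increase_complexity(recent_accuracies, window=100, min_gain=0.001):
    """Toy decision rule: if mean accuracy over the most recent window has
    stopped improving relative to the window before it, signal that it may
    be time to allow one more feature dependency in the classifier."""
    if len(recent_accuracies) < 2 * window:
        return False  # not enough evidence yet
    prev = sum(recent_accuracies[-2 * window:-window]) / window
    curr = sum(recent_accuracies[-window:]) / window
    return (curr - prev) < min_gain
```

The point of such a rule, as in the abstract, is that model complexity grows only when the extra training data no longer improves the current, cheaper model.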

2005

Introduction

Authors
Gama, J; Pires, JM; Cardoso, M; Marques, NC; Cavique, L;

Publication
Progress in Artificial Intelligence, 12th Portuguese Conference on Artificial Intelligence, EPIA 2005, Covilhã, Portugal, December 5-8, 2005, Proceedings

Abstract

2005

Lecture Notes in Artificial Intelligence: Introduction

Authors
Gama, J; Moura Pires, J; Cardoso, M; Marques, NC; Cavique, L;

Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Abstract

2005

Extracting knowledge from databases and warehouses (EKDB&W 2005) - Introduction

Authors
Gama, J; Moura Pires, J; Cardoso, M; Marques, NC; Cavique, L;

Publication
PROGRESS IN ARTIFICIAL INTELLIGENCE, PROCEEDINGS

Abstract
