1995
Authors
Gama, J; Brazdil, P;
Publication
Progress in Artificial Intelligence, 7th Portuguese Conference on Artificial Intelligence, EPIA '95, Funchal, Madeira Island, Portugal, October 3-6, 1995, Proceedings
Abstract
This paper is concerned with the problem of characterizing classification algorithms. The aim is to determine under what circumstances a particular classification algorithm is applicable. The method used involves the generation of different kinds of models, including regression models, rule models, piecewise linear models (model trees) and instance-based models. These are generated automatically on the basis of dataset characteristics and the given test results. The lack of data is compensated for by various types of preprocessing. The models obtained are characterized by quantifying their predictive capability, and the best models are identified. © Springer-Verlag Berlin Heidelberg 1995.
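A minimal sketch of the underlying idea, assuming hypothetical meta-features (number of examples, number of attributes, class entropy) and a plain least-squares regression model; the paper's own model-generation methods (rule models, model trees, instance-based models) are richer than this illustration.

import numpy as np

# Hypothetical meta-feature table: one row per dataset
# (number of examples, number of attributes, class entropy),
# plus the measured error of a given classifier on each dataset.
meta_features = np.array([
    [150,   4, 1.58],
    [1000, 20, 0.99],
    [300,   8, 2.10],
    [5000, 35, 0.72],
])
observed_error = np.array([0.05, 0.21, 0.17, 0.30])

# Fit a simple linear regression model (least squares) that predicts
# the classifier's error from the dataset characteristics.
X = np.column_stack([np.ones(len(meta_features)), meta_features])
coef, *_ = np.linalg.lstsq(X, observed_error, rcond=None)

# Predict the error expected on a new, unseen dataset.
new_dataset = np.array([1.0, 800, 12, 1.30])
print("predicted error:", float(new_dataset @ coef))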
1997
Authors
Torgo, L; Gama, J;
Publication
Machine Learning: ECML-97
Abstract
We present a methodology that enables the use of classification algorithms on regression tasks. We implement this method in the RECLA system, which transforms a regression problem into a classification one and then uses an existing classification system to solve the new problem. The transformation consists of mapping a continuous variable into an ordinal variable by grouping its values into an appropriate set of intervals. We use misclassification costs as a means of reflecting the implicit ordering among the ordinal values of the new variable. We describe a set of alternative discretization methods and, based on our experimental results, justify the need for a search-based approach to choosing the best method. Our experimental results confirm the validity of our search-based approach to class discretization and reveal the accuracy benefits of adding misclassification costs.
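A minimal sketch of the class-discretization step, assuming equal-frequency intervals and a cost matrix based on the distance between interval medians; the function names below are illustrative, not RECLA's actual implementation, and RECLA searches over several such discretizations.

import numpy as np

def discretize_target(y, n_bins=4):
    """Map a continuous target into ordered class labels using
    equal-frequency intervals (one of several possible discretizations)."""
    quantiles = np.quantile(y, np.linspace(0, 1, n_bins + 1))
    # np.digitize assigns each value to an interval; clip keeps the
    # maximum value inside the last bin.
    labels = np.clip(np.digitize(y, quantiles[1:-1]), 0, n_bins - 1)
    medians = np.array([np.median(y[labels == k]) for k in range(n_bins)])
    return labels, medians

def misclassification_costs(medians):
    """Cost of predicting class j when the true class is i, reflecting
    the ordering of the intervals: distant intervals cost more."""
    return np.abs(medians[:, None] - medians[None, :])

y = np.array([3.1, 4.7, 5.0, 6.2, 7.8, 8.1, 9.9, 12.4])
labels, medians = discretize_target(y, n_bins=3)
print(labels)
print(misclassification_costs(medians))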
2009
Authors
Huang, R; Yang, Q; Pei, J; Gama, J; Meng, X; Li, X;
Publication
Lecture Notes in Computer Science
Abstract
2010
Authors
Gaber, MM; Vatsavai, RR; Omitaomu, OA; Gama, J; Chawla, NV; Ganguly, AR;
Publication
Lecture Notes in Computer Science
Abstract
2001
Authors
Gama, J;
Publication
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Abstract
The design of algorithms that explore multiple representation languages and different search spaces has an intuitive appeal. In the context of classification problems, algorithms that generate multivariate trees are able to explore multiple representation languages by using decision tests based on combinations of attributes. The same applies to model tree algorithms in regression domains, which use linear models at leaf nodes. In this paper we study where to use combinations of attributes in regression and classification tree learning. We present an algorithm for multivariate tree learning that combines a univariate decision tree with a linear function by means of constructive induction. This algorithm is able to use decision nodes with multivariate tests, and leaf nodes that make predictions using linear functions. Multivariate decision nodes are built when growing the tree, while functional leaves are built when pruning the tree. The algorithm has been implemented for both classification and regression problems. The experimental evaluation shows that our algorithm has clear advantages in generalization ability when compared against its components and two simplified versions, and it competes well against the state of the art in multivariate regression and classification trees. © Springer-Verlag Berlin Heidelberg 2001.
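A minimal sketch of the leaf side of this idea, assuming a hypothetical node structure in which internal nodes hold a univariate test and leaves hold a fitted linear function; the constructive induction, multivariate decision nodes and pruning machinery of the paper are not reproduced here.

import numpy as np

class Node:
    """A tiny model tree: internal nodes hold a univariate test,
    leaves hold a linear model fitted on the examples they receive."""
    def __init__(self, attribute=None, threshold=None,
                 left=None, right=None, coef=None):
        self.attribute = attribute
        self.threshold = threshold
        self.left = left
        self.right = right
        self.coef = coef          # set only at functional leaves

def fit_leaf(X, y):
    """Fit a linear function at a leaf (least squares with intercept)."""
    A = np.column_stack([np.ones(len(X)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return Node(coef=coef)

def predict(node, x):
    """Route an example down the univariate tests, then apply the leaf model."""
    if node.coef is not None:
        return float(np.concatenate(([1.0], x)) @ node.coef)
    branch = node.left if x[node.attribute] <= node.threshold else node.right
    return predict(branch, x)

# Toy example: split on attribute 0 at 0.5, a linear model in each leaf.
X = np.array([[0.1, 2.0], [0.3, 1.0], [0.7, 3.0], [0.9, 4.0]])
y = np.array([1.0, 0.8, 3.5, 4.2])
root = Node(attribute=0, threshold=0.5,
            left=fit_leaf(X[:2], y[:2]),
            right=fit_leaf(X[2:], y[2:]))
print(predict(root, np.array([0.8, 3.5])))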
2010
Authors
Shultz, TR; Fahlman, SE; Craw, S; Andritsos, P; Tsaparas, P; Silva, R; Drummond, C; Ling, CX; Sheng, VS; Drummond, C; Lanzi, PL; Gama, J; Wiegand, RP; Sen, P; Namata, G; Bilgic, M; Getoor, L; He, J; Jain, S; Stephan, F; Jain, S; Stephan, F; Sammut, C; Harries, M; Sammut, C; Ting, KM; Pfahringer, B; Case, J; Jain, S; Wagstaff, KL; Nijssen, S; Wirth, A; Ling, CX; Sheng, VS; Zhang, X; Sammut, C; Cancedda, N; Renders, J; Michelucci, P; Oblinger, D; Keogh, E; Mueen, A;
Publication
Encyclopedia of Machine Learning
Abstract