Handling Continuous Attributes in an Evolutionary Inductive Learner.

F. Divina, E. Marchiori

Research output: Contribution to Journal › Article › Academic › peer-review

Abstract

This paper experimentally analyzes discretization algorithms for handling continuous attributes in evolutionary learning. We consider a learning system that induces a set of rules in a fragment of first-order logic (evolutionary inductive logic programming), and introduce a method in which a given discretization algorithm is used to generate initial inequalities describing subranges of attribute values. Mutation operators exploiting information on the class labels of the examples (supervised discretization) are used during the learning process to refine the inequalities. The evolutionary learning system is used as a platform for experimentally testing four algorithms: two variants of the proposed method, a popular supervised discretization algorithm applied prior to induction, and a discretization method that does not use information on the class labels of the examples (unsupervised discretization). Results of experiments conducted on artificial and real-life datasets suggest that the proposed method provides an effective and robust technique for handling continuous attributes by means of inequalities. © 2005 IEEE.
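The abstract contrasts supervised discretization (which uses class labels to place cut points) with unsupervised discretization, and mentions mutation operators that refine inequality thresholds using class information. The following minimal sketch illustrates these general ideas only; it is not the authors' system, and all function names (equal_width_cuts, boundary_cuts, mutate_threshold) and the toy data are hypothetical illustrations.

```python
# Sketch (assumed, not the paper's implementation): unsupervised vs. supervised
# discretization of one continuous attribute, plus a class-informed mutation
# that refines an inequality threshold toward a supervised cut point.
import random


def equal_width_cuts(values, k=4):
    """Unsupervised discretization: k equal-width intervals, labels ignored."""
    lo, hi = min(values), max(values)
    step = (hi - lo) / k
    return [lo + i * step for i in range(1, k)]


def boundary_cuts(values, labels):
    """Supervised discretization: candidate cut points between adjacent
    examples whose class labels differ (boundary points)."""
    pairs = sorted(zip(values, labels))
    cuts = []
    for (v1, c1), (v2, c2) in zip(pairs, pairs[1:]):
        if c1 != c2 and v1 != v2:
            cuts.append((v1 + v2) / 2)
    return cuts


def mutate_threshold(threshold, values, labels, rng=random):
    """Class-informed mutation: move an inequality threshold to one of the
    nearest supervised cut points rather than perturbing it blindly."""
    cuts = boundary_cuts(values, labels)
    if not cuts:
        return threshold
    nearest = sorted(cuts, key=lambda c: abs(c - threshold))[:3]
    return rng.choice(nearest)


if __name__ == "__main__":
    rng = random.Random(0)
    # Toy data: one continuous attribute, binary class correlated with value.
    values = [rng.uniform(0, 10) for _ in range(30)]
    labels = [int(v > 6.0) for v in values]
    print("unsupervised cuts:", equal_width_cuts(values))
    print("supervised cuts:  ", boundary_cuts(values, labels))
    # Refine an initial inequality "attr <= 3.0" toward a class boundary.
    print("mutated threshold:", mutate_threshold(3.0, values, labels, rng))
```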
Original language: English
Pages (from-to): 32-43
Journal: IEEE Transactions on Evolutionary Computation
Volume: 9
Issue number: 1
DOIs
Publication status: Published - 2005
