Automated Feature Selection in Neuroevolution

Maxine Yen Ling TAN, Michael Hartley, Michel Bister, Rudi Deklerck

Research output: Article › peer review

19 Citations (Scopus)


Feature selection is a task of great importance. Many feature selection methods have been proposed; they can be divided broadly into two groups according to their dependence on the learning algorithm or classifier. Recently, Whiteson et al. proposed Feature Selective NeuroEvolution of Augmenting Topologies (FS-NEAT), a method that selects features at the same time as it evolves the neural networks that use those features as inputs. In this paper, a novel feature selection method called Feature Deselective NeuroEvolution of Augmenting Topologies (FD-NEAT) is presented. FD-NEAT begins with fully connected inputs in its networks and drops irrelevant or redundant inputs as evolution progresses. The performances of FD-NEAT, FS-NEAT and traditional NEAT are compared on several mathematical problems and in a challenging race car simulator domain (RARS). Overall, the results show that FD-NEAT significantly outperforms FS-NEAT in terms of network performance and feature selection, and evolves networks that offer the best compromise between network size and performance.
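The core idea of FD-NEAT described in the abstract, starting fully connected and pruning input links during evolution, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the `Genome` class, the `drop_rate` parameter, and the mutation scheme are all assumptions made for illustration.

```python
import random

class Genome:
    """Hypothetical genome that tracks which input features remain connected."""

    def __init__(self, n_inputs):
        # FD-NEAT-style start: every input feature is initially connected.
        self.input_links = {i: True for i in range(n_inputs)}

    def mutate_drop_input(self, drop_rate=0.05):
        # Illustrative "deselection" mutation: with small probability,
        # sever an input link; inputs with no links are effectively
        # deselected features. (Rate is an assumption, not from the paper.)
        for i, active in self.input_links.items():
            if active and random.random() < drop_rate:
                self.input_links[i] = False

    def selected_features(self):
        # The feature subset this genome still uses as network inputs.
        return [i for i, active in self.input_links.items() if active]

if __name__ == "__main__":
    random.seed(0)
    g = Genome(n_inputs=10)
    for _ in range(50):  # stand-in for generations of evolution
        g.mutate_drop_input()
    print(g.selected_features())
```

In the actual method, such drop mutations would act on the connection genes of evolving NEAT networks, with fitness-based selection deciding which reduced feature sets survive; the sketch above only shows the pruning direction (full set shrinking over time), in contrast to FS-NEAT, which grows its input set from a sparse start.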
Original language: English
Pages (from-to): 271-292
Number of pages: 22
Journal: Evolutionary Intelligence
Status: Published - Feb 2009


