Normalized Mutual Information Feature Selection
- 13 January 2009
- Journal article
- Published by Institute of Electrical and Electronics Engineers (IEEE) in IEEE Transactions on Neural Networks
- Vol. 20 (2), 189-201
- https://doi.org/10.1109/tnn.2008.2005601
Abstract
A filter method of feature selection based on mutual information, called normalized mutual information feature selection (NMIFS), is presented. NMIFS is an enhancement over Battiti's MIFS, as well as the MIFS-U and mRMR methods. The average normalized mutual information is proposed as a measure of redundancy among features. NMIFS outperformed MIFS, MIFS-U, and mRMR on several artificial and benchmark data sets without requiring a user-defined parameter. In addition, NMIFS is combined with a genetic algorithm to form a hybrid filter/wrapper method called GAMIFS, which includes an initialization procedure and a mutation operator based on NMIFS to speed up the convergence of the genetic algorithm. GAMIFS overcomes the limitation of incremental search algorithms, which are unable to find dependencies between groups of features.
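The greedy selection scheme the abstract describes can be sketched as follows. This is a minimal illustration, assuming discrete-valued features and the min-entropy normalization NI(f_i; f_s) = I(f_i; f_s) / min{H(f_i), H(f_s)} commonly attributed to NMIFS; the function names and the tiny data layout are illustrative, not taken from the paper.

```python
import math
from collections import Counter

def entropy(x):
    """Shannon entropy (in bits) of a discrete sequence."""
    n = len(x)
    return -sum((c / n) * math.log2(c / n) for c in Counter(x).values())

def mutual_info(x, y):
    """I(X;Y) = H(X) + H(Y) - H(X,Y) for discrete sequences."""
    return entropy(x) + entropy(y) - entropy(list(zip(x, y)))

def normalized_mi(x, y):
    """NI(X;Y) = I(X;Y) / min(H(X), H(Y)); defined as 0 for constant inputs."""
    h = min(entropy(x), entropy(y))
    return mutual_info(x, y) / h if h > 0 else 0.0

def nmifs(features, target, k):
    """Greedy forward selection: at each step pick the candidate f_i that
    maximizes I(C; f_i) minus the average NI(f_i; f_s) over selected f_s."""
    candidates = list(range(len(features)))
    # Seed with the single feature most informative about the class.
    first = max(candidates, key=lambda i: mutual_info(features[i], target))
    selected = [first]
    candidates.remove(first)
    while len(selected) < k and candidates:
        def score(i):
            redundancy = sum(normalized_mi(features[i], features[s])
                             for s in selected) / len(selected)
            return mutual_info(features[i], target) - redundancy
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected
```

On a toy task with a duplicated feature, the redundancy penalty makes the method skip the duplicate in favor of a weakly relevant but non-redundant feature, which is the behavior the abstract highlights over plain MI ranking.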
References
- Maximally Informative Feature and Sensor Selection in Pattern Recognition Using Local and Global Independent Component Analysis. Journal of Signal Processing Systems, 2007
- AMIFS: adaptive feature selection by using mutual information. Published by Institute of Electrical and Electronics Engineers (IEEE), 2005
- Estimating mutual information. Physical Review E, 2004
- Relevant, Irredundant Feature Selection and Noisy Example Elimination. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 2004
- Input feature selection by mutual information based on Parzen window. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002
- Input feature selection for classification problems. IEEE Transactions on Neural Networks, 2002
- Partial BFGS Update and Efficient Step-Length Calculation for Three-Layer Neural Networks. Neural Computation, 1997
- Using mutual information for selecting features in supervised neural net learning. IEEE Transactions on Neural Networks, 1994
- A note on genetic algorithms for large-scale feature selection. Pattern Recognition Letters, 1989
- Induction of decision trees. Machine Learning, 1986