Mutual Information: an Adequate Tool for Feature Selection?

Lecture
Date: 18 March 2014
Begin time: 14:15
Room: CITEC 2.017

In the field of machine learning, mutual information (MI) has been widely used as a multivariate criterion to select relevant features. However, we have shown in several of our recent works that selecting the feature subset which maximises MI is not necessarily optimal. This talk first introduces MI from an information-theoretic point of view. Then, the rationale for using MI in feature selection is discussed. We will study under which conditions this criterion can be safely used in classification and regression, showing that mutual information is in general a good choice, even though counterexamples can be constructed. Both theoretical and experimental results will support the discussion.
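
To make the criterion concrete, the following is a minimal sketch (not the speaker's code) of the MI-maximising subset criterion mentioned above: the mutual information I(X_S; Y) between a candidate feature subset and a discrete target is estimated from plug-in (histogram) counts, and all subsets of a fixed size are searched exhaustively. The XOR toy data and the names empirical_mi and best_subset are illustrative assumptions; the toy data shows why a multivariate criterion can succeed where each feature is individually uninformative.

    from collections import Counter
    from itertools import combinations

    import numpy as np


    def empirical_mi(columns: np.ndarray, y: np.ndarray) -> float:
        """Plug-in estimate of I(X_S; Y); rows of `columns` hold the subset's values."""
        n = len(y)
        joint = Counter((tuple(row), lab) for row, lab in zip(columns, y))
        px = Counter(tuple(row) for row in columns)
        py = Counter(y.tolist())
        return sum(
            (c / n) * np.log2((c / n) / ((px[xv] / n) * (py[yv] / n)))
            for (xv, yv), c in joint.items()
        )


    def best_subset(X: np.ndarray, y: np.ndarray, k: int) -> tuple:
        """Exhaustively search all size-k subsets and return the MI-maximising one."""
        return max(
            combinations(range(X.shape[1]), k),
            key=lambda s: empirical_mi(X[:, list(s)], y),
        )


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x1 = rng.integers(0, 2, 1000)    # relevant only jointly with x2
        x2 = rng.integers(0, 2, 1000)    # relevant only jointly with x1
        x3 = rng.integers(0, 2, 1000)    # irrelevant noise
        y = x1 ^ x2                      # XOR target: each feature alone carries ~0 bits
        X = np.column_stack([x1, x2, x3])
        print(best_subset(X, y, k=2))    # should print (0, 1): the XOR pair

Note that exhaustive search is used here only for clarity; it becomes intractable for more than a handful of features, and the plug-in MI estimate is itself biased for small samples, which is part of why maximising estimated MI is not always optimal in practice.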