    Dong Hongbin, Teng Xuyang, Yang Xue. Feature Selection Based on the Measurement of Correlation Information Entropy[J]. Journal of Computer Research and Development, 2016, 53(8): 1684-1695. DOI: 10.7544/issn1000-1239.2016.20160172

    Feature Selection Based on the Measurement of Correlation Information Entropy

    Feature selection aims to select a smaller feature subset from the original feature set that provides comparable or better performance in data mining and machine learning. Because the selected features keep their physical meaning rather than being transformed, a smaller feature set yields a more interpretable model. Traditional information-theoretic methods tend to measure feature relevance and redundancy separately and ignore the combined effect of the whole feature subset. In this paper, correlation information entropy, a technique from data fusion, is applied to feature selection to measure the degree of independence and redundancy among features. A correlation matrix is constructed from the mutual information between features and their class labels and from the combinations of feature pairs; its eigenvalues are then computed so that the multivariate correlation among the features in a subset is taken into account. On this basis, a feature-ranking algorithm and an adaptive, parameterized feature-subset selection algorithm are proposed. Experimental results demonstrate the effectiveness and efficiency of the proposed algorithms on classification tasks.
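    The eigenvalue-based measure described above can be illustrated with a minimal sketch. The formula below is the standard correlation information entropy from the multi-sensor data fusion literature (assumed here; the paper's exact construction of the correlation matrix from mutual information may differ): given an n-by-n correlation matrix R with eigenvalues λ_i, the entropy is H_R = −Σ (λ_i/n)·log_n(λ_i/n). It equals 1 when features are fully independent and 0 when they are fully redundant.

    ```python
    import numpy as np

    def correlation_information_entropy(R):
        """Correlation information entropy of an n x n symmetric correlation matrix R.

        Uses the data-fusion formulation (an assumption, not verbatim from
        the paper): H_R = -sum_i (lambda_i / n) * log_n(lambda_i / n),
        where zero eigenvalues contribute nothing.
        """
        n = R.shape[0]
        eigvals = np.linalg.eigvalsh(R)      # eigenvalues of a symmetric matrix
        p = eigvals / n                       # normalize so the terms sum to 1
        p = p[p > 1e-12]                      # drop (numerically) zero eigenvalues
        return float(-np.sum(p * (np.log(p) / np.log(n))))

    # Fully independent features: identity correlation matrix, maximum entropy.
    print(correlation_information_entropy(np.eye(3)))        # -> 1.0
    # Fully redundant features: rank-one all-ones matrix, zero entropy.
    print(correlation_information_entropy(np.ones((3, 3))))  # -> 0.0
    ```

    A subset whose entropy is close to 1 contains mostly independent (non-redundant) features, which is the property the selection algorithm seeks to balance against relevance to the class label.
    
    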
