    Wang Jun, Wei Jinmao, Zhang Lu. Multi-Task Feature Learning Algorithm Based on Preserving Classification Information[J]. Journal of Computer Research and Development, 2017, 54(3): 537-548. DOI: 10.7544/issn1000-1239.2017.20150963

    Multi-Task Feature Learning Algorithm Based on Preserving Classification Information

    In pattern recognition, feature selection is an effective technique for dimensionality reduction, and feature evaluation criteria are used to assess the importance of features. However, existing criteria have several shortcomings. First, they focus almost exclusively on class separability and ignore class correlation information during selection. Second, they can hardly reduce feature redundancy that is specific to classification. Third, they are typically univariate measures and therefore cannot achieve global optimality over a feature subset. In this work, we introduce a novel feature evaluation criterion called CIP (classification information preserving). CIP is built on the idea of preserving classification information and is formulated and solved with multi-task learning techniques. CIP is a feature subset selection method: it minimizes, under the Frobenius norm, the difference in classification information between the selected feature subset and the original data, while an l2,1 norm constrains the number of selected features. The optimal solution is then obtained within the framework of the proximal alternating direction method. Both theoretical analysis and experimental results demonstrate that the feature subset selected by CIP maximally preserves the original class correlation information and effectively reduces classification-specific feature redundancy. A minimal sketch of this style of objective is given below.
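    The abstract does not give the exact objective, so the following is only a hedged sketch: it assumes a common multi-task feature selection form, min over W of ||XW - Y||_F^2 + lam*||W||_{2,1}, where Y is a one-hot label matrix standing in for the "classification information" and row sparsity of W induces feature selection. It is solved here with plain proximal gradient descent rather than the paper's proximal alternating direction method; the function and parameter names (cip_like_feature_selection, lam, n_iter) are hypothetical, not taken from the paper.

    ```python
    import numpy as np

    def l21_prox(W, t):
        """Row-wise soft-thresholding: proximal operator of t * ||W||_{2,1}."""
        norms = np.linalg.norm(W, axis=1, keepdims=True)
        scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
        return W * scale

    def cip_like_feature_selection(X, Y, lam=0.1, n_iter=200):
        """Sketch of an l2,1-regularised multi-task objective (assumed form):
               min_W ||X W - Y||_F^2 + lam * ||W||_{2,1}
        X: (n_samples, n_features); Y: (n_samples, n_classes) one-hot labels.
        Solved with proximal gradient descent, not the paper's proximal
        alternating direction method. Rows of W with large norm mark
        the selected features."""
        d, k = X.shape[1], Y.shape[1]
        W = np.zeros((d, k))
        # Lipschitz constant of the gradient of ||XW - Y||_F^2 is 2 * sigma_max(X)^2
        L = 2.0 * np.linalg.norm(X, 2) ** 2
        for _ in range(n_iter):
            grad = 2.0 * X.T @ (X @ W - Y)          # gradient of the data-fit term
            W = l21_prox(W - grad / L, lam / L)     # proximal step on the l2,1 term
        scores = np.linalg.norm(W, axis=1)          # per-feature importance
        return np.argsort(-scores), W

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.standard_normal((100, 20))
        labels = (X[:, 0] + X[:, 3] > 0).astype(int)
        Y = np.eye(2)[labels]                       # one-hot label matrix
        ranking, _ = cip_like_feature_selection(X, Y, lam=0.5)
        print("top features:", ranking[:5])
    ```

    The row-wise l2,1 penalty zeroes out entire rows of W, so a feature is either kept or discarded jointly across all classification tasks, which is the multi-task flavour the abstract describes.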