    Supervised Laplacian Discriminant Analysis for Small Sample Size Problem with Its Application to Face Recognition

    Abstract: Extracting effective features plays a key role in the classification of high-dimensional data. Unsupervised discriminant projection (UDP), which maximizes the ratio of non-local scatter to local scatter, performs well in dimensionality reduction and feature extraction, but it is an unsupervised algorithm and suffers from the singularity, or small sample size (SSS), problem. To address these issues, supervised Laplacian discriminant analysis (SLDA) is proposed, which incorporates class information when computing the local and non-local scatter matrices. The null space of the total Laplacian scatter matrix is first discarded, the intra-class Laplacian scatter matrix is then projected onto the range space of the total Laplacian scatter matrix, and the solution is reduced to an eigenproblem in that space, so the SSS problem is avoided. Theoretical analysis shows that no discriminative information is lost and that the algorithm is computationally efficient. Experiments on face recognition confirm the correctness and effectiveness of the proposed algorithm.
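
Below is a minimal NumPy sketch of the procedure the abstract describes, not the authors' implementation. Assumptions: the within-class (local) and between-class (non-local) graphs use simple binary same-class/different-class weights rather than the paper's exact neighborhood weighting, and the final step maximizes the ratio of non-local to total scatter, which has the same eigenvectors as the non-local/local ratio but keeps the denominator non-singular. All names (`slda_fit`, `S_local`, `S_nonlocal`) are illustrative.

```python
import numpy as np
from scipy.linalg import eigh


def slda_fit(X, y, n_components, tol=1e-10):
    """Sketch of a supervised Laplacian discriminant projection.

    X : (d, n) data matrix, one sample per column (d may greatly exceed n).
    y : (n,) integer class labels.
    Returns a (d, n_components) projection matrix.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)

    # Supervised affinities: binary same-class (local) and different-class
    # (non-local) weights -- an assumption; heat-kernel or kNN-restricted
    # weights could be substituted without changing the overall flow.
    same = (y[:, None] == y[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)
    W_local = same
    W_nonlocal = 1.0 - same
    np.fill_diagonal(W_nonlocal, 0.0)

    # Graph Laplacians and the corresponding Laplacian scatter matrices.
    L_local = np.diag(W_local.sum(axis=1)) - W_local
    L_nonlocal = np.diag(W_nonlocal.sum(axis=1)) - W_nonlocal
    S_local = X @ L_local @ X.T        # intra-class (local) Laplacian scatter
    S_nonlocal = X @ L_nonlocal @ X.T  # non-local Laplacian scatter
    S_total = S_local + S_nonlocal     # total Laplacian scatter

    # Step 1: discard the null space of the total Laplacian scatter by
    # keeping only eigenvectors with non-negligible eigenvalues.
    evals, evecs = np.linalg.eigh(S_total)
    keep = evals > tol * evals.max()
    U = evecs[:, keep]                 # basis of the range space
    S_total_p = np.diag(evals[keep])   # total scatter expressed in that basis

    # Step 2: project the non-local scatter onto the range space, where the
    # total scatter is non-singular, so the SSS problem no longer arises.
    S_nonlocal_p = U.T @ S_nonlocal @ U

    # Step 3: maximizing non-local/local scatter has the same eigenvectors as
    # maximizing non-local/total scatter (eigenvalues are mapped monotonically
    # by r -> r / (1 + r)), and the latter denominator is positive definite.
    w, V = eigh(S_nonlocal_p, S_total_p)
    order = np.argsort(w)[::-1][:n_components]
    return U @ V[:, order]             # map the directions back to input space
```

A toy usage in the small-sample-size regime, where the dimension far exceeds the number of training samples:

```python
# Toy SSS setting: 1024-dimensional vectors (e.g. 32x32 face images),
# 10 subjects with 6 training images each, reduced to 20 features.
rng = np.random.default_rng(0)
X = rng.standard_normal((1024, 60))
y = np.repeat(np.arange(10), 6)
P = slda_fit(X, y, n_components=20)
features = P.T @ X                     # (20, 60) discriminant features
```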
