Abstract:
Manifold learning is crucial in many research fields, such as pattern recognition, data mining, and computer vision. However, little work has focused on developing a common framework that unifies the various approaches. Meanwhile, since Laplacian eigenmap (LE) is a local manifold learning approach, it is very sensitive to the neighborhood size. Taking all kinds of manifold learning approaches into account, a novel unified manifold learning framework is proposed in this paper. It consists of two functional items, i.e., the maintaining item and the expecting item. Most approaches can be analyzed and improved within this framework. For illustration, LE is analyzed within the proposed framework, and an improved Laplacian eigenmap (ILE) is then presented. ILE is mainly based on LE and maximum variance unfolding (MVU). The local character of the graph Laplacian, which corresponds to the maintaining item, is preserved, while the variances between pairs of points, which correspond to the expecting item, are maximized. ILE thus inherits the advantages of LE and MVU: compared with LE, it is less sensitive to the neighborhood size, and the overly strict local constraints of MVU are relaxed. Moreover, ILE maintains the clustering property and discovers the intrinsic character of the original data. Several experiments on both toy examples and real data sets are given for illustration.
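For context, the sketch below illustrates the standard LE baseline that the proposed framework analyzes (the "maintaining item" built from a neighborhood graph Laplacian); it is not the proposed ILE, and the neighborhood size k and heat-kernel width t are illustrative choices, not values from the paper.

```python
# Minimal sketch of standard Laplacian Eigenmaps (the LE baseline, not ILE).
import numpy as np
from sklearn.neighbors import kneighbors_graph
from scipy.sparse import csgraph


def laplacian_eigenmap(X, n_components=2, k=10, t=1.0):
    # k-nearest-neighbor graph with heat-kernel weights W_ij = exp(-||x_i - x_j||^2 / t);
    # this graph encodes the local structure that LE (and the maintaining item) preserves.
    W = kneighbors_graph(X, n_neighbors=k, mode='distance', include_self=False)
    W.data = np.exp(-W.data ** 2 / t)
    W = 0.5 * (W + W.T)                      # symmetrize the affinity matrix

    # Symmetric normalized graph Laplacian of the neighborhood graph.
    L = csgraph.laplacian(W, normed=True).toarray()

    # Eigenvectors of the smallest nontrivial eigenvalues give the low-dimensional embedding
    # (np.linalg.eigh returns eigenvalues in ascending order; column 0 is the trivial one).
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, 1:n_components + 1]


# Usage: embed 3-D sample points into 2 dimensions.
X = np.random.rand(500, 3)
Y = laplacian_eigenmap(X, n_components=2, k=10)
print(Y.shape)  # (500, 2)
```

Because the embedding depends entirely on the k-nearest-neighbor graph, the result changes noticeably with k, which is the sensitivity to the neighborhood size that ILE is designed to alleviate.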