Adaptive Neighborhood Selection Based on Local Linearity for Manifold Learning
Abstract
Recently, manifold learning has attracted extensive interest from researchers in machine learning, pattern recognition, computer vision, and related communities. Constructing a reasonable neighborhood for each point is a key issue in manifold learning. Conventional neighborhood selection algorithms, such as k-nearest neighbors (k-NN) and ε-neighborhood, require the neighborhood parameter to be specified manually beforehand, and most manifold learning algorithms are sensitive to this parameter. In this paper, an adaptive neighborhood selection algorithm based on local linearity, called ANSLL, is presented. First, two principles of neighborhood construction shared by most existing manifold learning algorithms are summarized: 1) data points in the same neighborhood should approximately lie on a d-dimensional linear subspace (d is the dimensionality of the data manifold), and 2) each neighborhood should be as large as possible. Based on these two principles, ANSLL uses principal component analysis (PCA) to measure the linearity of a finite set of data points and constructs the neighborhood of each point via neighborhood contraction or expansion. In addition, an improved method of constructing the neighborhood graph is proposed, which increases the accuracy of geodesic distance estimation for isometric embedding. Finally, extensive and systematic experiments on both synthetic and real-world data sets demonstrate that ANSLL can adaptively tune neighborhood sizes according to the local curvature of the data manifold and thereby improve the performance of most existing manifold learning algorithms, such as Isomap and LLE.
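The abstract does not spell out the exact linearity criterion or the contraction/expansion rules, so the following Python sketch only illustrates one plausible reading: local linearity is measured by the fraction of variance captured by the top-d principal components, and the neighborhood is shrunk or grown against a threshold. The names linearity, adaptive_neighborhood, eta, and k_init are illustrative assumptions, not notation from the paper.

```python
import numpy as np

def linearity(points, d):
    """Fraction of total variance captured by the top-d principal components.

    A value close to 1 means the points approximately lie on a
    d-dimensional linear subspace.
    """
    centered = points - points.mean(axis=0)
    # Squared singular values are proportional to the PCA eigenvalues.
    s = np.linalg.svd(centered, compute_uv=False)
    var = s ** 2
    return var[:d].sum() / var.sum()

def adaptive_neighborhood(X, i, d, eta=0.95, k_init=12):
    """Hypothetical contraction/expansion loop for point X[i].

    Starts from the k_init nearest neighbors, shrinks the neighborhood
    until the linearity criterion is met, then greedily grows it while
    the criterion continues to hold.
    """
    dist = np.linalg.norm(X - X[i], axis=1)
    order = np.argsort(dist)           # order[0] is the point itself
    k = k_init
    # Contraction: drop the farthest neighbors until locally linear.
    while k > d + 1 and linearity(X[order[:k + 1]], d) < eta:
        k -= 1
    # Expansion: add the next-nearest point while linearity is preserved.
    while k + 1 < len(X) and linearity(X[order[:k + 2]], d) >= eta:
        k += 1
    return order[1:k + 1]              # neighbor indices, excluding i itself
```

In this sketch each point can end up with a different neighborhood size: points in flat regions of the manifold keep large neighborhoods, while points in highly curved regions are contracted to smaller ones, which matches the adaptive behavior the abstract describes.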