Abstract:
A new algorithm, iso-neighborhood projection (ISONP), is proposed for finding succinct representations in a supervised manner. Motivated by the observation that each class has its own distinctive properties and is independent of the other classes, the recognition problem is cast as finding a highly symmetric iso-neighborhood for each class, which differs fundamentally from traditional linear methods such as principal component analysis (PCA), linear discriminant analysis (LDA), and locality preserving projection (LPP). These traditional methods find a single optimal projection and map the training data into a lower-dimensional space in a batch. Given labeled input data, ISONP instead discovers basis functions that map each sample into its corresponding neighborhood while preserving the intrinsic structure of each class. These basis functions span the lower-dimensional subspace and can be computed by solving a convex optimization problem: an L2-constrained least squares problem. To recognize a test sample, the new data point is mapped into the spanned subspace and compared only with the centers of the iso-neighborhoods rather than with all training samples, which speeds up recognition. Experiments are conducted on several data sets, and the results demonstrate the competitiveness of the proposed algorithm.
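A minimal sketch of the recognition step outlined above, assuming a projection matrix `P` and per-class iso-neighborhood centers `centers` obtained from the L2-constrained least-squares training stage; the variable names and function signature are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def recognize(x_test, P, centers):
    """Assign a test sample to the class with the nearest iso-neighborhood center.

    x_test  : (d,)  test sample in the original feature space (assumed shape)
    P       : (d, k) learned projection spanning the lower-dimensional subspace
    centers : (num_classes, k) iso-neighborhood centers in the subspace
    """
    z = x_test @ P                                # map the sample into the spanned subspace
    dists = np.linalg.norm(centers - z, axis=1)   # distance to each class center only,
                                                  # rather than to every training sample
    return int(np.argmin(dists))                  # predicted class index
```

Because only one distance per class is evaluated at test time, the cost of recognition grows with the number of classes rather than with the number of training samples, which is the source of the speed-up claimed in the abstract.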