Wang Qinggang, Li Jianwei. Fast Relaxed Algorithms of Maximum Variance Unfolding[J]. Journal of Computer Research and Development, 2009, 46(6): 988-994.
1(Ministry of Education Key Laboratory of Optoelectronic Technology and Systems, College of Optoelectronic Engineering, Chongqing University, Chongqing 400030) 2(Mobile Professor Center, Chongqing Institute of Technology, Chongqing 400050)
Nonlinear dimensionality reduction is a challenging problem encountered in many areas of high-dimensional data analysis, including machine learning, pattern recognition, scientific visualization, and neural computation. Based on the notion of local isometry, maximum variance unfolding (MVU) has recently been proposed for nonlinear dimensionality reduction. By pulling the input patterns as far apart as possible subject to strict local distance-preserving constraints, MVU can learn a faithful low-dimensional structure of a manifold embedded in a high-dimensional space. However, the final optimization of MVU requires solving a semidefinite programming (SDP) problem, and its huge computational and memory cost makes MVU infeasible for large-scale problems. To address this, a fast algorithm for MVU, called relaxed MVU (RMVU), is proposed. In RMVU, following the approximate local-structure-preservation idea of Laplacian eigenmaps, the strict local distance-preserving constraints of MVU are relaxed; the optimization then reduces to a generalized eigen-decomposition problem, and the computational burden is significantly reduced. For even larger data sets, an improved RMVU, called landmark-based relaxed MVU (LRMVU), is also proposed. Theoretical analysis and experiments on an artificial data set and real images show that the proposed algorithms RMVU and LRMVU greatly reduce the computational and memory complexity of MVU.
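The abstract states that relaxing MVU's strict distance constraints, in the spirit of Laplacian eigenmaps, turns the SDP into a generalized eigen-decomposition. The sketch below illustrates only that general idea: it builds a k-nearest-neighbor graph with heat-kernel weights and embeds the data by solving a generalized eigenproblem, as Laplacian eigenmaps does. It is not the authors' RMVU or LRMVU formulation; the neighborhood size, kernel width, and the SciPy solver call are assumptions made for illustration.

```python
# Minimal sketch of a Laplacian-eigenmaps-style relaxation: approximate local
# structure preservation solved as a generalized eigen-decomposition, instead of
# MVU's semidefinite program.  Illustrative only, not the paper's RMVU algorithm.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def relaxed_local_embedding(X, n_components=2, k=10, sigma=1.0):
    """Embed X (n_samples x n_features) by approximately preserving
    local neighborhood structure via a generalized eigenproblem."""
    n = X.shape[0]
    D = cdist(X, X)                        # pairwise Euclidean distances
    W = np.zeros((n, n))
    for i in range(n):                     # k-NN graph with heat-kernel weights
        nbrs = np.argsort(D[i])[1:k + 1]   # skip the point itself
        W[i, nbrs] = np.exp(-D[i, nbrs] ** 2 / (2.0 * sigma ** 2))
    W = np.maximum(W, W.T)                 # symmetrize the graph
    Dmat = np.diag(W.sum(axis=1))          # degree matrix
    L = Dmat - W                           # graph Laplacian
    # generalized eigenproblem L v = lambda D v; take the smallest nonzero modes
    vals, vecs = eigh(L, Dmat)
    return vecs[:, 1:n_components + 1]     # drop the constant eigenvector

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))          # toy data set
    Y = relaxed_local_embedding(X, n_components=2, k=8)
    print(Y.shape)                         # (200, 2)
```

Because the cost is dominated by a sparse graph construction and one eigen-decomposition rather than an SDP over an n-by-n Gram matrix, this style of relaxation scales to far larger n, which is the motivation the abstract gives for RMVU and, with landmark points, LRMVU.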