He Wenbin, Liu Qunfeng, Xiong Jinzhi. The Error Theory of Polynomial Smoothing Functions for Support Vector Machines[J]. Journal of Computer Research and Development, 2016, 53(7): 1576-1585. DOI: 10.7544/issn1000-1239.2016.20148462

The Error Theory of Polynomial Smoothing Functions for Support Vector Machines

More Information
  • Published Date: June 30, 2016
  • Abstract: Smoothing functions play an important role in the theory of smooth support vector machines. In 1996, Chen et al. proposed a smoothing function for support vector machines, namely the integral of the sigmoid function, and solved the error problem for that smoothing function. From 2005 to 2009, Yuan, Xiong and Liu proposed an infinite family of polynomial smoothing functions, together with the corresponding reformulations, for support vector machines. However, they did not address the error functions of this class of polynomial smoothing functions. To fill this gap, this paper studies the error functions using the Newton-Hermite interpolation method. The results show that: 1) the error functions of this class of polynomial smoothing functions can be computed by the Newton-Hermite interpolation method, and a detailed algorithm is given; 2) there are infinitely many error functions for this class of polynomial smoothing functions, and a general formula describing them is obtained; 3) these error functions possess several important properties, for which rigorous proofs are given. By solving the problem of the error functions and their properties, this paper establishes an error theory for this class of polynomial smoothing functions, providing basic theoretical support for smooth support vector machines.
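The paper's polynomial smoothing functions and their error functions are derived in the full text and are not reproduced here. As an illustration of the kind of error analysis the abstract describes, the sketch below measures how far Chen's sigmoid-integral smoothing function, p(x, α) = x + ln(1 + e^(-αx))/α, deviates from the plus function x₊ = max(x, 0); its maximum deviation is ln(2)/α, attained at x = 0. The value of α and the grid are arbitrary choices for illustration only.

```python
import numpy as np

def plus(x):
    """The plus function x_+ = max(x, 0), whose smooth approximation
    underlies smooth support vector machines."""
    return np.maximum(x, 0.0)

def sigmoid_integral(x, alpha):
    """Chen's smoothing function: the integral of the sigmoid,
    p(x, alpha) = x + log(1 + exp(-alpha*x)) / alpha."""
    # np.logaddexp(0, t) computes log(1 + exp(t)) without overflow
    return x + np.logaddexp(0.0, -alpha * x) / alpha

alpha = 10.0                       # smoothing parameter (arbitrary choice)
x = np.linspace(-2.0, 2.0, 100001) # dense grid containing x = 0
err = sigmoid_integral(x, alpha) - plus(x)

# The smoothing function overestimates x_+ everywhere, and the
# maximum error log(2)/alpha occurs at x = 0.
assert (err >= 0).all()
assert np.isclose(err.max(), np.log(2.0) / alpha, rtol=1e-6)
print(err.max())  # ≈ log(2)/alpha = 0.0693...
```

Tightening α shrinks this worst-case gap at the rate 1/α, which is the kind of quantitative statement an error theory for a family of smoothing functions has to establish.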
