    Wu Gaowei, Tao Qing, Wang Jue. Support Vector Machines Based on Posteriori Probability[J]. Journal of Computer Research and Development, 2005, 42(2): 196-202.

    Support Vector Machines Based on Posteriori Probability

    • To solve uncertain classification problems, an SVM (support vector machine) is trained to behave like a Bayesian optimal classifier on the training data. The idea is to weight the training samples unequally, each by its posterior probability. A complete framework of the posterior probability support vector machine (PPSVM) is presented, and the standard SVM is reformulated as a PPSVM. Linear separability, the margin, the optimal hyperplane, and soft-margin algorithms are discussed. A new optimization problem is obtained, and a new definition of support vector is given. PPSVM is motivated by statistical learning theory and is an extension of the regular SVM. An empirical method is also proposed for determining the posterior probability. Two artificial examples show that the PPSVM formulation is reasonable when the class-conditional probability is known, and experiments on real data demonstrate that weighting the data by the proposed empirical methods can produce better results than the regular SVM.
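The core idea can be sketched as follows: estimate each sample's posterior probability empirically, then use it to scale that sample's hinge-loss penalty in a soft-margin SVM. This is a minimal illustrative sketch, not the paper's exact formulation: the k-nearest-neighbor posterior estimate, the subgradient-descent solver, and all hyperparameters below are assumptions chosen for brevity.

```python
import numpy as np

def knn_posterior(X, y, k=7):
    """Empirical estimate of P(y_i | x_i): the fraction of the k nearest
    neighbors of x_i (including itself) that share its label. This k-NN
    estimator is an illustrative assumption, not the paper's method."""
    n = len(y)
    post = np.empty(n)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        nn = np.argsort(d)[:k]
        post[i] = np.mean(y[nn] == y[i])
    return post

def weighted_linear_svm(X, y, w_post, C=1.0, lr=0.01, epochs=200):
    """Subgradient descent on 0.5*||w||^2 + C * sum_i p_i * hinge(y_i * f(x_i)),
    with labels y in {-1, +1}. Each sample's penalty is scaled by its
    posterior weight p_i, which is the PPSVM weighting idea."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # samples violating the margin
        grad_w = w - C * (w_post[viol] * y[viol]) @ X[viol]
        grad_b = -C * np.sum(w_post[viol] * y[viol])
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy two-class data with well-separated Gaussian classes
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 2)), rng.normal(2, 1, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

p = knn_posterior(X, y)
w, b = weighted_linear_svm(X, y, p)
acc = np.mean(np.sign(X @ w + b) == y)
print(f"training accuracy: {acc:.2f}")
```

Samples whose estimated posterior is low (e.g. points deep in the overlap region, or likely label noise) get a smaller penalty weight and therefore influence the separating hyperplane less, which is how the weighted formulation approximates a Bayesian optimal decision rule on uncertain data.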
