Feature selection is a fundamental problem in machine learning. A well-designed feature selection scheme not only reduces system complexity and processing time but in many cases also enhances system performance. It becomes even more critical to the success of a learning algorithm when the problem involves a large number of irrelevant features. The potential support vector machine (P-SVM), a recent wrapper-based feature selection approach, has been applied successfully in several fields. However, analysis based on the Fisher linear discriminant criterion shows that P-SVM works properly only when the mean of each class is zero; this restriction makes it difficult to obtain optimal decision boundaries for sample data and therefore lowers the classification capability of P-SVM. Based on this finding, this paper adopts a new criterion function with a general within-class scatter term and proposes a generalized potential support feature selection method (GPSFM). GPSFM largely retains the advantages of P-SVM while offering low-redundancy feature selection, high selection speed, and good adaptability. Compared with the traditional P-SVM, the new method therefore has much stronger feature selection and classification abilities. Our experimental results confirm these advantages of the proposed GPSFM.
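For reference, the Fisher linear discriminant criterion mentioned above can be written in its standard form (the notation below is ours, not the paper's, and the link to P-SVM's objective follows the abstract's claim rather than a derivation given here): the criterion maximizes between-class scatter relative to within-class scatter,

```latex
% Fisher linear discriminant criterion (standard form; notation ours)
J(\mathbf{w}) = \frac{\mathbf{w}^{\top} S_B\, \mathbf{w}}{\mathbf{w}^{\top} S_W\, \mathbf{w}},
\qquad
S_W = \sum_{c=1}^{C} \sum_{\mathbf{x}_i \in \mathcal{X}_c}
      (\mathbf{x}_i - \boldsymbol{\mu}_c)(\mathbf{x}_i - \boldsymbol{\mu}_c)^{\top},
\qquad
S_B = \sum_{c=1}^{C} n_c\,
      (\boldsymbol{\mu}_c - \boldsymbol{\mu})(\boldsymbol{\mu}_c - \boldsymbol{\mu})^{\top},
```

where $\mathcal{X}_c$ is the set of $n_c$ samples in class $c$, $\boldsymbol{\mu}_c$ is the class mean, and $\boldsymbol{\mu}$ is the overall mean. When every class mean $\boldsymbol{\mu}_c$ is zero, $S_W$ collapses to the raw second-moment matrix $\sum_i \mathbf{x}_i \mathbf{x}_i^{\top}$; the abstract's observation is that P-SVM implicitly relies on this special case, whereas adopting the general within-class scatter $S_W$ above removes the restriction.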