A maximum vector-angular margin classification mechanism, called MAMC, is proposed in this paper. Its core idea is to find a vector c in the feature space that is as close as possible to the center of the training samples, so as to reduce the VC dimension, such that all the data points can be classified in terms of the maximum vector-angular margin ρ between the vector c and the training points. The proposed MAMC can not only be kernelized to enhance its flexibility, but can also be realized simply by solving a corresponding convex optimization problem. Furthermore, its parameter ν is, respectively, a lower bound on the fraction of support vectors and an upper bound on the fraction of misclassified patterns, analogous to the ν-SVC algorithm. Meanwhile, the corresponding hard-margin version can be equivalently formulated as a special kernelized minimum enclosing ball (MEB), called the center-constrained MEB (CC-MEB); thus MAMC can be extended to a version suitable for training on large datasets by using the generalized core vector machine (CVM). Experimental results on artificial and real datasets demonstrate that the proposed method is competitive.
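Although the abstract does not give the full formulation, the hard-margin idea can be illustrated with a minimal NumPy sketch. Everything below is an illustrative assumption rather than the paper's actual solver: it maximizes the smallest vector-angular margin min_i y_i⟨c, x_i⟩ over unit vectors c by projected subgradient ascent, starting from the (signed) sample center.

```python
import numpy as np

def mamc_hard_margin_sketch(X, y, iters=2000, lr=0.1):
    """Illustrative sketch (not the paper's algorithm): maximize the
    minimum vector-angular margin rho = min_i y_i * <c, x_i> over unit
    vectors c via projected subgradient ascent."""
    n, d = X.shape
    c = X.T @ y / n                  # start near the signed sample center
    c /= np.linalg.norm(c)
    for _ in range(iters):
        margins = y * (X @ c)        # per-sample vector-angular margins
        i = np.argmin(margins)       # worst (smallest-margin) point
        c += lr * y[i] * X[i]        # subgradient step on the min margin
        c /= np.linalg.norm(c)       # project back onto the unit sphere
    return c, float(np.min(y * (X @ c)))

# toy linearly separable data
rng = np.random.default_rng(0)
Xp = rng.normal(loc=+3.0, size=(20, 2))
Xn = rng.normal(loc=-3.0, size=(20, 2))
X = np.vstack([Xp, Xn])
y = np.array([1.0] * 20 + [-1.0] * 20)
c, rho = mamc_hard_margin_sketch(X, y)
print(f"learned direction c = {c}, margin rho = {rho:.3f}")
```

On separable data the returned margin ρ is positive; the paper's convex formulation (and its CC-MEB/CVM extension) would replace this crude iterative scheme.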