The choice of kernel function is a fundamental and challenging problem in research on kernel methods. The Gaussian kernel is popular and widely used in various kernel methods, and many general-purpose kernel selection methods have been derived for it. However, these methods can suffer from drawbacks such as heavy computational cost, difficult implementation, and the assumption that the classes are generated from underlying multivariate normal distributions. To remedy these problems, the generalized kernel polarization criterion has been proposed to tune the parameter of the Gaussian kernel for classification tasks. By taking the within-class local structure into account and centering the kernel matrix, the criterion does better at maximizing class separability in the feature space, and the resulting optimized kernel parameter leads to a substantial improvement in performance. Furthermore, the criterion function can be proved to have a determined approximate global minimum point. This property, coupled with its independence of the actual learning machine, makes the optimal parameter easy to find by many algorithms. In addition, the local kernel polarization criterion function, a special case of the generalized kernel polarization criterion function, can also be proved to have a determined approximate global minimum point. Extensions of the generalized kernel polarization criterion and the local kernel polarization criterion to the multiclass domain are also proposed. Experimental results show the effectiveness and efficiency of the proposed criteria.
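As background, the classical (non-generalized) kernel polarization criterion for binary labels $y_i \in \{-1, +1\}$ scores a kernel by $y^\top K y$, and parameter tuning amounts to maximizing this score over the Gaussian bandwidth. The sketch below illustrates that basic idea together with kernel-matrix centering as mentioned above; it is not the paper's generalized criterion (which additionally encodes within-class local structure), and the function names, the grid of bandwidths, and the toy data are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel_matrix(X, sigma):
    """Gaussian (RBF) kernel matrix K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2))."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-np.maximum(d2, 0.0) / (2.0 * sigma ** 2))

def center_kernel(K):
    """Center the kernel matrix in feature space: Kc = H K H with H = I - (1/n) 11^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_polarization(X, y, sigma):
    """Basic polarization score y^T Kc y on the centered Gaussian kernel.

    This is the classical criterion, used here only for illustration;
    the generalized criterion additionally weights within-class neighbors.
    """
    Kc = center_kernel(gaussian_kernel_matrix(X, sigma))
    return float(y @ Kc @ y)

# Toy binary problem: two well-separated Gaussian blobs (illustrative data).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2.0, 1.0, (20, 2)), rng.normal(2.0, 1.0, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)

# Simple grid search for the bandwidth maximizing the polarization score.
sigmas = np.logspace(-2, 2, 50)
best_sigma = max(sigmas, key=lambda s: kernel_polarization(X, y, s))
```

Because the score is a scalar function of a single bandwidth parameter and does not require training any classifier, even a coarse grid or a one-dimensional line search suffices, which is why a criterion with a determined approximate global optimum is convenient in practice.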