Abstract:
Statistical learning theory is commonly regarded as a sound framework for handling a variety of learning problems from small data samples. However, the theory is built on real random samples (real-valued random variables) and is therefore not equipped to handle learning problems involving complex random samples (complex-valued random variables), which arise in real-world scenarios. The structural risk minimization (SRM) principle is a core component of statistical learning theory and serves as the theoretical foundation of the support vector machine. Motivated by this, the structural risk minimization principle of statistical learning theory based on complex random samples is explored. First, the definitions of the annealed entropy, growth function, and VC dimension of a set of complex measurable functions are proposed, and some of their important properties are proved. Second, bounds on the rate of uniform convergence of the learning process based on complex random samples are constructed. Finally, the structural risk minimization principle for complex random samples is proposed; its consistency is proven, and asymptotic bounds on its rate of convergence are derived. These investigations help lay the theoretical foundations for establishing support vector machines based on complex random samples.
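To make the SRM trade-off concrete, the following is a minimal numerical sketch. It uses the classical real-valued VC bound (guaranteed risk = empirical risk plus a confidence term depending on the VC dimension h and sample size n); the paper derives analogous bounds for complex random samples. All numbers below (training errors, VC dimensions, confidence level) are hypothetical illustrations, not results from the paper.

```python
import numpy as np

def vc_bound(emp_risk, h, n, eta=0.05):
    """Guaranteed-risk bound: empirical risk plus the classical VC
    confidence term, holding with probability at least 1 - eta.
    (Real-valued form; the paper's contribution is the complex-sample analogue.)"""
    conf = np.sqrt((h * (np.log(2.0 * n / h) + 1.0) - np.log(eta / 4.0)) / n)
    return emp_risk + conf

# SRM over a nested structure S_1 ⊂ S_2 ⊂ ...: richer elements fit the
# training data better (lower empirical risk) but pay a larger confidence
# term because their VC dimension is higher.
n = 1000
emp_risks = [0.35, 0.10, 0.06, 0.05]   # hypothetical training errors per element
vc_dims   = [2,    10,   50,   200]    # hypothetical VC dimensions per element

bounds = [vc_bound(r, h, n) for r, h in zip(emp_risks, vc_dims)]
best = int(np.argmin(bounds))          # SRM selects the element with the
print(best, bounds[best])              # smallest guaranteed-risk bound
```

Neither the smallest nor the richest element wins: SRM picks the intermediate element whose combined bound is lowest, which is exactly the capacity-control behavior the principle formalizes.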