
The bounds on the rate of uniform convergence of learning process based on complex random samples. (English) Zbl 1170.68574

Summary: Statistical learning theory is commonly regarded as a sound framework for handling a variety of learning problems in the presence of small data samples, and it has become a rapidly progressing research area in machine learning. The theory, however, is built on real random samples and is therefore not equipped to deal with statistical learning problems involving complex random samples, which arise in real-world scenarios. This paper develops statistical learning theory based on complex random samples and, in this sense, generalizes the existing fundamentals.
Firstly, the definitions of the complex random variable, the primary norm and the linear functional are introduced. Secondly, the complex empirical risk functional, the complex expected risk functional, and the complex empirical risk minimization principle are defined. Thirdly, the concepts of the annealed entropy, the growth function and the VC dimension of sets of complex measurable functions are proposed, and some of their important properties are proved. Finally, on the basis of these definitions and properties, the bounds on the rate of uniform convergence of the learning process based on complex random samples are constructed.
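For orientation, the real-sample objects that the paper generalizes can be recalled in a commonly cited classical form due to Vapnik; the notation below ($Q$, $\alpha$, $\Lambda$, $\ell$, $h$, $\eta$) is generic and not taken from the paper, and the complex-valued analogues are defined in the paper itself. For a loss $Q(z,\alpha)$, $\alpha\in\Lambda$, and an i.i.d. sample $z_1,\dots,z_\ell$,
\[
R(\alpha)=\int Q(z,\alpha)\,dF(z), \qquad
R_{\mathrm{emp}}(\alpha)=\frac{1}{\ell}\sum_{i=1}^{\ell} Q(z_i,\alpha),
\]
and, for a set of indicator functions with VC dimension $h$, with probability at least $1-\eta$ simultaneously for all $\alpha\in\Lambda$,
\[
R(\alpha)\le R_{\mathrm{emp}}(\alpha)+\sqrt{\frac{h\left(\ln\frac{2\ell}{h}+1\right)-\ln\frac{\eta}{4}}{\ell}}.
\]
The paper constructs bounds of this type when the samples, risks and function classes are complex-valued.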

MSC:

68T05 Learning and adaptive systems in artificial intelligence
68T10 Pattern recognition, speech recognition