Fast rates for support vector machines. (English) Zbl 1137.68564

Auer, Peter (ed.) et al., Learning theory. 18th annual conference on learning theory, COLT 2005, Bertinoro, Italy, June 27–30, 2005. Proceedings. Berlin: Springer (ISBN 3-540-26556-2/pbk). Lecture Notes in Computer Science 3559. Lecture Notes in Artificial Intelligence, 279-294 (2005).
Summary: We establish learning rates to the Bayes risk for support vector machines (SVMs) using a regularization sequence \(\lambda_{n} = n^{-\alpha}\), where \(\alpha \in (0,1)\) is arbitrary. Under a noise condition recently proposed by Tsybakov, these rates can become faster than \(n^{-1/2}\). In order to deal with the approximation error, we present a general concept called the approximation error function, which describes how well the infinite-sample versions of the considered SVMs approximate the data-generating distribution. In addition, we discuss in some detail the relation between the “classical” approximation error and the approximation error function. Finally, for distributions satisfying a geometric noise assumption, we establish learning rates when the RKHS used is a Sobolev space.
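As a minimal illustration of the regularization sequence \(\lambda_{n} = n^{-\alpha}\), the following Python sketch trains an SVM whose regularization parameter follows this schedule. The correspondence \(C = 1/(\lambda_{n} n)\) between the penalized formulation and scikit-learn's C-parameterized solver, as well as the helper svm_with_rate_schedule, are illustrative assumptions and not part of the reviewed paper.

# A minimal sketch (not from the reviewed paper): training an SVM whose
# regularization parameter follows the sequence lambda_n = n^{-alpha}.
# The mapping C = 1 / (lambda_n * n) between the penalized formulation
# lambda * ||f||_H^2 + (1/n) * sum of hinge losses and scikit-learn's
# C-parameterized SVC is an assumed, illustrative correspondence.
from sklearn.datasets import make_classification
from sklearn.svm import SVC

def svm_with_rate_schedule(X, y, alpha=0.5):
    n = len(y)
    lam = n ** (-alpha)            # regularization sequence lambda_n = n^{-alpha}, alpha in (0, 1)
    C = 1.0 / (lam * n)            # assumed translation to the C parameter
    clf = SVC(C=C, kernel="rbf")   # Gaussian RBF kernel as the RKHS choice
    return clf.fit(X, y)

# Toy usage on synthetic data
X, y = make_classification(n_samples=500, n_features=5, random_state=0)
model = svm_with_rate_schedule(X, y, alpha=0.5)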
For the entire collection see [Zbl 1076.68003].

MSC:

68T05 Learning and adaptive systems in artificial intelligence
Full Text: DOI