This book focuses on Least Squares Support Vector Machines (LS-SVMs), which are reformulations of standard SVMs. LS-SVMs are closely related to regularization networks and Gaussian processes, but additionally emphasize and exploit primal-dual interpretations from optimization theory. The authors explain the natural links between LS-SVM classifiers and kernel Fisher discriminant analysis. Bayesian inference of LS-SVM models is discussed, together with methods for imposing sparseness and employing robust statistics.

The framework is further extended towards unsupervised learning by considering PCA and its kernel version as a one-class modelling problem. This leads to new primal-dual support vector machine formulations for kernel PCA and kernel CCA. Furthermore, LS-SVM formulations are given for recurrent networks and control.

In general, support vector machines may pose heavy computational challenges for large data sets. For this purpose, a fixed-size LS-SVM method is proposed, in which the estimation is done in the primal space in relation to a Nyström sampling with active selection of support vectors. The methods are illustrated with several examples.
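The reformulation mentioned above replaces the quadratic program of a standard SVM classifier by a set of linear equations. Below is a minimal sketch, not taken from the book, of that dual linear system for an LS-SVM classifier; the RBF kernel and the values of `gamma` and `sigma` are illustrative assumptions.

```python
# Minimal LS-SVM classifier sketch: the dual problem is one linear system
# [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1], with
# Omega_ij = y_i * y_j * K(x_i, x_j). Kernel and hyperparameters are assumptions.
import numpy as np

def rbf_kernel(X1, X2, sigma=1.0):
    """Gaussian RBF kernel matrix between the rows of X1 and X2."""
    d2 = np.sum(X1**2, axis=1)[:, None] + np.sum(X2**2, axis=1)[None, :] - 2 * X1 @ X2.T
    return np.exp(-d2 / (2 * sigma**2))

def lssvm_train(X, y, gamma=10.0, sigma=1.0):
    """Solve the LS-SVM dual linear system; returns bias b and support values alpha."""
    n = len(y)
    Omega = (y[:, None] * y[None, :]) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]

def lssvm_predict(X, y, alpha, b, Xnew, sigma=1.0):
    """Decision function: sign(sum_i alpha_i y_i K(x, x_i) + b)."""
    return np.sign(rbf_kernel(Xnew, X, sigma) @ (alpha * y) + b)

# Toy usage: two Gaussian blobs with labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.concatenate([-np.ones(20), np.ones(20)])
b, alpha = lssvm_train(X, y)
print(lssvm_predict(X, y, alpha, b, X))
```

Because the training step is a single symmetric linear system rather than a quadratic program, it can be solved with standard linear algebra routines, which is the starting point for the large-scale methods discussed in the book.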
- ISBN13 9789812381514
- Publish Date 14 November 2002 (first published 1 January 2002)
- Publish Status Active
- Publish Country SG (Singapore)
- Imprint World Scientific Publishing Co Pte Ltd
- Format Hardcover
- Pages 308
- Language English
- URL https://worldscientific.com/worldscibooks/10.1142/5089