LS-SVMlab Toolbox User’s Guide
version 1.5
K. Pelckmans, J.A.K. Suykens, T. Van Gestel, J. De Brabanter,
L. Lukas, B. Hamers, B. De Moor, J. Vandewalle
Katholieke Universiteit Leuven
Department of Electrical Engineering, ESAT-SCD-SISTA
Kasteelpark Arenberg 10, B-3001 Leuven-Heverlee, Belgium
{ kristiaan.pelckmans, johan.suykens }@esat.kuleuven.ac.be
http://www.esat.kuleuven.ac.be/sista/lssvmlab/
ESAT-SCD-SISTA Technical Report 02-145
February 2003
Acknowledgements
Research supported by Research Council K.U.Leuven: GOA-Mefisto 666, IDO (IOTA oncology, genetic networks), several PhD/postdoc & fellow grants; Flemish Government: FWO: PhD/postdoc grants, G.0407.02 (support vector machines), projects G.0115.01 (microarrays/oncology), G.0240.99 (multilinear algebra), G.0080.01 (collective intelligence), G.0413.03 (inference in bioi), G.0388.03 (microarrays for clinical use), G.0229.03 (ontologies in bioi), G.0197.02 (power islands), G.0141.03 (identification and cryptography), G.0491.03 (control for intensive care glycemia), G.0120.03 (QIT), research communities (ICCoS, ANMMM); AWI: Bil. Int. Collaboration Hungary, Poland, South Africa; IWT: PhD grants, STWW-Genprom (gene promotor prediction), GBOU-McKnow (knowledge management algorithms), GBOU-SQUAD (quorum sensing), GBOU-ANA (biosensors), Soft4s (softsensors); Belgian Federal Government: DWTC (IUAP IV-02 (1996-2001) and IUAP V-22 (2002-2006)), PODO-II (CP/40: TMS and sustainability); EU: CAGE, ERNSI, Eureka 2063-IMPACT, Eureka 2419-FliTE; Contract Research/agreements: Data4s, Electrabel, Elia, LMS, IPCOS, VIB. JS is a professor at K.U.Leuven Belgium and a postdoctoral researcher with FWO Flanders. TVG is a postdoctoral researcher with FWO Flanders. BDM and JWDW are full professors at K.U.Leuven Belgium.
Contents
1 Introduction 4
2 A bird's eye view on LS-SVMlab 5
2.1 Classification and Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 5
2.1.1 Classification Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.1.2 Tuning, Sparseness, Robustness . . . . . . . . . . . . . . . . . . . . . . . . . 6
2.1.3 Bayesian Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.2 NARX Models and Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.3 Unsupervised Learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.4 Solving Large Scale Problems with Fixed Size LS-SVM . . . . . . . . . . . . . . . . 9
3 LS-SVMlab toolbox examples 10
3.1 Classification . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.1.1 Hello world... . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 10
3.1.2 The Ripley data set . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 12
3.1.3 Bayesian Inference for Classification . . . . . . . . . . . . . . . . . . . . . . 14
3.1.4 Multi-class coding . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16
3.2 Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.2.1 A Simple Sinc Example . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17
3.2.2 Bayesian Inference for Regression . . . . . . . . . . . . . . . . . . . . . . . . 19
3.2.3 Using the object oriented model interface . . . . . . . . . . . . . . . . . . . 20
3.2.4 Robust Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 22
3.2.5 Multiple Output Regression . . . . . . . . . . . . . . . . . . . . . . . . . . . 23
3.2.6 A Time-Series Example: Santa Fe Laser Data Prediction . . . . . . . . . . 24
3.2.7 Fixed size LS-SVM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 25
3.3 Unsupervised Learning using kernel based Principal Component Analysis . . . . . 28
A MATLAB functions 29
A.1 General Notation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 29
A.2 Index of Function Calls . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
A.2.1 Training and Simulation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
A.2.2 Object Oriented Interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
A.2.3 Training and Simulating Functions . . . . . . . . . . . . . . . . . . . . . . . 32
A.2.4 Kernel Functions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
A.2.5 Tuning, Sparseness and Robustness . . . . . . . . . . . . . . . . . . . . . . . 34
A.2.6 Classification Extensions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
A.2.7 Bayesian Framework . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 36
A.2.8 NARX models and Prediction . . . . . . . . . . . . . . . . . . . . . . . . . . 37
A.2.9 Unsupervised learning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 38
A.2.10 Fixed Size LS-SVM . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
A.2.11 Demos . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 40
A.3 Alphabetical List of Function Calls . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
A.3.1 AFE . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
A.3.2 bay errorbar . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 42
A.3.3 bay initlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
A.3.4 bay lssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 45
A.3.5 bay lssvmARD . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
A.3.6 bay modoutClass . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 49
A.3.7 bay optimize . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
A.3.8 bay rr . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 53
A.3.9 code, codelssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 55
A.3.10 crossvalidate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 58
A.3.11 deltablssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 61
A.3.12 denoise kpca . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 62
A.3.13 eign . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
A.3.14 initlssvm, changelssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . 65
A.3.15 kentropy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 68
A.3.16 kernel matrix . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
A.3.17 kpca . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 70
A.3.18 latentlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
A.3.19 leaveoneout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 73
A.3.20 leaveoneout lssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75
A.3.21 lin kernel, MLP kernel, poly kernel, RBF kernel . . . . . . . . . . . . 77
A.3.22 linf, mae, medae, misclass, mse, trimmedmse . . . . . . . . . . . . . 78
A.3.23 plotlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 80
A.3.24 predict . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
A.3.25 prelssvm, postlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 84
A.3.26 rcrossvalidate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 85
A.3.27 ridgeregress . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 87
A.3.28 robustlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
A.3.29 roc . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 89
A.3.30 simlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 91
A.3.31 sparselssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
A.3.32 trainlssvm . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 94
A.3.33 tunelssvm, linesearch & gridsearch . . . . . . . . . . . . . . . . . . . . 96
A.3.34 validate . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 100
A.3.35 windowize & windowizeNARX . . . . . . . . . . . . . . . . . . . . . . . . . . 102
Chapter 1
Introduction
Support Vector Machines (SVMs) constitute a powerful methodology for solving problems in nonlinear
classification, function estimation and density estimation, and have also led to many other recent
developments in kernel based learning methods in general [3, 16, 17, 34, 33]. SVMs were introduced
within the context of statistical learning theory and structural risk minimization. In these
methods one solves convex optimization problems, typically quadratic programs. Least Squares
Support Vector Machines (LS-SVMs) are reformulations of standard SVMs [21, 28] which lead
to solving linear KKT systems. LS-SVMs are closely related to regularization networks [5] and
Gaussian processes [37], but additionally emphasize and exploit primal-dual interpretations. Links
with kernel versions of classical pattern recognition algorithms such as kernel Fisher discriminant
analysis, as well as extensions to unsupervised learning, recurrent networks and control [22], are
available. Robustness, sparseness and weightings [23] can be imposed on LS-SVMs where needed,
and a Bayesian framework with three levels of inference has been developed [29, 32]. Primal-dual
formulations in the LS-SVM style are given for kernel PCA [24], kernel CCA and kernel PLS [25]. For
very large scale problems and on-line learning, a method of Fixed Size LS-SVM is proposed, which
is related to Nyström sampling [6, 35] with active selection of support vectors and estimation in
the primal space.
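To make the linear KKT system concrete: for LS-SVM regression with kernel matrix K, regularization constant gamma and targets y, training amounts to solving one linear system in the bias b and the support values alpha. The following NumPy sketch (not part of the toolbox; the function names, the RBF parametrization and the hyperparameter values are illustrative choices) shows this for noisy sinc data:

```python
import numpy as np

def rbf_kernel(X1, X2, sig2):
    # Pairwise RBF kernel K(x, z) = exp(-||x - z||^2 / sig2)
    d2 = (np.sum(X1**2, axis=1)[:, None]
          + np.sum(X2**2, axis=1)[None, :]
          - 2.0 * X1 @ X2.T)
    return np.exp(-d2 / sig2)

def lssvm_train(X, y, gam, sig2):
    # Solve the LS-SVM regression KKT system:
    # [ 0    1^T         ] [ b     ]   [ 0 ]
    # [ 1    K + I/gam   ] [ alpha ] = [ y ]
    n = X.shape[0]
    K = rbf_kernel(X, X, sig2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gam
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]          # bias b, support values alpha

def lssvm_predict(X, alpha, b, Xtest, sig2):
    # yhat(x) = sum_i alpha_i K(x, x_i) + b
    return rbf_kernel(Xtest, X, sig2) @ alpha + b

# Toy example: noisy sinc data, as in the regression demos later on
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 50)[:, None]
y = np.sinc(X).ravel() + 0.05 * rng.standard_normal(50)
b, alpha = lssvm_train(X, y, gam=100.0, sig2=0.2)
yhat = lssvm_predict(X, alpha, b, X, sig2=0.2)
```

Note that, unlike a standard SVM, no quadratic program is involved: one dense linear solve yields all n support values and the bias at once, which is what makes the reformulation attractive computationally.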
The present LS-SVMlab toolbox User's Guide contains Matlab/C implementations for a number
of LS-SVM algorithms related to classification, regression, time-series prediction and unsupervised
learning. References to commands in the toolbox are written in typewriter font.
A main reference and overview on least squares support vector machines is
J.A.K. Suykens, T. Van Gestel, J. De Brabanter, B. De Moor, J. Vandewalle,
Least Squares Support Vector Machines,
World Scientific, Singapore, 2002 (ISBN 981-238-151-1).
The LS-SVMlab homepage is
http://www.esat.kuleuven.ac.be/sista/lssvmlab/
The LS-SVMlab toolbox is made available under the GNU general license policy:
Copyright (C) 2002 KULeuven-ESAT-SCD
This program is free software; you can redistribute it and/or modify it under the terms
of the GNU General Public License as published by the Free Software Foundation;
either version 2 of the License, or (at your option) any later version.
This program is distributed in the hope that it will be useful, but WITHOUT ANY
WARRANTY; without even the implied warranty of MERCHANTABILITY or FIT-
NESS FOR A PARTICULAR PURPOSE. See the website of LS-SVMlab or the GNU
General Public License for a copy of the GNU General Public License specifications.