Support Vector Method for Function Approximation, Regression Estimation, and Signal Processing

Vladimir Vapnik
AT&T Research
101 Crawfords Corner
Holmdel, NJ 07733
vlad@research.att.com

Steven E. Golowich
Bell Laboratories
700 Mountain Ave.
Murray Hill, NJ 07974
golowich@bell-labs.com

Alex Smola
GMD First
Rudower Chaussee 5
12489 Berlin
asm@big.att.com, smola@prosun.first.gmd.de

Abstract

The Support Vector (SV) method was recently proposed for estimating regressions, constructing multidimensional splines, and solving linear operator equations [Vapnik, 1995]. In this presentation we report results of applying the SV method to these problems.
1 Introduction

The Support Vector method is a universal tool for solving multidimensional function estimation problems. Initially it was designed to solve pattern recognition problems, where in order to find a decision rule with good generalization ability one selects some (small) subset of the training data, called the Support Vectors (SVs). Optimal separation of the SVs is equivalent to optimal separation of the entire data.

This led to a new method of representing decision functions, where the decision functions are a linear expansion on a basis whose elements are nonlinear functions parameterized by the SVs (we need one SV for each element of the basis). This type of function representation is especially useful for high dimensional input spaces: the number of free parameters in this representation is equal to the number of SVs but does not depend on the dimensionality of the space.
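The representation described above can be sketched in code. This is a minimal illustration, not the paper's algorithm: the kernel choice and the coefficient values are hypothetical placeholders.

```python
import math

def rbf_kernel(u, v, gamma=1.0):
    """A nonlinear basis element parameterized by a support vector u
    (here a Gaussian; any kernel inducing a Hilbert-space inner product works)."""
    return math.exp(-gamma * sum((ui - vi) ** 2 for ui, vi in zip(u, v)))

def sv_decision(x, support_vectors, coefficients, b, kernel=rbf_kernel):
    """Decision rule as a linear expansion on an SV-parameterized basis.
    Note the number of free parameters equals len(support_vectors),
    regardless of the dimensionality of x."""
    s = sum(c * kernel(sv, x) for sv, c in zip(support_vectors, coefficients)) + b
    return 1 if s >= 0 else -1
```

With two SVs of opposite sign, points near the positively weighted SV are classified as +1 and points near the negatively weighted one as -1.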
Later the SV method was extended to real-valued functions. This allows us to expand high-dimensional functions using a small basis constructed from SVs. This novel type of function representation opens new opportunities for solving various problems of function approximation and estimation.
In this paper we demonstrate that using the SV technique one can solve problems that with classical techniques would require estimating a large number of free parameters. In particular, we construct one- and two-dimensional splines with an arbitrary number of grid points. Using linear splines we approximate non-linear functions. We show that by reducing the required accuracy of approximation, one decreases the number of SVs, which leads to data compression. We also show that the SV technique is a useful tool for regression estimation. Lastly, we demonstrate that using the SV function representation for solving inverse ill-posed problems provides an additional opportunity for regularization.
2 SV method for estimation of real functions

Let $x \in \mathbb{R}^n$ and $y \in \mathbb{R}^1$. Consider the following set of real functions: a vector $x$ is mapped into some a priori chosen Hilbert space, where we define functions that are linear in their parameters:

$$y = f(x, w) = \sum_{i=1}^{\infty} w_i \phi_i(x), \qquad w = (w_1, \ldots, w_N, \ldots) \in \Omega. \tag{1}$$
In [Vapnik, 1995] the following method for estimating functions in the set (1) based on training data $(x_1, y_1), \ldots, (x_\ell, y_\ell)$ was suggested: find the function that minimizes the following functional:

$$R(w) = \frac{1}{\ell} \sum_{i=1}^{\ell} |y_i - f(x_i, w)|_{\varepsilon} + \gamma\,(w, w), \tag{2}$$
where

$$|y - f(x, w)|_{\varepsilon} =
\begin{cases}
0 & \text{if } |y - f(x, w)| < \varepsilon, \\
|y - f(x, w)| - \varepsilon & \text{otherwise},
\end{cases} \tag{3}$$

$(w, w)$ is the inner product of two vectors, and $\gamma$ is some constant.
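The $\varepsilon$-insensitive loss in (3) is easy to state directly; a minimal sketch:

```python
def eps_insensitive_loss(y, f_x, eps):
    """Vapnik's epsilon-insensitive loss |y - f(x, w)|_eps (eq. 3):
    residuals inside the eps-tube cost nothing; outside the tube the
    cost grows linearly. The flat region is what makes the SV solution
    sparse: points fit within the tube contribute no support vectors."""
    residual = abs(y - f_x)
    return 0.0 if residual < eps else residual - eps
```

For example, with eps = 0.1 a residual of 0.05 costs nothing, while a residual of 0.5 costs 0.4.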
It was shown that the function minimizing this functional has the form

$$f(x, \alpha, \alpha^*) = \sum_{i=1}^{\ell} (\alpha_i^* - \alpha_i)\,(\Phi(x_i), \Phi(x)) + b, \tag{4}$$

where $\alpha_i^*, \alpha_i \ge 0$ with $\alpha_i^* \alpha_i = 0$, and $(\Phi(x_i), \Phi(x))$ is the inner product of two elements of the Hilbert space.
To find the coefficients $\alpha_i^*$ and $\alpha_i$ one has to solve the following quadratic optimization problem: maximize the functional

$$W(\alpha^*, \alpha) = -\varepsilon \sum_{i=1}^{\ell} (\alpha_i^* + \alpha_i) + \sum_{i=1}^{\ell} y_i (\alpha_i^* - \alpha_i) - \frac{1}{2} \sum_{i,j=1}^{\ell} (\alpha_i^* - \alpha_i)(\alpha_j^* - \alpha_j)\,(\Phi(x_i), \Phi(x_j)), \tag{5}$$

subject to the constraints

$$\sum_{i=1}^{\ell} (\alpha_i^* - \alpha_i) = 0, \qquad 0 \le \alpha_i, \alpha_i^* \le C, \quad i = 1, \ldots, \ell. \tag{6}$$
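Given a solution $(\alpha, \alpha^*)$ of this quadratic program, the estimate (4) is a sparse kernel expansion. The sketch below evaluates (4) and checks the constraints (6) plus the complementarity condition $\alpha_i^* \alpha_i = 0$; it uses a linear kernel and hand-picked illustrative coefficients rather than an actual QP solver.

```python
def linear_kernel(u, v):
    """Inner product (Phi(x_i), Phi(x)) for the identity feature map."""
    return sum(ui * vi for ui, vi in zip(u, v))

def sv_regression_predict(x, xs, alpha_star, alpha, b, kernel=linear_kernel):
    """Evaluate f(x, alpha, alpha*) = sum_i (alpha*_i - alpha_i) K(x_i, x) + b
    from eq. (4). Training points with alpha*_i = alpha_i = 0 drop out of the
    sum, so only the support vectors contribute."""
    return sum((a_s - a) * kernel(xi, x)
               for xi, a_s, a in zip(xs, alpha_star, alpha)) + b

def feasible(alpha_star, alpha, C):
    """Check the constraints (6) and complementarity alpha*_i * alpha_i = 0."""
    balance = abs(sum(a_s - a for a_s, a in zip(alpha_star, alpha))) < 1e-12
    box = all(0.0 <= a <= C and 0.0 <= a_s <= C
              for a_s, a in zip(alpha_star, alpha))
    comp = all(a_s * a == 0.0 for a_s, a in zip(alpha_star, alpha))
    return balance and box and comp
```

In practice the coefficients come from maximizing (5) with a quadratic-programming routine; here they are chosen by hand only to demonstrate the feasibility check and the sparse evaluation of (4).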