Support Vector Method for Function Approximation, Regression Estimation, and Signal Processing

Vladimir Vapnik
AT&T Research
101 Crawfords Corner
Holmdel, NJ 07733
vlad@research.att.com

Steven E. Golowich
Bell Laboratories
700 Mountain Ave.
Murray Hill, NJ 07974
golowich@bell-labs.com

Alex Smola*
GMD First
Rudower Chaussee 5
12489 Berlin
asm@big.att.com

Abstract
The Support Vector (SV) method was recently proposed for estimating regressions, constructing multidimensional splines, and solving linear operator equations [Vapnik, 1995]. In this presentation we report results of applying the SV method to these problems.
1 Introduction
The Support Vector method is a universal tool for solving multidimensional function estimation problems. Initially it was designed to solve pattern recognition problems, where in order to find a decision rule with good generalization ability one selects some (small) subset of the training data, called the Support Vectors (SVs). Optimal separation of the SVs is equivalent to optimal separation of the entire data. This led to a new method of representing decision functions, where the decision functions are a linear expansion on a basis whose elements are nonlinear functions parameterized by the SVs (we need one SV for each element of the basis).
This type of function representation is especially useful for high-dimensional input spaces: the number of free parameters in this representation is equal to the number of SVs, but does not depend on the dimensionality of the space.
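This expansion can be sketched in a few lines. The kernel choice, coefficient values, and support vectors below are hypothetical placeholders, not taken from the paper; the sketch only illustrates that the decision function is a sum over SVs, with one coefficient per SV regardless of the input dimension:

```python
import math

def rbf_kernel(x, x_prime, gamma=0.5):
    """Gaussian (RBF) kernel, one common choice of nonlinear basis element."""
    return math.exp(-gamma * sum((a - b) ** 2 for a, b in zip(x, x_prime)))

def sv_decision(x, support_vectors, coefficients, bias):
    """Decision function as a linear expansion over the SVs:
    f(x) = sum_i alpha_i * K(sv_i, x) + b.
    The number of free parameters (the alphas and b) equals the number of
    SVs plus one, independent of the dimensionality of x."""
    return sum(a * rbf_kernel(sv, x)
               for a, sv in zip(coefficients, support_vectors)) + bias

# Toy illustration with made-up values: two SVs in a 2-D input space.
svs = [(0.0, 0.0), (1.0, 1.0)]
alphas = [1.0, -1.0]            # one coefficient per SV
score = sv_decision((0.1, 0.1), svs, alphas, bias=0.0)
```

The class label would then be the sign of `score`; adding input dimensions changes only the kernel evaluations, not the number of free parameters.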
Later the SV method was extended to real-valued functions. This allows us to expand high-dimensional functions using a small basis constructed from SVs. This
*smola@prosun.first.gmd.de
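The extension to real-valued functions in [Vapnik, 1995] is built on the ε-insensitive loss, which ignores deviations smaller than ε and penalizes larger ones linearly. A minimal sketch (the ε value is a hypothetical choice, not from the paper):

```python
def eps_insensitive_loss(y, f_x, eps=0.1):
    """Vapnik's epsilon-insensitive loss: deviations |y - f(x)| below eps
    cost nothing; larger deviations are penalized linearly beyond eps."""
    return max(0.0, abs(y - f_x) - eps)
```

Because small errors cost nothing, minimizing this loss tends to leave many training points with zero contribution, which is what keeps the resulting SV basis small.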