% Practicum, Task #3, 'Compositions of algorithms'.
%
% FUNCTION:
% [model] = gradient_boosting_train (X, y, num_iterations, base_algorithm, loss, ...
% param_name1, param_value1, param_name2, param_value2, ...
% param_name3, param_value3, param_name4, param_value4)
%
% DESCRIPTION:
% This function trains a composition of algorithms using the gradient boosting method.
%
% INPUT:
% X --- matrix of objects, N x K double matrix, N --- number of objects,
% K --- number of features.
% y --- vector of answers, N x 1 double vector, N --- number of objects. y
% can have only two values --- +1 and -1 in case of classification
% and all possible double values in case of regression.
% num_iterations --- the number of algorithms in the composition, scalar.
% base_algorithm --- the base algorithm, string. Can have one of two
% values: 'regression_tree' or 'epsilon_svr'.
% loss --- the loss function, string. Can have one of two values:
% 'logistic' (for classification) or 'absolute' (for regression).
% param_name1 --- learning rate, scalar.
% param_name2 --- parameter of base_algorithm. For 'regression_tree' it
% is 'min_parent' --- the minimum number of objects in a leaf of the
% regression tree. For 'epsilon_svr' it is the 'epsilon' parameter.
% param_name3 --- parameter that exists only for 'epsilon_svr': the
% 'gamma' parameter.
% param_name4 --- parameter that exists only for 'epsilon_svr': the
% 'C' parameter.
% param_value1, param_value2, param_value3, param_value4 --- values of the
% corresponding parameters, scalar.
% OUTPUT:
% model --- trained composition, structure with five fields
% - b_0 --- the base of composition, scalar
% - weights --- double array of weights, 1 x num_iterations
% - models --- cell array with trained models, 1 x num_iterations
% - algorithm --- string, 'epsilon_svr' or 'regression_tree'
% - loss --- loss parameter (from INPUT).
%
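% EXAMPLE:
% % A hypothetical call (X, y and the parameter values below are purely
% % illustrative): train a composition of 100 regression trees with
% % learning rate 0.1 and at least 5 objects per leaf. The param_name
% % arguments are descriptive strings; only the param_value arguments
% % are used by the function.
% model = gradient_boosting_train(X, y, 100, 'regression_tree', 'absolute', ...
% 'learning_rate', 0.1, 'min_parent', 5);
%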
% AUTHOR:
% Murat Apishev (great-mel@yandex.ru)
%
function [model] = gradient_boosting_train (X, y, num_iterations, base_algorithm, loss, ...
param_name1, param_value1, param_name2, param_value2, ...
param_name3, param_value3, param_name4, param_value4)
no_objects = size(X, 1);
if ~strcmp(base_algorithm, 'epsilon_svr') && ~strcmp(base_algorithm, 'regression_tree')
error('Incorrect type of algorithm!')
end
if strcmp(loss, 'logistic')
loss_function = @(a, b) log(1 + exp(-a .* b));
% numerically stable form of -b .* exp(-a .* b) ./ (1 + exp(-a .* b));
% avoids Inf/Inf = NaN when a .* b is strongly negative
grad_a_loss_function = @(a, b) -b ./ (1 + exp(a .* b));
elseif strcmp(loss, 'absolute')
loss_function = @(a, b) abs(a - b);
grad_a_loss_function = @(a, b) -sign(b - a);
else
error('Incorrect type of loss function!');
end
func = @(c) sum(loss_function(y, c));
b_0 = fminsearch(func, 0);
model.b_0 = b_0;
model.algorithm = base_algorithm;
model.models = cell([1 num_iterations]);
model.loss = loss;
% weights are initialized with +Inf: the effective length of the model
% is the number of finite weights, not the number of models!
model.weights = Inf(1, num_iterations);
z = zeros([no_objects 1]) + b_0;
delta = zeros([no_objects 1]);
for iter = 1 : num_iterations
% loss_function and grad_a_loss_function are elementwise,
% so the antigradient can be computed without a per-object loop
delta = -grad_a_loss_function(z, y);
if strcmp(base_algorithm, 'epsilon_svr')
model.models{iter} = svmtrain(delta, X, [' -s 3 -g ', num2str(param_value3), ...
' -c ', num2str(param_value4), ' -e ', num2str(param_value2)]);
value = svmpredict(y, X, model.models{iter}); % libsvm uses y here only to report accuracy
elseif strcmp(base_algorithm, 'regression_tree')
model.models{iter} = RegressionTree.fit(X, delta, 'minparent', param_value2);
value = predict(model.models{iter}, X);
end
func = @(g) sum(loss_function(z + g * value, y));
model.weights(iter) = fminsearch(func, 0);
z = z + model.weights(iter) * value * param_value1;
end
end