# SGDLibrary : Stochastic Optimization Algorithm Library in MATLAB/Octave
----------
Authors: [Hiroyuki Kasai](http://kasai.comm.waseda.ac.jp/kasai/)
Last page update: November 20, 2020
Latest library version: 1.0.20 (see Release notes for more info)
<br />
Announcement
----------
We warmly welcome your contributions. Please tell us about
- stochastic optimization solvers written in MATLAB, and
- your comments and suggestions.
<br />
Introduction
----------
The SGDLibrary is a **pure-MATLAB** library (toolbox) of **stochastic optimization algorithms**. It solves unconstrained minimization problems of the form min f(x) = sum_i f_i(x).
The SGDLibrary is also operable on [GNU Octave](https://www.gnu.org/software/octave/) (Free software compatible with many MATLAB scripts).
Note that this SGDLibrary internally contains the [GDLibrary](https://github.com/hiroyuki-kasai/GDLibrary).
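As a quick illustration of the finite-sum problem above, a plain stochastic gradient descent loop samples one component gradient per step. The following is an expository Python sketch with hypothetical least-squares data; it is not the library's MATLAB/Octave API:

```python
import numpy as np

def sgd(grad_fi, x0, n, step=0.05, epochs=50, seed=0):
    """Minimize f(x) = sum_i f_i(x) by stepping along one sampled component gradient."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(epochs):
        for i in rng.permutation(n):        # one pass over shuffled indices
            x -= step * grad_fi(x, i)
    return x

# Hypothetical toy problem: f_i(x) = 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
grad_fi = lambda x, i: (A[i] @ x - b[i]) * A[i]

x_hat = sgd(grad_fi, np.zeros(5), n=100)
print(np.linalg.norm(x_hat - x_true))       # error shrinks toward zero
```

In practice the step-size schedule matters greatly; the library's solvers expose such choices through their options structures.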
<br />
Document
----------
The documentation for the SGDLibrary is available below:
- H. Kasai, "[SGDLibrary: A MATLAB library for stochastic optimization algorithms](http://www.jmlr.org/papers/v18/17-632.html)," Journal of Machine Learning Research (JMLR), vol.18, no.215, 2018 (arXiv preprint [arXiv:1710.10951](https://arxiv.org/abs/1710.10951)).
<br />
## <a name="supp_solver"> List of the algorithms available in SGDLibrary </a>
- **SGD variants** (stochastic gradient descent)
- Vanilla SGD
- H. Robbins and S. Monro, "[A stochastic approximation method](https://www.jstor.org/stable/pdf/2236626.pdf)," The annals of mathematical statistics, vol. 22, no. 3, pp. 400-407, 1951.
- L. Bottou, "[Online learning and stochastic approximations](http://leon.bottou.org/publications/pdf/online-1998.pdf)," Edited by David Saad, Cambridge University Press, Cambridge, UK, 1998.
- SGD-CM (SGD with classical momentum)
- SGD-CM-NAG (SGD with classical momentum and Nesterov's Accelerated Gradient)
- I. Sutskever, J. Martens, G. Dahl and G. Hinton, "[On the importance of initialization and momentum in deep learning](https://dl.acm.org/citation.cfm?id=3043064)," ICML, 2013.
- AdaGrad (Adaptive gradient algorithm)
- J. Duchi, E. Hazan and Y. Singer, "[Adaptive subgradient methods for online learning and stochastic optimization](http://www.jmlr.org/papers/volume12/duchi11a/duchi11a.pdf)," Journal of Machine Learning Research, 12, pp. 2121-2159, 2011.
- AdaDelta
- M. D. Zeiler, "[AdaDelta: An adaptive learning rate method](http://arxiv.org/abs/1212.5701)," arXiv preprint arXiv:1212.5701, 2012.
- RMSProp
- T. Tieleman and G. Hinton, "Lecture 6.5 - RMSProp", COURSERA: Neural Networks for Machine Learning, Technical report, 2012.
- Adam
- D. Kingma and J. Ba, "[Adam: A method for stochastic optimization](http://arxiv.org/pdf/1412.6980.pdf)," International Conference for Learning Representation (ICLR), 2015.
- AdaMax
- D. Kingma and J. Ba, "[Adam: A method for stochastic optimization](http://arxiv.org/pdf/1412.6980.pdf)," International Conference for Learning Representation (ICLR), 2015.
- **Variance reduction variants**
- SVRG (stochastic variance reduced gradient)
- R. Johnson and T. Zhang, "[Accelerating stochastic gradient descent using predictive variance reduction](http://papers.nips.cc/paper/4937-accelerating-stochastic-gradient-descent-using-predictive-variance-reduction.pdf)," NIPS, 2013.
- SAG (stochastic average gradient)
- N. L. Roux, M. Schmidt, and F. R. Bach, "[A stochastic gradient method with an exponential convergence rate for finite training sets](https://papers.nips.cc/paper/4633-a-stochastic-gradient-method-with-an-exponential-convergence-_rate-for-finite-training-sets.pdf)," NIPS, 2012.
- SAGA
- A. Defazio, F. Bach, and S. Lacoste-Julien, "[SAGA: A fast incremental gradient method with support for non-strongly convex composite objectives](https://papers.nips.cc/paper/5258-saga-a-fast-incremental-gradient-method-with-support-for-non-strongly-convex-composite-objectives.pdf)," NIPS, 2014.
- SARAH (StochAstic Recursive grAdient algoritHm)
- L. M. Nguyen, J. Liu, K. Scheinberg, and M. Takac, "[SARAH: A novel method for machine learning problems using stochastic recursive gradient](https://arxiv.org/abs/1703.00102)," ICML, 2017.
- **Quasi-Newton variants**
- SQN (stochastic quasi-Newton)
- R. H. Byrd, S. L. Hansen, J. Nocedal, and Y. Singer, "[A stochastic quasi-Newton method for large-scale optimization](http://epubs.siam.org/doi/abs/10.1137/140954362?journalCode=sjope8)," SIAM Journal on Optimization, vol. 26, no. 2, pp. 1008-1031, 2016.
- SVRG-SQN (denoted as "Stochastic L-BFGS" or "slbfgs" in the paper below.)
- P. Moritz, R. Nishihara and M. I. Jordan, "[A linearly-convergent stochastic L-BFGS Algorithm](http://www.jmlr.org/proceedings/papers/v51/moritz16.html)," International Conference on Artificial Intelligence and Statistics (AISTATS), pp.249-258, 2016.
- SVRG-LBFGS (denoted as "SVRG+II: LBFGS" in the paper below.)
- R. Kolte, M. Erdogdu and A. Ozgur, "[Accelerating SVRG via second-order information](http://www.opt-ml.org/papers/OPT2015_paper_41.pdf)," OPT2015, 2015.
- SS-SVRG (denoted as "SVRG+I: Subsampled Hessian followed by SVT" in the paper below.)
- R. Kolte, M. Erdogdu and A. Ozgur, "[Accelerating SVRG via second-order information](http://www.opt-ml.org/papers/OPT2015_paper_41.pdf)," OPT2015, 2015.
- oBFGS-Inf (Online BFGS, Infinite memory)
- N. N. Schraudolph, J. Yu and S. Gunter, "[A stochastic quasi-Newton method for online convex optimization](http://www.jmlr.org/proceedings/papers/v2/schraudolph07a/schraudolph07a.pdf)," International Conference on Artificial Intelligence and Statistics (AISTATS), pp.436-443, Journal of Machine Learning Research, 2007.
- oLBFGS-Lim (Online BFGS, Limited memory)
- N. N. Schraudolph, J. Yu and S. Gunter, "[A stochastic quasi-Newton method for online convex optimization](http://www.jmlr.org/proceedings/papers/v2/schraudolph07a/schraudolph07a.pdf)," International Conference on Artificial Intelligence and Statistics (AISTATS), pp.436-443, Journal of Machine Learning Research, 2007.
- A. Mokhtari and A. Ribeiro, "[Global convergence of online limited memory BFGS](http://www.jmlr.org/papers/volume16/mokhtari15a/mokhtari15a.pdf)," Journal of Machine Learning Research, 16, pp. 3151-3181, 2015.
- Reg-oBFGS-Inf (Regularized oBFGS, Infinite memory) (denoted as "RES" in the paper below.)
- A. Mokhtari and A. Ribeiro, "[RES: Regularized stochastic BFGS algorithm](http://ieeexplore.ieee.org/document/6899692/)," IEEE Transactions on Signal Processing, vol. 62, no. 23, pp. 6089-6104, Dec., 2014.
- Damp-oBFGS-Inf (Regularized damped oBFGS, Infinite memory) (denoted as "SDBFGS" in the paper below.)
- X. Wang, S. Ma, D. Goldfarb and W. Liu, "[Stochastic quasi-Newton methods for nonconvex stochastic optimization](https://arxiv.org/pdf/1607.01231v3.pdf)," arXiv preprint arXiv:1607.01231, 2016.
- IQN (incremental Quasi-Newton method)
- A. Mokhtari, M. Eisen and A. Ribeiro, "[An Incremental Quasi-Newton Method with a Local Superlinear Convergence Rate](https://arxiv.org/abs/1702.00709)," ICASSP2017, 2017.
- **Inexact Hessian variants**
- SCR (Sub-sampled Cubic Regularization)
- J. M. Kohler and A. Lucchi, "[Sub-sampled Cubic Regularization for non-convex optimization](http://proceedings.mlr.press/v70/kohler17a.html)," ICML, 2017.
- Sub-sampled TR (trust region)
- A. R. Conn, N. I. Gould and P. L. Toint, "[Trust region methods](https://epubs.siam.org/doi/book/10.1137/1.9780898719857)," MOS-SIAM Series on Optimization, 2000.
- **Else**
- SVRG-BB (stochastic variance reduced gradient with Barzilai-Borwein)
- C. Tan, S. Ma, Y. Dai and Y. Qian, "[Barzilai-Borwein step size for stochastic gradient descent](https://arxiv.org/pdf/1605.04131)," NIPS, 2016.
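The variance-reduction idea behind SVRG and its relatives listed above can be sketched briefly: each inner step corrects a sampled gradient with a full gradient computed at a periodic snapshot. The following is an expository Python sketch on hypothetical least-squares data, not the library's MATLAB/Octave implementation:

```python
import numpy as np

def svrg(grad_fi, x0, n, step=0.02, outer=20, seed=0):
    """SVRG sketch: inner steps use grad_fi(x,i) - grad_fi(x_snap,i) + full_grad(x_snap),
    which is unbiased and has vanishing variance as x approaches the optimum."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(outer):
        x_snap = x.copy()
        mu = sum(grad_fi(x_snap, i) for i in range(n)) / n   # full gradient at snapshot
        for i in rng.integers(0, n, size=n):
            x -= step * (grad_fi(x, i) - grad_fi(x_snap, i) + mu)
    return x

# Hypothetical toy problem: f_i(x) = 0.5 * (a_i^T x - b_i)^2
rng = np.random.default_rng(1)
A = rng.standard_normal((100, 5))
x_true = rng.standard_normal(5)
b = A @ x_true
grad_fi = lambda x, i: (A[i] @ x - b[i]) * A[i]

x_hat = svrg(grad_fi, np.zeros(5), n=100)
print(np.linalg.norm(x_hat - x_true))
```

Unlike plain SGD with a constant step size, the corrected gradient lets variance-reduced methods converge linearly on strongly convex finite sums.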