# Bayesian Adaptive Direct Search (BADS) - v1.0.8
**News:**
- If you are interested in Bayesian model fitting, check out [Variational Bayesian Monte Carlo (VBMC)](https://github.com/lacerbi/vbmc), a simple and user-friendly toolbox for Bayesian posterior and model inference that we published at NeurIPS (2018, 2020).
- The BADS paper [[1](#reference)] has been accepted for a poster presentation at [NeurIPS 2017](https://papers.nips.cc/paper/6780-practical-bayesian-optimization-for-model-fitting-with-bayesian-adaptive-direct-search)! (20.9% acceptance rate this year, for a total of 3240 submissions)
- BADS has also been presented at the NeurIPS workshop on Bayesian optimization for science and engineering, [BayesOpt 2017](https://bayesopt.github.io/).
## What is it
BADS is a novel, fast Bayesian optimization algorithm designed to solve difficult optimization problems, in particular those related to fitting computational models (e.g., via [maximum likelihood estimation](https://en.wikipedia.org/wiki/Maximum_likelihood_estimation)).
BADS has been intensively tested for fitting behavioral, cognitive, and neural models, and is currently being used in many computational labs around the world.
In our benchmark with real model-fitting problems, BADS performed on par or better than many other common and state-of-the-art MATLAB optimizers, such as `fminsearch`, `fmincon`, and `cmaes` [[1](#reference)].
BADS is recommended when no gradient information is available, and the objective function is non-analytical or *noisy*, for example evaluated through numerical approximation or via simulation.
BADS requires no specific tuning and runs off-the-shelf like other built-in MATLAB optimizers such as `fminsearch`.
If you are interested in estimating posterior distributions (i.e., uncertainty and error bars) over parameters, and not just point estimates, you might want to check out [Variational Bayesian Monte Carlo](https://github.com/lacerbi/vbmc), a toolbox for Bayesian posterior and model inference which can be used in synergy with BADS.
## Installation
[**Download the latest version of BADS as a ZIP file**](https://github.com/lacerbi/bads/archive/master.zip).
- To install BADS, clone or unpack the zipped repository where you want it and run the script `install.m`.
- This will add the BADS base folder to the MATLAB search path.
- To see if everything works, run `bads('test')`.
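For example, from the MATLAB command window (assuming the repository folder is the current folder or already on the path):
```matlab
install        % adds the BADS base folder to the MATLAB search path
bads('test')   % runs a quick test to check that everything works
```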
## Quick start
The BADS interface is similar to that of other MATLAB optimizers. The basic usage is:
```matlab
[X,FVAL] = bads(FUN,X0,LB,UB,PLB,PUB);
```
with input parameters:
- `FUN`, a function handle to the objective function to minimize (typically, the *negative* log likelihood of a dataset and model, for a given input parameter vector);
- `X0`, the starting point of the optimization (a row vector);
- `LB` and `UB`, hard lower and upper bounds;
- `PLB` and `PUB`, *plausible* lower and upper bounds, that is a box where you would expect to find almost all solutions.
The output parameters are:
- `X`, the found optimum.
- `FVAL`, the (estimated) function value at the optimum.
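As an illustration, here is a minimal sketch (assuming BADS is installed and on the MATLAB path; the bound values are purely illustrative) that minimizes the two-dimensional Rosenbrock function:
```matlab
% Minimal sketch: minimize the 2-D Rosenbrock function within the box
% [-20,20]^2, with plausible bounds [-5,5]^2 (illustrative values).
fun = @(x) (1 - x(1)).^2 + 100*(x(2) - x(1).^2).^2;   % objective to minimize
x0  = [0 0];          % starting point (row vector)
lb  = [-20 -20];      % hard lower bounds
ub  = [ 20  20];      % hard upper bounds
plb = [ -5  -5];      % plausible lower bounds
pub = [  5   5];      % plausible upper bounds
[x, fval] = bads(fun, x0, lb, ub, plb, pub);
```
For a noisy objective, `FUN` would return a stochastic estimate of the function value, and `FVAL` is then itself an estimate of the value at `X`.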
For more usage examples, see [**bads_examples.m**](https://github.com/lacerbi/bads/blob/master/bads_examples.m). You can also type `help bads` to display the documentation.
For practical recommendations, such as how to set `LB` and `UB`, and any other question, check out the FAQ on the [BADS wiki](https://github.com/lacerbi/bads/wiki).
*Note*: BADS is a *semi-local* optimization algorithm, in that it can escape local minima better than many other methods — but it can still get stuck. The best performance for BADS is obtained by running the algorithm multiple times from distinct starting points (see [here](https://github.com/lacerbi/bads/wiki#how-do-i-choose-the-starting-point-x0)).
## How does it work
BADS follows a [mesh adaptive direct search](http://epubs.siam.org/doi/abs/10.1137/040603371) (MADS) procedure for function minimization that alternates **poll** steps and **search** steps (see **Fig 1**).
- In the **poll** stage, points are evaluated on a mesh by taking steps in one direction at a time, until an improvement is found or all directions have been tried. The step size is doubled in case of success, halved otherwise.
- In the **search** stage, a [Gaussian process](https://en.wikipedia.org/wiki/Gaussian_process) (GP) is fit to a (local) subset of the points evaluated so far. Then, we iteratively choose points to evaluate according to a *lower confidence bound* strategy that trades off between exploration of uncertain regions (high GP uncertainty) and exploitation of promising solutions (low GP mean); a schematic sketch of this acquisition rule is given after **Fig 1** below.
**Fig 1: BADS procedure** ![BADS procedure](https://github.com/lacerbi/bads/blob/master/docs/bads-cartoon.png "Fig 1: BADS procedure")
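As an illustration only (not BADS's actual internals), here is a minimal, self-contained MATLAB sketch of a lower-confidence-bound acquisition rule of this kind, using placeholder values for the GP posterior mean and standard deviation:
```matlab
% Schematic LCB acquisition (placeholder GP predictions, not BADS code).
Xs    = linspace(-5, 5, 101)';      % candidate points (illustrative)
mu    = sin(Xs);                    % placeholder: GP posterior mean at Xs
sigma = 0.1 + 0.5*abs(cos(Xs));     % placeholder: GP posterior std. dev. at Xs
kappa = 2;                          % exploration-exploitation trade-off
lcb   = mu - kappa*sigma;           % lower confidence bound at each candidate
[~, idx] = min(lcb);                % the candidate with the lowest LCB...
x_next   = Xs(idx);                 % ...is selected for evaluation next
```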
See [here](https://github.com/lacerbi/optimviz) for a visualization of several optimizers at work, including BADS.
See our paper [[1](#reference)] for more details.
## Troubleshooting
If you have trouble doing something with BADS:
- Check out the FAQ on the [BADS wiki](https://github.com/lacerbi/bads/wiki);
- Contact me at <luigi.acerbi@helsinki.fi>, putting 'BADS' in the subject of the email.
This project is under active development. If you find a bug, or anything that needs correction, please let me know.
## BADS for other programming languages
BADS is currently available only for MATLAB. A Python version is being planned.
If you are interested in porting BADS to Python or another language (R, [Julia](https://julialang.org/)), please get in touch at <luigi.acerbi@helsinki.fi> (putting 'BADS' in the subject of the email); I'd be willing to help.
However, before contacting me for this reason, please have a good look at the codebase here on GitHub, and at the paper [[1](#reference)]. BADS is a fairly complex piece of software, so be aware that porting it will require considerable effort and programming skills.
## Reference
1. Acerbi, L. & Ma, W. J. (2017). Practical Bayesian Optimization for Model Fitting with Bayesian Adaptive Direct Search. In *Advances in Neural Information Processing Systems 30*, pages 1834-1844. ([link](https://papers.nips.cc/paper/6780-practical-bayesian-optimization-for-model-fitting-with-bayesian-adaptive-direct-search), [arXiv preprint](https://arxiv.org/abs/1705.04405))
You can cite BADS in your work with something along the lines of
> We optimized the log likelihoods of our models using Bayesian adaptive direct search (BADS; Acerbi and Ma, 2017). BADS alternates between a series of fast, local Bayesian optimization steps and a systematic, slower exploration of a mesh grid.
Besides formal citations, you can demonstrate your appreciation for BADS in the following ways:
- *Star* the BADS repository on GitHub;
- [Follow me on Twitter](https://twitter.com/AcerbiLuigi) for updates about BADS and other projects I am involved with;
- Tell me about your model-fitting problem and your experience with BADS (positive or negative) at <luigi.acerbi@helsinki.fi> (putting 'BADS' in the subject of the email).
### License
BADS is released under the terms of the [GNU General Public License v3.0](https://github.com/lacerbi/bads/blob/master/LICENSE.txt).