[![Actions Status](https://github.com/wangronin/Bayesian-Optimization/workflows/Build%20and%20Test/badge.svg)](https://github.com/wangronin/Bayesian-Optimization/actions)
# Bayesian Optimization Library
A `Python` implementation of the Bayesian Optimization (BO) algorithm, working on decision spaces composed of real, integer, and categorical variables, or any mixture thereof.
Underpinned by surrogate models, BO iteratively proposes candidate solutions using a so-called **acquisition function**, which balances exploration against exploitation, and updates the surrogate model with the newly observed objective values. The algorithm is designed to optimize **expensive black-box** problems efficiently.
![](assets/BO-example.gif)
## Installation
You can either install the stable version from `pypi`:
```shell
pip install bayes-optim
```
Or take the latest version from GitHub:
```shell
git clone https://github.com/wangronin/Bayesian-Optimization.git
cd Bayesian-Optimization && python setup.py install --user
```
## Example
For real-valued search variables, the simplest usage is via the `fmin` function:
```python
from bayes_optim import fmin
def f(x):
    return sum(xi ** 2 for xi in x)  # the sphere function
minimum = fmin(f, [-5] * 2, [5] * 2, max_FEs=30, seed=42)
```
You can also take much finer control over most ingredients of BO, e.g., the surrogate
model and the acquisition function, as in the example below:
```python
import numpy as np

from bayes_optim import BO, RealSpace
from bayes_optim.Surrogate import GaussianProcess

def fitness(x):  # objective function: the sphere function
    return np.sum(np.asarray(x) ** 2)

dim = 5
lb, ub = -5, 5
space = RealSpace([lb, ub]) * dim  # create the search space

# hyperparameters of the GPR model
thetaL = 1e-10 * (ub - lb) * np.ones(dim)
thetaU = 10 * (ub - lb) * np.ones(dim)
model = GaussianProcess(  # create the GPR model
    thetaL=thetaL, thetaU=thetaU
)

opt = BO(
    search_space=space,
    obj_fun=fitness,
    model=model,
    DoE_size=5,   # number of initial sample points
    max_FEs=50,   # maximal number of function evaluations
    verbose=True,
)
opt.run()
```
For more detailed usage and examples, please check out our [wiki page](https://github.com/wangronin/Bayesian-Optimization/wiki).
## Features
This implementation differs from alternative packages/libraries in the following features:
* **Parallelization**, also known as _batch-sequential optimization_, for which several different approaches are implemented here (see the sketch after this list).
* **Moment-Generating Function of the Improvement** (MGFI) [WvSEB17a], a recently proposed acquisition function that implicitly controls the exploration-exploitation trade-off.
* **Mixed-Integer Evolution Strategy** for optimizing the acquisition function, which is enabled when the search space is a mixture of real, integer, and categorical variables.
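As a quick illustration of the batch-sequential mode, the sketch below uses `ParallelBO`. It assumes that `ParallelBO` shares `BO`'s constructor and takes an `n_point` argument for the batch size; please check the wiki for the exact interface.
```python
import numpy as np

from bayes_optim import ParallelBO, RealSpace
from bayes_optim.Surrogate import GaussianProcess

dim = 5
lb, ub = -5, 5

def fitness(x):  # the sphere function
    return np.sum(np.asarray(x) ** 2)

space = RealSpace([lb, ub]) * dim
model = GaussianProcess(
    thetaL=1e-10 * (ub - lb) * np.ones(dim),
    thetaU=10 * (ub - lb) * np.ones(dim),
)
opt = ParallelBO(
    search_space=space,
    obj_fun=fitness,
    model=model,
    DoE_size=5,
    max_FEs=50,
    n_point=3,  # assumed batch size: 3 candidate points proposed per iteration
    verbose=True,
)
opt.run()
```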
## Project Structure
* `bayes-optim/SearchSpace.py`: implementation of the search/decision space.
* `bayes-optim/base.py`: the base class of Bayesian Optimization.
* `bayes-optim/AcquisitionFunction.py`: the implementation of acquisition functions (see below for the list of implemented ones).
* `bayes-optim/Surrogate`: implementations of Gaussian Process Regression (GPR) and Random Forest (RF).
* `bayes-optim/BayesOpt.py` contains several BO variants:
* `BO`: noiseless + sequential
* `ParallelBO`: noiseless + parallel (a.k.a. batch-sequential)
  * `AnnealingBO`: noiseless + parallel + annealing [WEB18]
* `SelfAdaptiveBO`: noiseless + parallel + self-adaptive [WEB19]
  * `NoisyBO`: noisy + parallel (see the sketch below)
* `bayes-optim/Extension.py` is meant to include the latest developments that are not yet extensively tested:
* `PCABO`: noiseless + parallel + PCA-assisted dimensionality reduction [RaponiWBBD20] **[Under Construction]**
* `MultiAcquisitionBO`: noiseless + parallelization with multiple different acquisition functions **[Under Construction]**
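For noisy objective functions, `NoisyBO` is the variant to reach for. A minimal sketch, assuming `NoisyBO` shares `BO`'s constructor and that `RandomForest` is exported from `bayes_optim.Surrogate` (RF is the default model; see the wiki for the exact interface):
```python
import numpy as np

from bayes_optim import NoisyBO, RealSpace
from bayes_optim.Surrogate import RandomForest

dim = 3
space = RealSpace([-5, 5]) * dim

def noisy_fitness(x):  # the sphere function corrupted by Gaussian noise
    return np.sum(np.asarray(x) ** 2) + np.random.normal(0, 0.1)

opt = NoisyBO(
    search_space=space,
    obj_fun=noisy_fitness,
    model=RandomForest(),  # assumed default-constructible
    DoE_size=10,
    max_FEs=60,
    verbose=True,
)
opt.run()
```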
<!-- * `optimizer/`: the optimization algorithm to maximize the infill-criteria, two algorithms are implemented:
1. **CMA-ES**: Covariance Martix Adaptation Evolution Strategy for _continuous_ optimization problems.
2. **MIES**: Mixed-Integer Evolution Strategy for mixed-integer/categorical optimization problems. -->
## Acquisition Functions
The following infill-criteria are implemented in the library:
* _Expected Improvement_ (EI)
* _Probability of Improvement_ (PI)
* _Upper Confidence Bound_ (UCB)
* _Moment-Generating Function of Improvement_ (MGFI)
* _Generalized Expected Improvement_ (GEI) **[Under Construction]**
In the sequential working mode, Expected Improvement is used by default; in the parallelization mode, MGFI is enabled by default.
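To make the criteria concrete, the following library-independent snippet computes the standard Expected Improvement (for minimization) from a surrogate's predictive mean and standard deviation; it illustrates the formula only, not this library's internals:
```python
import numpy as np
from scipy.stats import norm

def expected_improvement(mu, sigma, f_best):
    """EI for minimization: E[max(f_best - f(x), 0)] under a
    Gaussian predictive distribution N(mu, sigma^2)."""
    mu, sigma = np.asarray(mu), np.asarray(sigma)
    with np.errstate(divide="ignore", invalid="ignore"):
        z = (f_best - mu) / sigma
        ei = (f_best - mu) * norm.cdf(z) + sigma * norm.pdf(z)
    return np.where(sigma > 0, ei, 0.0)  # EI vanishes where the model is certain
```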
## Surrogate Model
The surrogate (meta-)model approximates the expensive objective function in Bayesian optimization. The basic requirement for such a model is that it provides an uncertainty quantification (either empirical or theoretical) for its predictions. To handle categorical data easily, a __random forest__ model is used by default. The implementation here is based on the one in _scikit-learn_, with modifications to the uncertainty quantification.
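For illustration, an empirical uncertainty estimate can be derived from the spread of the per-tree predictions. The stand-alone sketch below uses plain _scikit-learn_; it shows the idea only and is not this library's exact implementation:
```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(42)
X = rng.uniform(-5, 5, size=(30, 2))  # training inputs
y = np.sum(X ** 2, axis=1)            # training targets (sphere function)

rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

X_new = rng.uniform(-5, 5, size=(5, 2))
per_tree = np.stack([t.predict(X_new) for t in rf.estimators_])  # (n_trees, n_points)
mean = per_tree.mean(axis=0)  # point prediction
std = per_tree.std(axis=0)    # empirical uncertainty across the ensemble
```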
## A Brief Introduction to Bayesian Optimization
Bayesian Optimization (BO) [Moc74, JSW98] is a sequential optimization strategy originally proposed to solve single-objective black-box optimization problems that are costly to evaluate. Here, we shall restrict our discussion to the single-objective case. BO typically starts by sampling an initial design of experiment (DoE) of size n, X={x<sub>1</sub>,x<sub>2</sub>,...,x<sub>n</sub>}, which is usually generated by simple random sampling, Latin Hypercube Sampling [SWN03], or a more sophisticated low-discrepancy sequence [Nie88] (e.g., Sobol sequences). Taking the initial DoE X and its corresponding objective values, Y={f(x<sub>1</sub>), f(x<sub>2</sub>),..., f(x<sub>n</sub>)} ⊆ ℝ, we proceed to construct a statistical model M describing the probability distribution of the objective function f conditioned on the initial evidence, namely Pr(f|X,Y). In most application scenarios of BO, there is a lack of a priori knowledge about f, and therefore nonparametric models (e.g., Gaussian process regression or random forest) are commonly chosen for M. This gives rise to a predictor f'(x) for all x in the search space, together with an uncertainty quantification s'(x) that estimates, for instance, the mean squared error of the prediction E(f'(x)−f(x))<sup>2</sup>. Based on f' and s', promising points can be identified via the so-called acquisition function, which balances exploitation with exploration of the optimization process.
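The loop described above fits in a few lines. The toy implementation below is only a sketch of the idea: it uses scikit-learn's GPR as the model M, a (lower) confidence bound as the acquisition function, and a random candidate pool in place of a proper inner optimizer:
```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def toy_bo(f, lb, ub, dim, doe_size=5, max_fes=30, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(doe_size, dim))  # initial DoE (simple random sampling)
    y = np.array([f(x) for x in X])                # expensive evaluations
    while len(y) < max_fes:
        model = GaussianProcessRegressor(normalize_y=True).fit(X, y)  # the model M
        cand = rng.uniform(lb, ub, size=(2000, dim))                  # random candidate pool
        mu, s = model.predict(cand, return_std=True)                  # f'(x) and s'(x)
        x_next = cand[np.argmin(mu - 2.0 * s)]  # confidence-bound acquisition (minimization)
        X = np.vstack([X, x_next])
        y = np.append(y, f(x_next))
    i = int(np.argmin(y))
    return X[i], y[i]

xbest, fbest = toy_bo(lambda x: np.sum(x ** 2), -5, 5, dim=2)
```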
<!-- Bayesian optimization is a __sequential design strategy__ that does not require the derivatives of the objective function and is designed to solve expensive global optimization problems. Compared to alternative optimization algorithms (or other design of experiment methods), the very distinctive feature of this method is the usage of a __posterior distribution__ over the (partially) unknown objective function, which is obtained via __Bayesian inference__. This optimization framework was proposed by Jonas Mockus and Antanas Zilinskas, et al.
Formally, the goal is to approach the global optimum, using a sequence of variables:
$$\mathbf{x}_1,\mathbf{x}_2, \ldots, \mathbf{x}_n \in S \subseteq \mathbb{R}^d,$$
which resembles the search sequence in stochastic hill-climbing, simulated annealing and (1+1)-strategies. The only difference is that such a sequence is __not__ necessarily random and is actually deterministic (in principle) for Bayesian optimization. In order to approach the global optimum, the algorithm iteratively seeks an optimal choice of the next candidate variable, and this choice can be considered as a decision function:
$$\mathbf{x}_{n+1} = d_n\left(\{\mathbf{x}_i\}_{i=1}^n, \{y_i\}_{i=1}^n \right), \quad y_i = f(\mathbf{x}_i) + \varepsilon,$$
meaning that it takes the history of the optimization into account in order to make a decision. The quality of a decision can be measured by the following loss function, the optimality error or optimality gap:
$$\epsilon(f, d_n) = f(\mathbf{x}_{n+1}) - \inf_{\mathbf{x} \in S} f(\mathbf{x}).$$ -->