# GaussianProcesses.jl
[![Build Status](https://travis-ci.org/STOR-i/GaussianProcesses.jl.png)](https://travis-ci.org/STOR-i/GaussianProcesses.jl)
[![Build status](https://ci.appveyor.com/api/projects/status/github/STOR-i/GaussianProcesses.jl?branch=master&svg=true)](https://ci.appveyor.com/project/STOR-i/gaussianprocesses-jl)
[![Coverage Status](https://coveralls.io/repos/github/STOR-i/GaussianProcesses.jl/badge.svg?branch=master)](https://coveralls.io/github/STOR-i/GaussianProcesses.jl?branch=master)
[![codecov](https://codecov.io/gh/STOR-i/GaussianProcesses.jl/branch/master/graph/badge.svg)](https://codecov.io/gh/STOR-i/GaussianProcesses.jl)
[![](https://img.shields.io/badge/docs-stable-blue.svg)](http://STOR-i.github.io/GaussianProcesses.jl/latest)
A Gaussian Processes package for Julia.
If you have any suggestions to improve the package, or if you've noticed a bug, then please post an [issue](https://github.com/STOR-i/GaussianProcesses.jl/issues/new) for us and we'll get to it as quickly as we can. Pull requests are also welcome.
## Citing GaussianProcesses.jl
To cite GaussianProcesses.jl, please reference the [paper](http://statistik-jstat.uibk.ac.at/article/view/v102i01). A sample BibTeX entry is given below:
```bibtex
@article{fairbrother2022gaussianprocesses,
title={GaussianProcesses.jl: A Nonparametric Bayes Package for the Julia Language},
author={Fairbrother, Jamie and Nemeth, Christopher and Rischard, Maxime and Brea, Johanni and Pinder, Thomas},
journal={Journal of Statistical Software},
volume={102},
pages={1--36},
year={2022}
}
```
## Introduction
Gaussian processes are a family of stochastic processes which provide a flexible nonparametric tool for modelling data. A Gaussian process places a prior over functions and can be described as an infinite-dimensional generalisation of a multivariate Normal distribution: the joint distribution of any finite collection of points is multivariate Normal. The process is fully characterised by its mean and covariance functions, where the mean of any point in the process is described by the *mean function* and the covariance between any two observations is specified by the *kernel*. Given a set of observed real-valued points over a space, the Gaussian process can be used to infer the values at the remaining points in the space.
For an extensive review of Gaussian processes, see the excellent book [Gaussian Processes for Machine Learning](http://www.gaussianprocess.org/gpml/chapters/RW.pdf) by Rasmussen and Williams (2006).
## Installation
GaussianProcesses.jl requires Julia version 0.7 or above. To install GaussianProcesses.jl run the following command inside a Julia session:
```julia
julia> using Pkg
julia> Pkg.add("GaussianProcesses")
```
## Functionality
The package allows the user to fit exact **Gaussian process** models when the observations are Gaussian distributed about the latent function. In the case where the *observations are non-Gaussian*, the posterior distribution of the latent function is intractable. The package allows for Monte Carlo sampling from the posterior.
The main function of the package is `GP`, which fits the Gaussian process
```julia
gp = GP(x, y, mean, kernel)
gp = GP(x, y, mean, kernel, likelihood)
```
for Gaussian and non-Gaussian data respectively.
The package has a number of *mean*, *kernel* and *likelihood* functions available. See the documentation for further details.
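For instance, a simple zero-mean GP regression model can be constructed from toy data as follows. This is a minimal sketch: the kernel parameters (log length-scale and log signal standard deviation) and the `logNoise` value are illustrative starting points, not tuned values.

```julia
using GaussianProcesses

# Toy 1-D regression data
x = 2π * rand(20)               # 20 training inputs
y = sin.(x) + 0.05 * randn(20)  # noisy observations of sin(x)

mZero = MeanZero()    # zero mean function
kern  = SE(0.0, 0.0)  # squared-exponential kernel (log length-scale, log signal std)

# The final argument is the log of the observation-noise standard deviation
gp = GP(x, y, mZero, kern, -2.0)
```

Kernels can also be combined, e.g. `SE(0.0, 0.0) + Periodic(0.0, 0.0, 0.0)` for a sum of a squared-exponential and a periodic kernel.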
### Inference
The parameters of the model can be estimated by maximizing the log-likelihood (where the latent function is integrated out) using the `optimize!` function. In the case of *non-Gaussian data*, an `mcmc` function is available, utilizing a Hamiltonian Monte Carlo sampler. In addition to the HMC sampler, it is possible to sample from the posterior using an elliptical slice sampler, provided that the data has a Gaussian likelihood. Finally, for fast yet approximate inference in the case of Poisson data, a variational approximation can be used to infer the model parameters and latent function values.
```julia
optimize!(gp) # Find parameters which maximize the log-likelihood
mcmc(gp) # Sample from the GP posterior
ess(gp) # Sample from the GP posterior using an elliptical slice sampler
vi(gp) # Create a variational approximation
```
See the [notebooks](https://github.com/STOR-i/GaussianProcesses.jl/tree/master/notebooks) for examples of the functions used in the package.
## Documentation
Documentation is accessible in the Julia REPL's help mode, which can be started by typing `?` at the prompt.
```
julia> ?GP
search: GP GPE GPMC GPBase gperm log1p getpid getproperty MissingException
GP(x, y, mean::Mean, kernel::Kernel[, logNoise::Float64=-2.0])
Fit a Gaussian process that is defined by its mean, its kernel, and the
logarithm logNoise of the standard deviation of its observation noise to a
set of training points x and y.
See also: GPE
────────────────────────────────────────────────────────────────────────────
GP(x, y, mean::Mean, kernel::Kernel, lik::Likelihood)
Fit a Gaussian process that is defined by its mean, its kernel, and its
likelihood function lik to a set of training points x and y.
See also: GPA
```
Alternatively, [online documentation](http://stor-i.github.io/GaussianProcesses.jl/latest/index.html) is available and is under development.
## Notebooks
Sample code is available from the [notebooks](https://github.com/STOR-i/GaussianProcesses.jl/tree/master/notebooks).
## Related packages
[GeoStats](https://github.com/juliohm/GeoStats.jl) - High-performance implementations of geostatistical algorithms for the Julia programming language. This package is in its initial development, and currently only contains Kriging estimation methods. More features will be added as the Julia type system matures.
## ScikitLearn
This package also supports the [ScikitLearn](https://github.com/cstjean/ScikitLearn.jl) interface. ScikitLearn provides many tools for machine learning such as hyperparameter tuning and cross-validation. See [here](https://github.com/cstjean/ScikitLearn.jl/blob/master/examples/Gaussian_Processes_Julia.ipynb) for an example of its usage with this package.