![](./docs/source/_static/logo.svg)
Our article on [Towards Data Science](https://towardsdatascience.com/introducing-pytorch-forecasting-64de99b9ef46)
introduces the package and provides background information.
PyTorch Forecasting aims to ease state-of-the-art timeseries forecasting with neural networks for real-world cases and research alike. The goal is to provide a high-level API with maximum flexibility for professionals and reasonable defaults for beginners.
Specifically, the package provides
- A timeseries dataset class which abstracts handling variable transformations, missing values,
randomized subsampling, multiple history lengths, etc.
- A base model class which provides basic training of timeseries models along with logging in tensorboard
and generic visualizations such as actual vs. predictions and dependency plots
- Multiple neural network architectures for timeseries forecasting that have been enhanced
for real-world deployment and come with in-built interpretation capabilities
- Multi-horizon timeseries metrics
- Ranger optimizer for faster model training
- Hyperparameter tuning with [optuna](https://optuna.readthedocs.io/)
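At the core of the multi-horizon quantile metrics is the pinball loss, which the package's `QuantileLoss` applies across horizons and quantiles. A minimal sketch of the idea for a single observation and quantile level (a hypothetical plain-Python helper, not the package's API):

```python
def pinball_loss(y_true: float, y_pred: float, q: float) -> float:
    """Pinball (quantile) loss for one observation at quantile level q.

    Under-prediction is penalized in proportion to q, over-prediction in
    proportion to (1 - q), so minimizing it estimates the q-th quantile.
    """
    diff = y_true - y_pred
    return max(q * diff, (q - 1) * diff)


# under-predicting by 2 at the median (q=0.5) costs 0.5 * 2 = 1.0
print(pinball_loss(10.0, 8.0, 0.5))  # 1.0
```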
The package is built on [pytorch-lightning](https://pytorch-lightning.readthedocs.io/) to allow training on CPUs,
single and multiple GPUs out-of-the-box.
# Installation
If you are working on Windows, you need to first install PyTorch with
`pip install torch -f https://download.pytorch.org/whl/torch_stable.html`.
Otherwise, you can proceed with
`pip install pytorch-forecasting`
Alternatively, you can install the package via conda
`conda install pytorch-forecasting pytorch -c pytorch>=1.7 -c conda-forge`
PyTorch Forecasting is then installed from the conda-forge channel while PyTorch is installed from the pytorch channel.
# Documentation
Visit [https://pytorch-forecasting.readthedocs.io](https://pytorch-forecasting.readthedocs.io) to read the
documentation with detailed tutorials.
# Available models
- [Temporal Fusion Transformers for Interpretable Multi-horizon Time Series Forecasting](https://arxiv.org/pdf/1912.09363.pdf)
which in benchmarks outperforms Amazon's DeepAR by 36-69%
- [N-BEATS: Neural basis expansion analysis for interpretable time series forecasting](http://arxiv.org/abs/1905.10437)
which has (if used as an ensemble) outperformed all other methods, including ensembles of traditional statistical
methods, in the M4 competition. The M4 competition is arguably the most important benchmark for univariate time series forecasting.
- [DeepAR: Probabilistic forecasting with autoregressive recurrent networks](https://www.sciencedirect.com/science/article/pii/S0169207019301888)
which is one of the most popular forecasting algorithms and is often used as a baseline
- A baseline model that always predicts the latest known value
- Simple standard networks for baselining: LSTM and GRU networks as well as an MLP on the decoder
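The last-value baseline is trivial to express on its own; a minimal plain-Python sketch of the idea (a hypothetical helper, not the package's `Baseline` class):

```python
def naive_forecast(history, horizon):
    """Repeat the most recent observation across the forecast horizon."""
    if not history:
        raise ValueError("history must not be empty")
    return [history[-1]] * horizon


print(naive_forecast([112, 118, 132], horizon=2))  # [132, 132]
```

Despite its simplicity, such a baseline is worth running first: any trained network should beat it by a clear margin.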
To implement new models, see the [How to implement new models tutorial](https://pytorch-forecasting.readthedocs.io/en/latest/tutorials/building.html).
It covers basic as well as advanced architectures.
# Usage
```python
import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping, LearningRateMonitor
from pytorch_forecasting.metrics import QuantileLoss
from pytorch_forecasting import TimeSeriesDataSet, TemporalFusionTransformer
# load data
data = ...
# define dataset
max_encoder_length = 36
max_prediction_length = 6
training_cutoff = "YYYY-MM-DD" # day for cutoff
training = TimeSeriesDataSet(
    data[lambda x: x.date <= training_cutoff],
    time_idx=...,
    target=...,
    group_ids=[...],
    max_encoder_length=max_encoder_length,
    max_prediction_length=max_prediction_length,
    static_categoricals=[...],
    static_reals=[...],
    time_varying_known_categoricals=[...],
    time_varying_known_reals=[...],
    time_varying_unknown_categoricals=[...],
    time_varying_unknown_reals=[...],
)

# create validation set and dataloaders
validation = TimeSeriesDataSet.from_dataset(
    training, data, min_prediction_idx=training.index.time.max() + 1, stop_randomization=True
)
batch_size = 128
train_dataloader = training.to_dataloader(train=True, batch_size=batch_size, num_workers=2)
val_dataloader = validation.to_dataloader(train=False, batch_size=batch_size, num_workers=2)

# define trainer with early stopping
early_stop_callback = EarlyStopping(monitor="val_loss", min_delta=1e-4, patience=1, verbose=False, mode="min")
lr_logger = LearningRateMonitor()
trainer = pl.Trainer(
    max_epochs=100,
    gpus=0,
    gradient_clip_val=0.1,
    limit_train_batches=30,
    callbacks=[lr_logger, early_stop_callback],
)

# create the model
tft = TemporalFusionTransformer.from_dataset(
    training,
    learning_rate=0.03,
    hidden_size=32,
    attention_head_size=1,
    dropout=0.1,
    hidden_continuous_size=16,
    output_size=7,  # seven quantiles by default
    loss=QuantileLoss(),
    log_interval=2,
    reduce_on_plateau_patience=4,
)
print(f"Number of parameters in network: {tft.size() / 1e3:.1f}k")

# find optimal learning rate
res = trainer.lr_find(
    tft,
    train_dataloader=train_dataloader,
    val_dataloaders=val_dataloader,
    early_stop_threshold=1000.0,
    max_lr=0.3,
)
print(f"suggested learning rate: {res.suggestion()}")
fig = res.plot(show=True, suggest=True)
fig.show()

# fit the model
trainer.fit(
    tft,
    train_dataloader=train_dataloader,
    val_dataloaders=val_dataloader,
)
```
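The `max_encoder_length` and `max_prediction_length` arguments above control how each series is split into an encoder (history) window and a decoder (prediction) window. A simplified plain-Python illustration of that windowing (a hypothetical helper, not the library's actual sampling logic, which additionally handles randomized subsampling and variable window lengths):

```python
def sliding_windows(series, encoder_length, prediction_length):
    """Yield (encoder, decoder) window pairs over a single series."""
    total = encoder_length + prediction_length
    for start in range(len(series) - total + 1):
        yield (
            series[start:start + encoder_length],          # model input (history)
            series[start + encoder_length:start + total],  # prediction target
        )


for enc, dec in sliding_windows([1, 2, 3, 4, 5], encoder_length=3, prediction_length=1):
    print(enc, "->", dec)
# [1, 2, 3] -> [4]
# [2, 3, 4] -> [5]
```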