# ETSformer: Exponential Smoothing Transformers for Time-series Forecasting
<p align="center">
<img src="./pics/etsformer.png" width = "700" alt="" align=center />
<br><br>
<b>Figure 1.</b> Overall ETSformer Architecture.
</p>
Official PyTorch code repository for the [ETSformer paper](https://arxiv.org/abs/2202.01381). Check out our [blog post](https://blog.salesforceairesearch.com/etsformer-time-series-forecasting/)!
* ETSformer is a novel time-series Transformer architecture which exploits the principle of exponential smoothing to improve
Transformers for time-series forecasting.
* ETSformer is inspired by the classical exponential smoothing methods in
time-series forecasting, leveraging the novel exponential smoothing attention (ESA) and frequency attention (FA) to
replace the self-attention mechanism in vanilla Transformers, thus improving both accuracy and efficiency.
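As a rough illustration of these two principles (not the actual ETSformer modules), the hypothetical helpers below sketch exponential-smoothing weights over past time steps and a top-K Fourier filter; the function names and the `alpha`/`k` parameters are illustrative assumptions, not the repository's API:

```python
import numpy as np

def es_attention(values, alpha=0.5):
    """Illustrative exponential smoothing 'attention' over a 1-D series.

    The weight on the value j steps in the past is alpha * (1 - alpha)**j,
    so recent observations dominate.
    """
    T = len(values)
    weights = alpha * (1.0 - alpha) ** np.arange(T)
    # values[::-1] puts the most recent observation first, matching lag 0.
    return float(np.dot(weights, values[::-1]))

def frequency_attention(x, k=2):
    """Illustrative frequency 'attention': keep only the k largest Fourier modes."""
    f = np.fft.rfft(x)
    drop = np.argsort(np.abs(f))[:-k]  # everything except the top-k modes
    f[drop] = 0.0
    return np.fft.irfft(f, n=len(x))

series = np.array([1.0, 2.0, 3.0, 4.0])
print(es_attention(series, alpha=0.5))  # 0.5*4 + 0.25*3 + 0.125*2 + 0.0625*1 = 3.0625
```

The smoothing weights decay geometrically with lag, which is what lets ESA attend over long histories at low cost, while the Fourier filter captures the idea of FA extracting dominant periodic patterns.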
## Requirements
1. Install Python 3.8.
2. Install the required dependencies: ```pip install -r requirements.txt```
## Data
* Pre-processed datasets can be downloaded from the following
links, [Tsinghua Cloud](https://cloud.tsinghua.edu.cn/d/e1ccfff39ad541908bae/)
or [Google Drive](https://drive.google.com/drive/folders/1ZOYpTUa82_jCcxIdTmyr0LXQfvaM9vIy?usp=sharing), as obtained
from [Autoformer's](https://github.com/thuml/Autoformer) GitHub repository.
* Place the downloaded datasets into the `dataset/` folder, e.g. `dataset/ETT-small/ETTm2.csv`.
## Usage
1. Install the required dependencies.
2. Download data as above, and place them in the folder, `dataset/`.
3. Train the model. We provide the experiment scripts of all benchmarks under the folder `./scripts`,
e.g. `./scripts/ETTm2.sh`. You might have to make the script files executable by running `chmod u+x scripts/*`.
4. The script for grid search is also provided, and can be run by `./grid_search.sh`.
## Acknowledgements
The implementation of ETSformer relies on resources from the following codebases and repositories; we thank the original
authors for open-sourcing their work.
* https://github.com/thuml/Autoformer
* https://github.com/zhouhaoyi/Informer2020
## Citation
Please consider citing if you find this code useful in your research.
<pre>@article{woo2022etsformer,
title={ETSformer: Exponential Smoothing Transformers for Time-series Forecasting},
author={Gerald Woo and Chenghao Liu and Doyen Sahoo and Akshat Kumar and Steven C. H. Hoi},
year={2022},
url={https://arxiv.org/abs/2202.01381},
}</pre>