# **:triangular_flag_on_post: The complete code and scripts of TimesNet have been included in [[Time-Series-Library]](https://github.com/thuml/Time-Series-Library).**
# TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis [[ICLR 2023]](https://openreview.net/pdf?id=ju_Uqw384Oq)
<p align="center">
<img src="./pic/overall.png" height = "300" alt="" align=center />
</p>
In this paper, we present TimesNet as a powerful foundation model for general time series analysis, which can

- Achieve consistent state-of-the-art performance in five mainstream tasks: **Long- and Short-term Forecasting, Imputation, Anomaly Detection, and Classification**.
- Directly take advantage of booming vision backbones by transforming the 1D time series into 2D space.
## Temporal 1D-Variation vs. 2D-Variation
Temporal variation modeling is the key problem shared by extensive analysis tasks. Previous methods attempt to model these variations directly from the 1D time series, which is extremely challenging due to the intricate temporal patterns. Based on the observation of multi-periodicity in time series, we present TimesNet to **transform the original 1D time series into 2D space**, which can unify the intraperiod and interperiod variations.
<p align="center">
<img src="./pic/timesnet.png" height = "300" alt="" align=center />
</p>
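The core of this transform is period discovery: the dominant periods of a series can be estimated from the amplitudes of its FFT, and the series then folded into one 2D matrix per period. Below is a minimal NumPy sketch of this idea for a single univariate series; the function name `fold_to_2d` is hypothetical, and the repo's actual implementation operates on batched PyTorch tensors inside the TimesBlock.

```python
import numpy as np

def fold_to_2d(x, k=2):
    """Sketch of the 1D -> 2D transform: estimate the top-k dominant
    periods via FFT amplitudes, then reshape the series into
    [num_periods x period] matrices (one per detected period)."""
    n = len(x)
    amp = np.abs(np.fft.rfft(x))
    amp[0] = 0                       # ignore the DC (mean) component
    top_freqs = np.argsort(amp)[-k:] # top-k frequencies by amplitude
    planes = []
    for freq in top_freqs:
        period = n // freq
        m = n // period              # number of complete periods
        planes.append(x[:m * period].reshape(m, period))
    return planes

# Example: a series with a strong period of 24 steps
t = np.arange(240)
x = np.sin(2 * np.pi * t / 24)
planes = fold_to_2d(x, k=1)
print(planes[0].shape)  # (10, 24): 10 periods of length 24
```

Once folded, each row of a plane covers one period (intraperiod variation) while each column aligns the same phase across periods (interperiod variation), which is what makes 2D vision backbones applicable.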
## General Representation Learning Capacity
To demonstrate the model's capacity for representation learning, we calculate the [CKA similarity](https://github.com/jayroxis/CKA-similarity) between the representations from the bottom and top layers of each model. A smaller CKA similarity means that the bottom- and top-layer representations are more distinct, indicating hierarchical representations. From this representation analysis, we find that:

- **Forecasting and anomaly detection tasks require low-level representations.**
- **Imputation and classification tasks expect hierarchical representations.**

Benefiting from its 2D kernel design, **TimesNet (marked by red stars) can learn appropriate representations for different tasks**, demonstrating its task generality as a foundation model.
<p align="center">
<img src="./pic/representation.png" height = "200" alt="" align=center />
</p>
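For reference, the linear variant of CKA used in this kind of analysis can be computed in a few lines. This is a minimal sketch from the standard definition (centered features, HSIC-based normalization), not code taken from the linked repo, which also provides kernel variants.

```python
import numpy as np

def linear_cka(X, Y):
    """Linear CKA between two representation matrices of shape
    [n_samples, n_features]; feature dimensions may differ."""
    X = X - X.mean(axis=0)           # center each feature
    Y = Y - Y.mean(axis=0)
    hsic = np.linalg.norm(Y.T @ X, 'fro') ** 2
    norm_x = np.linalg.norm(X.T @ X, 'fro')
    norm_y = np.linalg.norm(Y.T @ Y, 'fro')
    return hsic / (norm_x * norm_y)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 64))
print(round(linear_cka(X, X), 4))                        # 1.0 for identical features
print(linear_cka(X, rng.normal(size=(100, 32))))         # low score for unrelated features
```

Comparing `linear_cka(bottom_repr, top_repr)` across models is what produces the scatter in the figure above: a low score indicates the layers learned distinct, hierarchical features.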
## Leaderboard for Time Series Analysis
In this paper, we also provide a comprehensive benchmark to evaluate different backbones. **More than 15 advanced baselines are compared.** As of February 2023, the top three models for the five tasks are:
| Model<br>Ranking | Long-term<br>Forecasting | Short-term<br>Forecasting | Imputation | Anomaly<br>Detection | Classification |
| ---------------- | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | -------------------------------------------------- |
| 🥇 1st | [TimesNet](https://arxiv.org/abs/2210.02186) | [TimesNet](https://arxiv.org/abs/2210.02186) | [TimesNet](https://arxiv.org/abs/2210.02186) | [TimesNet](https://arxiv.org/abs/2210.02186) | [TimesNet](https://arxiv.org/abs/2210.02186) |
| 🥈 2nd | [DLinear](https://github.com/cure-lab/LTSF-Linear) | [Non-stationary<br/>Transformer](https://github.com/thuml/Nonstationary_Transformers) | [Non-stationary<br/>Transformer](https://github.com/thuml/Nonstationary_Transformers) | [Non-stationary<br/>Transformer](https://github.com/thuml/Nonstationary_Transformers) | [FEDformer](https://github.com/MAZiqing/FEDformer) |
| 🥉 3rd | [Non-stationary<br>Transformer](https://github.com/thuml/Nonstationary_Transformers) | [FEDformer](https://github.com/MAZiqing/FEDformer) | [Autoformer](https://github.com/thuml/Autoformer) | [Informer](https://github.com/zhouhaoyi/Informer2020) | [Autoformer](https://github.com/thuml/Autoformer) |
See our [paper](https://openreview.net/pdf?id=ju_Uqw384Oq) for the comprehensive benchmark.
## Citation
If you find this repo useful, please cite our paper.
```bibtex
@inproceedings{wu2023timesnet,
title={TimesNet: Temporal 2D-Variation Modeling for General Time Series Analysis},
author={Haixu Wu and Tengge Hu and Yong Liu and Hang Zhou and Jianmin Wang and Mingsheng Long},
booktitle={International Conference on Learning Representations},
year={2023},
}
```
## Contact
If you have any questions, please contact whx20@mails.tsinghua.edu.cn.
## Acknowledgement
We appreciate the following GitHub repos for their valuable codebases:
- Forecasting: https://github.com/thuml/Autoformer
- Anomaly Detection: https://github.com/thuml/Anomaly-Transformer
- Classification: https://github.com/thuml/Flowformer