# MTS-Mixers
This is an official implementation of MTS-Mixers: Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing. [[paper](https://arxiv.org/abs/2302.04501)]
## Key Designs
**1. Overall Framework**
![MTS-Mixers](pics/MTS-Mixers.svg)
The modules in the dashed box define the general framework of MTS-Mixers: $k$ stacked blocks that capture temporal and channel interactions. Three specific implementations are presented, which utilize attention, a random matrix, or a factorized MLP to capture temporal and channel dependencies. An optional input embedding provides positional or date-specific encoding, and the instance normalization refers to [RevIN](https://openreview.net/pdf?id=cGDAkQo1C0p). A sketch of one block follows.
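As a reference, one block of the factorized-MLP variant can be sketched in a few lines of PyTorch. This is a minimal sketch, not the repository's exact `models/MTSMixer.py`: the layer sizes, activation, and residual placement here are assumptions.
```python
import torch.nn as nn

class MixerBlock(nn.Module):
    """Minimal sketch of one MTS-Mixers block: temporal mixing, then channel mixing."""
    def __init__(self, seq_len: int, channels: int, hidden: int):
        super().__init__()
        # temporal MLP mixes information along the time axis
        self.temporal = nn.Sequential(nn.Linear(seq_len, seq_len), nn.GELU())
        # channel MLP mixes information across channels through a small hidden width
        self.channel = nn.Sequential(nn.Linear(channels, hidden), nn.GELU(),
                                     nn.Linear(hidden, channels))

    def forward(self, x):
        # x: [batch, seq_len, channels]
        x = x + self.temporal(x.transpose(1, 2)).transpose(1, 2)  # temporal mixing
        x = x + self.channel(x)                                   # channel mixing
        return x
```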
**2. Temporal Factorization**
Inspired by the fact that a down-sampled sequence may maintain the same temporal characteristics as the original, we apply down-sampling to alleviate temporal redundancy and better utilize point-wise dependencies, as
$$\mathcal{X}_{T,i}=\hat{\mathcal{X}_T}[i::s, :],\quad0\leq i\leq s-1,$$
$$\mathcal{X}_T=\mathsf{merge}(\mathcal{X}_{T,0},\mathcal{X}_{T,1},\dots,\mathcal{X}_{T,s-1}),$$
where $s$ denotes the number of down-sampled subsequences and $[\cdot]$ indicates a slice operation. $\mathsf{merge}(\cdot)$ means we merge the $s$ interleaved subsequences $\mathcal{X}_{T,i}\in\mathbb{R}^{\frac{n}{s}\times c}$ into $\mathcal{X}_T\in\mathbb{R}^{n\times c}$ according to the original order for each point. Here we present an example of temporal factorization when $s=2$.
![temporal_fac](pics/temporal_fac.svg)
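In code, the slice-and-merge scheme is only a few lines of PyTorch. The sketch below (for a single sample of shape `[n, c]`, assuming $n$ is divisible by $s$) illustrates the factorization itself; in the full model, a temporal operator is applied to each subsequence before merging.
```python
import torch

def split(x: torch.Tensor, s: int) -> list:
    # x: [n, c] -> s interleaved subsequences, each [n/s, c]
    return [x[i::s, :] for i in range(s)]

def merge(subseqs: list) -> torch.Tensor:
    # restore the original point order from the s interleaved subsequences
    s = len(subseqs)
    n_sub, c = subseqs[0].shape
    out = torch.empty(n_sub * s, c, dtype=subseqs[0].dtype)
    for i, sub in enumerate(subseqs):
        out[i::s, :] = sub
    return out

x = torch.randn(8, 3)
assert torch.equal(merge(split(x, 2)), x)  # merge inverts the slicing
```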
**3. Channel Factorization**
From the perspective of tensors, we notice that time series generally have a low-rank property: redundancy arises across channels because the information described by different channels may be largely consistent. Inspired by [Hamburger](https://arxiv.org/abs/2109.04553), we apply matrix factorization to reduce the noise as
$$\hat{\mathcal{X}_C}=\mathcal{X}_C+N\approx UV+N,$$
where $N\in\mathbb{R}^{n\times c}$ represents the noise and $\mathcal{X}_C\in\mathbb{R}^{n\times c}$ denotes the channel dependency after denoising. In practice, a channel MLP whose hidden dimension is smaller than $c$ can achieve comparable or even better performance than traditional decomposition methods, as sketched below.
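Concretely, a two-layer channel MLP with hidden width $d < c$ computes $\mathcal{X} W_1 W_2$, whose linear part has rank at most $d$, so it behaves like the factorization $UV$ above. A minimal sketch, where the hidden width `d` is a hypothetical hyperparameter:
```python
import torch.nn as nn

class ChannelMLP(nn.Module):
    """Channel mixing with a small hidden width d < c: an implicit rank-d factorization."""
    def __init__(self, c: int, d: int):
        super().__init__()
        self.down = nn.Linear(c, d)  # project the c channels onto a d-dimensional subspace
        self.up = nn.Linear(d, c)    # map back to the original channel space

    def forward(self, x):
        # x: [batch, n, c]; the composed linear map has rank at most d along the channel axis
        return self.up(self.down(x))
```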
## Get Started
1. Install Python >= 3.6 and PyTorch >= 1.5.0.
2. Run `pip install -r requirements.txt`.
3. Download data and put the `.csv` files in `./dataset`. You can obtain all the benchmarks from [Google Drive](https://drive.google.com/drive/folders/1HMDwy9ouO7FqCgvhN7jhxdFY-UCc303u). **All the datasets are well pre-processed** and can be used easily.
4. Train the model. We provide example commands for all benchmarks in `script.md`. You can change any hyperparameter if necessary; see `run.py` for more details about the hyper-parameter configuration. A sample invocation is shown below.
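For reference, a single training run might look like the following. This is a hypothetical example: the flag names follow the Informer/Autoformer-style `run.py` used in this repo, and the exact commands are listed in `script.md`.
```bash
python run.py --model MTSMixer --data ETTh1 --seq_len 96 --pred_len 96
```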
## Main Results
We conduct experiments on ten benchmarks under the 96-to-x setting, where MTS-Mixers achieves promising performance on forecasting tasks. See our paper for more details.
![results](pics/results.png)
![results_2](pics/results_2.png)
**Q: Why do the results of DLinear differ from those in its original paper?**
The discrepancy arises because DLinear's original paper used a different experimental setting (336-to-x) from ours (96-to-x). We chose a uniform setup for a fair comparison and did not deliberately lower their results.
## ☆ Minor Suggestions
Recent research in long-term time series forecasting has identified two effective techniques for significantly improving forecasting performance. One such technique, implemented in [RevIN](https://github.com/ts-kim/RevIN), normalizes the input data before feeding it into the model and denormalizes the final predictions, as
```python
from RevIN import RevIN  # https://github.com/ts-kim/RevIN

rev = RevIN(num_channels)   # num_channels: the number of series (D)
x = rev(x, 'norm')          # normalize the input, [B, S, D]
pred = model(x)             # forecast, [B, L, D]
pred = rev(pred, 'denorm')  # restore predictions to the original scale
```
In addition to traditional encoder-decoder Transformer-based models, recent works such as DLinear, Crossformer, and PatchTST have improved numerical accuracy on long-term forecasting benchmarks by **using a longer lookback horizon**. Note, however, that this may not be practical for real-world prediction tasks. We hope these insights help guide your work and spare you some potential detours.
## Citation
If you find this repo useful, please cite our paper.
```
@article{Li2023MTSMixersMT,
title={MTS-Mixers: Multivariate Time Series Forecasting via Factorized Temporal and Channel Mixing},
author={Zhe Li and Zhongwen Rao and Lujia Pan and Zenglin Xu},
journal={ArXiv},
year={2023},
volume={abs/2302.04501}
}
```
## Contact
If you have any questions or want to discuss some details, please contact plum271828@gmail.com.
## Acknowledgement
We appreciate the following GitHub repos for their valuable code bases and datasets:
https://github.com/zhouhaoyi/Informer2020
https://github.com/thuml/Autoformer
https://github.com/ts-kim/RevIN
https://github.com/cure-lab/LTSF-Linear