# WeightFreezing
Submitted to *Neural Networks*; [arXiv version](https://arxiv.org/pdf/2306.05775.pdf).
# Description
Source code for the paper: Weight-Freezing: A Regularization Approach for Fully Connected Layers with an Application in EEG Classification
Due to my current inability to afford high open-access fees, the final version of this work may not be open-access. I apologize for any inconvenience caused.
## Important Statement
This is my first work on designing a transformer specifically for EEG. The preliminary experiments for a related project have been conducted, but due to the pressing deadline of my doctoral thesis, I have had to temporarily pause this aspect of my work. After October 2023, I hope to continue researching this topic if given the opportunity (That's when I might visit the University of Vienna).
My doctoral research focuses on the deep integration of artificial neural networks and neuroscience, advocating for the use of lightweight artificial neural network technologies to enhance EEG decoding. It is possible that my doctoral thesis and published papers may not have Chinese versions. However, there is a high-quality Chinese translation of the LMDA-Net paper available on WeChat official account (脑机接口社区), which interested readers can search for.
In the future, I hope to collaborate with internationally renowned research groups to further explore the applications of lightweight artificial neural networks in BCI. My research strengths in this field lie in the deep integration of digital signal processing, machine learning, deep learning, and neuroscience systems. I possess strong problem-solving abilities and have a solid foundation in mathematics and programming. I am comfortable working in an all-English office environment and capable of independently completing research tasks in this field. Additionally, I have previous experience in research on autonomous driving platforms, which has provided me with knowledge in areas such as computer vision and circuitry. I also possess strong teamwork skills.
If your research group is seeking to recruit a postdoctoral researcher in this field, I would greatly appreciate the opportunity for an interview. (mzq@tju.edu.cn)
# Requirements
- Python >= 3.6
- PyTorch >= 1.10
- A GPU is required.
# Contributions
- To the best of our knowledge, this paper is the first to study how the classifier of an ANN affects EEG decoding performance. To this end, we propose Weight-Freezing, which suppresses the influence of selected input neurons on specific decision outputs by freezing some parameters of the fully connected layer, thereby achieving higher classification accuracy.
- Weight-Freezing is also a novel regularization method that achieves sparse connections in the fully connected layer.
- Weight-Freezing is thoroughly validated and analyzed on three classic decoding networks and three highly cited public EEG datasets. The experimental results confirm its superiority and set state-of-the-art classification performance (averaged across all participants) on all three datasets.
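For readers skimming the repository, the core idea can be sketched as follows. This is an illustrative simplification, not the paper's reference implementation (see `modelsWithWeightFreezing.py` for that); the class name, random mask pattern, and `freeze_ratio` parameter here are assumptions made for the example. A fixed binary mask zeroes a fraction of the weights in a fully connected layer, so the corresponding input neurons cannot influence the corresponding decision outputs:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class MaskedLinear(nn.Module):
    """Fully connected layer whose masked-out connections stay fixed at zero.

    Illustrative sketch of the Weight-Freezing idea: a fixed binary mask
    removes a chosen fraction of connections, yielding a sparse classifier.
    """

    def __init__(self, in_features: int, out_features: int,
                 freeze_ratio: float = 0.5, seed: int = 0):
        super().__init__()
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        self.bias = nn.Parameter(torch.zeros(out_features))
        nn.init.xavier_uniform_(self.weight)
        # 1 = trainable connection, 0 = frozen (removed) connection.
        g = torch.Generator().manual_seed(seed)
        mask = (torch.rand(out_features, in_features, generator=g)
                >= freeze_ratio).float()
        self.register_buffer("mask", mask)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Masked entries contribute nothing to the output, no matter how
        # the underlying parameter evolves during training.
        return F.linear(x, self.weight * self.mask, self.bias)
```

In use, such a layer would replace the final `nn.Linear` classifier of a decoding network such as EEGNet or ShallowConvNet, leaving the feature-extraction stages untouched.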
This study's primary contribution is to facilitate the application and deployment of artificial neural network (ANN) models in brain-computer interface (BCI) systems. At the same time, it sets a new performance benchmark for future EEG decoding work with larger models, such as transformers.
Emerging research increasingly adopts transformer networks for EEG decoding. These approaches can be viewed as additive refinements of existing ANN models: they raise EEG classification accuracy through more sophisticated feature-extraction networks, but in doing so they complicate the deployment of those models in real-world BCI systems.
In stark contrast, our study introduces Weight-Freezing as a subtractive strategy that refines existing ANN models. Empowered by Weight-Freezing, lightweight, shallow decoding networks surpass all current transformer-based methods in classification performance on the same public datasets.
Weight-Freezing not only simplifies the deployment of ANN models in BCI systems but also sets a new performance standard for larger models such as transformers. It also raises an intriguing question for EEG decoding: is the deployment of large models like transformers for EEG feature extraction truly indispensable?
# Results
![Classification results, figure 1](https://github.com/MiaoZhengQing/WeightFreezing/assets/116713490/abb617bd-f3ae-418f-9dd5-5ffb24cbbb4f)
![Classification results, figure 2](https://github.com/MiaoZhengQing/WeightFreezing/assets/116713490/5a86123d-852c-405d-b98b-539e039243a6)
# Models Implemented
- [LMDA-Net](https://doi.org/10.1016/j.neuroimage.2023.120209)
- [EEGNet](https://github.com/vlawhern/arl-eegmodels)
- [ShallowConvNet](https://github.com/TNTLFreiburg/braindecode)
# Related works
- This paper is a follow-up to [SDDA](https://arxiv.org/pdf/2202.09559.pdf) and [LMDA-Net](https://doi.org/10.1016/j.neuroimage.2023.120209); the preprocessing method is inherited from SDDA.
# Paper Citation
If you use this idea or code in a scientific publication, please cite:

- Weight-Freezing: Miao Z, Zhao M. Weight Freezing: A Regularization Approach for Fully Connected Layers with an Application in EEG Classification[J]. arXiv preprint arXiv:2306.05775, 2023.
- LMDA-Net: Miao Z, Zhang X, Zhao M, et al. LMDA-Net: A lightweight multi-dimensional attention network for general EEG-based brain-computer interface paradigms and interpretability[J]. arXiv preprint arXiv:2303.16407, 2023.
- SDDA: Miao Z, Zhang X, Menon C, et al. Priming Cross-Session Motor Imagery Classification with a Universal Deep Domain Adaptation Framework[J]. arXiv preprint arXiv:2202.09559, 2022.
```
% Weight-Freezing
@article{miao2023weight,
title={Weight Freezing: A Regularization Approach for Fully Connected Layers with an Application in EEG Classification},
author={Miao, Zhengqing and Zhao, Meirong},
journal={arXiv preprint arXiv:2306.05775},
year={2023},
doi={https://doi.org/10.48550/arXiv.2306.05775},
}
% LMDA
@article{miao2023lmda,
title = {LMDA-Net: A lightweight multi-dimensional attention network for general EEG-based brain-computer interfaces and interpretability},
journal = {NeuroImage},
volume = {276},
pages = {120209},
year = {2023},
issn = {1053-8119},
doi = {https://doi.org/10.1016/j.neuroimage.2023.120209},
url = {https://www.sciencedirect.com/science/article/pii/S1053811923003609},
author = {Zhengqing Miao and Meirong Zhao and Xin Zhang and Dong Ming},
keywords = {Attention, Brain-computer interface (BCI), Electroencephalography (EEG), Model interpretability, Neural networks},
abstract = {Electroencephalography (EEG)-based brain-computer interfaces (BCIs) pose a challenge for decoding due to their low spatial resolution and signal-to-noise ratio. Typically, EEG-based recognition of activities and states involves the use of prior neuroscience knowledge to generate quantitative EEG features, which may limit BCI performance. Although neural network-based methods can effectively extract features, they often encounter issues such as poor generalization across datasets, high predicting volatility, and low model interpretability. To address these limitations, we propose a novel lightweight multi-dimensional attention network, called LMDA-Net. By incorporating two novel attention modules designed specifically for EEG signals, the channel attention module and the depth attention module, LMDA-Net is able to effectively integrate features from multiple dimensions, resulting in improved classification performance across various BCI tasks.},
}
```