# ABSA-PyTorch
> Aspect Based Sentiment Analysis, PyTorch Implementations.
>
> Aspect-based sentiment analysis, implemented with PyTorch.
![LICENSE](https://img.shields.io/packagist/l/doctrine/orm.svg)
[![Gitter](https://badges.gitter.im/ABSA-PyTorch/community.svg)](https://gitter.im/ABSA-PyTorch/community?utm_source=badge&utm_medium=badge&utm_campaign=pr-badge)
<!-- ALL-CONTRIBUTORS-BADGE:START - Do not remove or modify this section -->
[![All Contributors](https://img.shields.io/badge/all_contributors-9-orange.svg?style=flat-square)](#contributors-)
<!-- ALL-CONTRIBUTORS-BADGE:END -->
## Requirement
* pytorch >= 0.4.0
* numpy >= 1.13.3
* sklearn
* python 3.6 / 3.7
* transformers
To install requirements, run `pip install -r requirements.txt`.
* For non-BERT-based models, [GloVe pre-trained word vectors](https://github.com/stanfordnlp/GloVe#download-pre-trained-word-vectors) are required; see [data_utils.py](./data_utils.py) for details. A rough loading sketch follows this list.
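The exact preprocessing lives in [data_utils.py](./data_utils.py); purely as an illustration of the idea, a GloVe-backed embedding matrix for a known vocabulary could be built like this (the function name, file name, and OOV handling below are illustrative assumptions, not the repo's API):

```python
import numpy as np

def build_embedding_matrix(word2idx, glove_path="glove.42B.300d.txt", dim=300):
    """Return a (|V|+1, dim) matrix of GloVe vectors; words missing from GloVe stay random."""
    matrix = np.random.uniform(-0.25, 0.25, (len(word2idx) + 1, dim)).astype("float32")
    matrix[0] = 0.0  # row 0 is reserved for padding
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            tokens = line.rstrip().split(" ")
            word, vec = tokens[0], tokens[1:]
            if word in word2idx:
                matrix[word2idx[word]] = np.asarray(vec, dtype="float32")
    return matrix
```

The resulting matrix can then initialize an embedding layer, e.g. via `torch.nn.Embedding.from_pretrained`.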
## Usage
### Training
```sh
python train.py --model_name bert_spc --dataset restaurant
```
* All implemented models are listed in [models directory](./models/).
* See [train.py](./train.py) for more training arguments.
* Refer to [train_k_fold_cross_val.py](./train_k_fold_cross_val.py) for k-fold cross validation support.
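Independent of the repo's own script, the fold splitting itself can be sketched with scikit-learn, which is already a requirement (the placeholder data below stands in for the examples that [data_utils.py](./data_utils.py) loads from the dataset files):

```python
from sklearn.model_selection import KFold

# Placeholder data: in the real script the examples come from the SemEval files.
examples = [f"sentence {i}" for i in range(100)]

kf = KFold(n_splits=10, shuffle=True, random_state=42)
fold_scores = []
for fold, (train_idx, val_idx) in enumerate(kf.split(examples)):
    train_split = [examples[i] for i in train_idx]
    val_split = [examples[i] for i in val_idx]
    # ... train a fresh model on train_split, evaluate on val_split ...
    fold_scores.append(0.0)  # collect the fold's accuracy / macro-F1 here

mean_score = sum(fold_scores) / len(fold_scores)  # the number usually reported
```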
### Inference
* Refer to [infer_example.py](./infer_example.py) for both non-BERT-based models and BERT-based models.
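[infer_example.py](./infer_example.py) is the reference; stripped of model-specific details, prediction from a trained checkpoint boils down to a forward pass in eval mode, roughly as below (the label order assumes the usual negative/neutral/positive SemEval mapping, and `model`/`inputs` are assumed to come from the repo's model classes and tokenizer):

```python
import torch
import torch.nn.functional as F

def predict_polarity(model, inputs, labels=("negative", "neutral", "positive")):
    """Run one forward pass and return the predicted label plus class probabilities.

    `model` is assumed to be one of the classes in ./models/ restored from a checkpoint,
    and `inputs` the tensor(s) produced by the tokenizer in data_utils.py.
    """
    model.eval()                      # disable dropout etc.
    with torch.no_grad():             # no gradients needed at inference time
        logits = model(inputs)        # shape: (1, len(labels))
        probs = F.softmax(logits, dim=-1)
    return labels[probs.argmax(dim=-1).item()], probs
```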
### Tips
* For non-BERT-based models, the training procedure is not very stable; results may vary across runs.
* BERT-based models are more sensitive to hyperparameters (especially learning rate) on small data sets, see [this issue](https://github.com/songyouwei/ABSA-PyTorch/issues/27).
* Fine-tuning on the target task is necessary to unlock the full potential of BERT; a minimal optimizer setup is sketched below.
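As a concrete illustration of the learning-rate tip, a conservative optimizer setup for a BERT-based model might look like this; the values are commonly used fine-tuning settings, not repo defaults:

```python
import torch
from transformers import BertModel, get_linear_schedule_with_warmup

# Learning rates around 1e-5 to 5e-5 are typical for BERT fine-tuning; larger values
# tend to diverge on small datasets such as the SemEval ones.
bert = BertModel.from_pretrained("bert-base-uncased")
optimizer = torch.optim.AdamW(bert.parameters(), lr=2e-5, weight_decay=0.01)

# A short warmup also helps stabilize the first updates (step counts are illustrative).
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=100, num_training_steps=1000
)
```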
## Reviews / Surveys
Qiu, Xipeng, et al. "Pre-trained Models for Natural Language Processing: A Survey." arXiv preprint arXiv:2003.08271 (2020). [[pdf]](https://arxiv.org/pdf/2003.08271)

Zhang, Lei, Shuai Wang, and Bing Liu. "Deep Learning for Sentiment Analysis: A Survey." arXiv preprint arXiv:1801.07883 (2018). [[pdf]](https://arxiv.org/pdf/1801.07883)

Young, Tom, et al. "Recent trends in deep learning based natural language processing." arXiv preprint arXiv:1708.02709 (2017). [[pdf]](https://arxiv.org/pdf/1708.02709)
## BERT-based models
### BERT-ADA ([official](https://github.com/deepopinion/domain-adapted-atsc))
Rietzler, Alexander, et al. "Adapt or get left behind: Domain adaptation through bert language model finetuning for aspect-target sentiment classification." arXiv preprint arXiv:1908.11860 (2019). [[pdf](https://arxiv.org/pdf/1908.11860)]
### BERT-PT ([official](https://github.com/howardhsu/BERT-for-RRC-ABSA))
Xu, Hu, et al. "Bert post-training for review reading comprehension and aspect-based sentiment analysis." arXiv preprint arXiv:1904.02232 (2019). [[pdf](https://arxiv.org/pdf/1904.02232)]
### ABSA-BERT-pair ([official](https://github.com/HSLCY/ABSA-BERT-pair))
Sun, Chi, Luyao Huang, and Xipeng Qiu. "Utilizing bert for aspect-based sentiment analysis via constructing auxiliary sentence." arXiv preprint arXiv:1903.09588 (2019). [[pdf](https://arxiv.org/pdf/1903.09588.pdf)]
### LCF-BERT ([lcf_bert.py](./models/lcf_bert.py)) ([official](https://github.com/yangheng95/LCF-ABSA))
Zeng Biqing, Yang Heng, et al. "LCF: A Local Context Focus Mechanism for Aspect-Based Sentiment Classification." Applied Sciences. 2019, 9, 3389. [[pdf]](https://www.mdpi.com/2076-3417/9/16/3389/pdf)
### AEN-BERT ([aen.py](./models/aen.py))
Song, Youwei, et al. "Attentional Encoder Network for Targeted Sentiment Classification." arXiv preprint arXiv:1902.09314 (2019). [[pdf]](https://arxiv.org/pdf/1902.09314.pdf)
### BERT for Sentence Pair Classification ([bert_spc.py](./models/bert_spc.py))
Devlin, Jacob, et al. "Bert: Pre-training of deep bidirectional transformers for language understanding." arXiv preprint arXiv:1810.04805 (2018). [[pdf]](https://arxiv.org/pdf/1810.04805.pdf)
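The sentence-pair idea behind bert_spc is BERT's standard two-segment input, with the review sentence as segment A and the aspect term as segment B. A minimal sketch with a recent `transformers` version follows; the classifier head and model name are illustrative, not a copy of [bert_spc.py](./models/bert_spc.py):

```python
import torch.nn as nn
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
bert = BertModel.from_pretrained("bert-base-uncased")
classifier = nn.Linear(bert.config.hidden_size, 3)   # negative / neutral / positive

# The tokenizer builds the "[CLS] sentence [SEP] aspect [SEP]" pair encoding.
enc = tokenizer("the battery life is great", "battery life", return_tensors="pt")
pooled = bert(**enc).pooler_output                   # pooled [CLS] representation
logits = classifier(pooled)                          # shape: (1, 3)
```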
## Non-BERT-based models
### ASGCN ([asgcn.py](./models/asgcn.py)) ([official](https://github.com/GeneZC/ASGCN))
Zhang, Chen, et al. "Aspect-based Sentiment Classification with Aspect-specific Graph Convolutional Networks." Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing. 2019. [[pdf]](https://www.aclweb.org/anthology/D19-1464)
### MGAN ([mgan.py](./models/mgan.py))
Fan, Feifan, et al. "Multi-grained Attention Network for Aspect-Level Sentiment Classification." Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing. 2018. [[pdf]](http://aclweb.org/anthology/D18-1380)
### AOA ([aoa.py](./models/aoa.py))
Huang, Binxuan, et al. "Aspect Level Sentiment Classification with Attention-over-Attention Neural Networks." arXiv preprint arXiv:1804.06536 (2018). [[pdf]](https://arxiv.org/pdf/1804.06536.pdf)
### TNet ([tnet_lf.py](./models/tnet_lf.py)) ([official](https://github.com/lixin4ever/TNet))
Li, Xin, et al. "Transformation Networks for Target-Oriented Sentiment Classification." arXiv preprint arXiv:1805.01086 (2018). [[pdf]](https://arxiv.org/pdf/1805.01086)
### Cabasc ([cabasc.py](./models/cabasc.py))
Liu, Qiao, et al. "Content Attention Model for Aspect Based Sentiment Analysis." Proceedings of the 2018 World Wide Web Conference on World Wide Web. International World Wide Web Conferences Steering Committee, 2018.
### RAM ([ram.py](./models/ram.py))
Chen, Peng, et al. "Recurrent Attention Network on Memory for Aspect Sentiment Analysis." Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing. 2017. [[pdf]](http://www.aclweb.org/anthology/D17-1047)
### MemNet ([memnet.py](./models/memnet.py)) ([official](https://drive.google.com/open?id=1Hc886aivHmIzwlawapzbpRdTfPoTyi1U))
Tang, Duyu, B. Qin, and T. Liu. "Aspect Level Sentiment Classification with Deep Memory Network." Conference on Empirical Methods in Natural Language Processing 2016:214-224. [[pdf]](https://arxiv.org/pdf/1605.08900)
### IAN ([ian.py](./models/ian.py))
Ma, Dehong, et al. "Interactive Attention Networks for Aspect-Level Sentiment Classification." arXiv preprint arXiv:1709.00893 (2017). [[pdf]](https://arxiv.org/pdf/1709.00893)
### ATAE-LSTM ([atae_lstm.py](./models/atae_lstm.py))
Wang, Yequan, Minlie Huang, and Li Zhao. "Attention-based lstm for aspect-level sentiment classification." Proceedings of the 2016 conference on empirical methods in natural language processing. 2016.
### TD-LSTM ([td_lstm.py](./models/td_lstm.py), [tc_lstm.py](./models/tc_lstm.py)) ([official](https://drive.google.com/open?id=17RF8MZs456ov9MDiUYZp0SCGL6LvBQl6))
Tang, Duyu, et al. "Effective LSTMs for Target-Dependent Sentiment Classification." Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers. 2016. [[pdf]](https://arxiv.org/pdf/1512.01100)
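TD-LSTM models the target by running one LSTM over the left context plus the target and another over the right context plus the target (fed in reverse), then classifying on the concatenated final hidden states. A compact sketch of that idea is below; dimensions and the dummy inputs are illustrative, and [td_lstm.py](./models/td_lstm.py) remains the reference implementation:

```python
import torch
import torch.nn as nn

class TDLSTMSketch(nn.Module):
    def __init__(self, vocab_size=5000, emb_dim=300, hidden_dim=300, polarities=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.lstm_l = nn.LSTM(emb_dim, hidden_dim, batch_first=True)  # left context + target
        self.lstm_r = nn.LSTM(emb_dim, hidden_dim, batch_first=True)  # right context + target, reversed
        self.fc = nn.Linear(hidden_dim * 2, polarities)

    def forward(self, left_with_target, right_with_target_reversed):
        _, (h_l, _) = self.lstm_l(self.embed(left_with_target))
        _, (h_r, _) = self.lstm_r(self.embed(right_with_target_reversed))
        # Concatenate the two final hidden states and classify.
        return self.fc(torch.cat((h_l[-1], h_r[-1]), dim=-1))

# Dummy word-index tensors for a batch of one sentence.
model = TDLSTMSketch()
left = torch.randint(1, 5000, (1, 7))
right = torch.randint(1, 5000, (1, 5))
logits = model(left, right)   # shape: (1, 3)
```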
### LSTM ([lstm.py](./models/lstm.py))
Hochreiter, Sepp, and Jürgen Schmidhuber. "Long short-term memory." Neural computation 9.8 (1997): 1735-1780. [[pdf](http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.676.4320&rep=rep1&type=pdf)]
## Note on running with RTX30*
If you are running on an RTX 30-series GPU, there may be compatibility issues between the installed/required versions of torch and CUDA.
In that case, install from `requirements_rtx30.txt` instead of `requirements.txt` (e.g. `pip install -r requirements_rtx30.txt`).
## Contributors
Thanks goes to these wonderful people:
<!-- ALL-CONTRIBUTORS-LIST:START - Do not remove or modify this section -->
<!-- prettier-ignore-start -->
<!-- markdownlint-disable -->
<table>
<tr>
<td align="center"><a href="https://github.com/AlbertoPaz"><img src="https://avatars2.githubusercontent.com/u/36967362?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Alberto Paz</b></sub></a><br /><a href="https://github.com/songyouwei/ABSA-PyTorch/commits?author=AlbertoPaz" title="Code">ð»</a></td>
<td align="center"><a href="http://taojiang0923@gmail.com"><img src="https://avatars0.githubuserconten