# This repository collects BERT-related resources.
See also: [Jiakui/awesome-gcn](https://github.com/Jiakui/awesome-gcn), a companion repository collecting resources for graph convolutional networks (图卷积神经网络相关资源).
# Papers:
1. [arXiv:1810.04805](https://arxiv.org/abs/1810.04805), BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding, Authors: Jacob Devlin, Ming-Wei Chang, Kenton Lee, Kristina Toutanova
<details>
<summary><b> Click to see more </b></summary>
2. [arXiv:1812.06705](https://arxiv.org/abs/1812.06705), Conditional BERT Contextual Augmentation, Authors: Xing Wu, Shangwen Lv, Liangjun Zang, Jizhong Han, Songlin Hu
3. [arXiv:1812.03593](https://arxiv.org/pdf/1812.03593), SDNet: Contextualized Attention-based Deep Network for Conversational Question Answering, Authors: Chenguang Zhu, Michael Zeng, Xuedong Huang
4. [arXiv:1901.02860](https://arxiv.org/abs/1901.02860), Transformer-XL: Attentive Language Models Beyond a Fixed-Length Context, Authors: Zihang Dai, Zhilin Yang, Yiming Yang, William W. Cohen, Jaime Carbonell, Quoc V. Le and Ruslan Salakhutdinov.
5. [arXiv:1901.04085](https://arxiv.org/pdf/1901.04085.pdf), Passage Re-ranking with BERT, Authors: Rodrigo Nogueira, Kyunghyun Cho
6. [arXiv:1902.02671](https://arxiv.org/pdf/1902.02671.pdf), BERT and PALs: Projected Attention Layers for Efficient Adaptation in Multi-Task Learning, Authors: Asa Cooper Stickland, Iain Murray
7. [arXiv:1904.02232](https://arxiv.org/abs/1904.02232), BERT Post-Training for Review Reading Comprehension and Aspect-based Sentiment Analysis, Authors: Hu Xu, Bing Liu, Lei Shu, Philip S. Yu, [[code](https://github.com/howardhsu/BERT-for-RRC-ABSA)]
</details>
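The papers above revolve around BERT's masked-LM pre-training objective. As a concrete illustration, here is a minimal sketch of the 80/10/10 token-corruption rule described in the BERT paper (arXiv:1810.04805); the function and variable names are ours for illustration, not taken from any listed repository:

```python
import random

def mask_tokens(tokens, vocab, mask_prob=0.15, seed=0):
    """Sketch of BERT's masked-LM corruption: ~15% of positions are
    selected for prediction; of those, 80% become [MASK], 10% are
    replaced by a random vocabulary token, and 10% are left unchanged."""
    rng = random.Random(seed)
    out, labels = list(tokens), [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            labels[i] = tok  # the model must predict the original token here
            r = rng.random()
            if r < 0.8:
                out[i] = "[MASK]"
            elif r < 0.9:
                out[i] = rng.choice(vocab)
            # else: keep the original token (but still predict it)
    return out, labels
```

Keeping 10% of selected tokens unchanged is what forces the encoder to maintain a useful representation for every input position, since it cannot tell which tokens are corrupted.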
# Github Repositories:
## Official implementation:
1. [google-research/bert](https://github.com/google-research/bert), **official** TensorFlow code and pre-trained models for BERT,
![](https://img.shields.io/github/stars/google-research/bert.svg)
## Implementations of BERT in other frameworks:
1. [codertimo/BERT-pytorch](https://github.com/codertimo/BERT-pytorch), Google AI 2018 BERT PyTorch implementation,
![](https://img.shields.io/github/stars/codertimo/BERT-pytorch.svg)
2. [huggingface/pytorch-pretrained-BERT](https://github.com/huggingface/pytorch-pretrained-BERT), A PyTorch implementation of Google AI's BERT model with script to load Google's pre-trained models,
![](https://img.shields.io/github/stars/huggingface/pytorch-pretrained-BERT.svg)
3. [dmlc/gluon-nlp](https://github.com/dmlc/gluon-nlp), Gluon + MXNet implementation that reproduces BERT pretraining and finetuning on GLUE benchmark, SQuAD, etc,
![](https://img.shields.io/github/stars/dmlc/gluon-nlp.svg)
4. [dbiir/UER-py](https://github.com/dbiir/UER-py), UER-py is a toolkit for pre-training on general-domain corpora and fine-tuning on downstream tasks. It maintains model modularity, supports research extensibility, facilitates the use of different pre-training models (e.g. BERT), and provides interfaces for users to extend it further.
![](https://img.shields.io/github/stars/dbiir/UER-py.svg)
5. [BrikerMan/Kashgari](https://github.com/BrikerMan/Kashgari), Simple, Keras-powered multilingual NLP framework, allows you to build your models in 5 minutes for named entity recognition (NER), part-of-speech tagging (PoS) and text classification tasks. Includes BERT, GPT-2 and word2vec embedding.
![](https://img.shields.io/github/stars/BrikerMan/Kashgari.svg)
6. [kaushaltrivedi/fast-bert](https://github.com/kaushaltrivedi/fast-bert), Super-easy library for BERT-based NLP models,
![](https://img.shields.io/github/stars/kaushaltrivedi/fast-bert.svg)
<details>
<summary><b> Click to see more </b></summary>
7. [Separius/BERT-keras](https://github.com/Separius/BERT-keras), Keras implementation of BERT with pre-trained weights,
![](https://img.shields.io/github/stars/Separius/BERT-keras.svg)
8. [soskek/bert-chainer](https://github.com/soskek/bert-chainer), Chainer implementation of "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding",
![](https://img.shields.io/github/stars/soskek/bert-chainer.svg)
9. [innodatalabs/tbert](https://github.com/innodatalabs/tbert), PyTorch port of BERT ML model
![](https://img.shields.io/github/stars/innodatalabs/tbert.svg)
10. [guotong1988/BERT-tensorflow](https://github.com/guotong1988/BERT-tensorflow), BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
![](https://img.shields.io/github/stars/guotong1988/BERT-tensorflow.svg)
11. [dreamgonfly/BERT-pytorch](https://github.com/dreamgonfly/BERT-pytorch),
PyTorch implementation of BERT in "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding"
![](https://img.shields.io/github/stars/dreamgonfly/BERT-pytorch.svg)
12. [CyberZHG/keras-bert](https://github.com/CyberZHG/keras-bert), Implementation of BERT that could load official pre-trained models for feature extraction and prediction
![](https://img.shields.io/github/stars/CyberZHG/keras-bert.svg)
14. [MaZhiyuanBUAA/bert-tf1.4.0](https://github.com/MaZhiyuanBUAA/bert-tf1.4.0), bert-tf1.4.0
![](https://img.shields.io/github/stars/MaZhiyuanBUAA/bert-tf1.4.0.svg)
15. [dhlee347/pytorchic-bert](https://github.com/dhlee347/pytorchic-bert), Pytorch Implementation of Google BERT,
![](https://img.shields.io/github/stars/dhlee347/pytorchic-bert.svg)
16. [kpot/keras-transformer](https://github.com/kpot/keras-transformer), Keras library for building (Universal) Transformers, facilitating BERT and GPT models,
![](https://img.shields.io/github/stars/kpot/keras-transformer.svg)
17. [miroozyx/BERT_with_keras](https://github.com/miroozyx/BERT_with_keras), A Keras version of Google's BERT model,
![](https://img.shields.io/github/stars/miroozyx/BERT_with_keras.svg)
18. [conda-forge/pytorch-pretrained-bert-feedstock](https://github.com/conda-forge/pytorch-pretrained-bert-feedstock), A conda-smithy repository for pytorch-pretrained-bert. ,
![](https://img.shields.io/github/stars/conda-forge/pytorch-pretrained-bert-feedstock.svg)
19. [Rshcaroline/BERT_Pytorch_fastNLP](https://github.com/Rshcaroline/BERT_Pytorch_fastNLP), A PyTorch & fastNLP implementation of Google AI's BERT model.
![](https://img.shields.io/github/stars/Rshcaroline/BERT_Pytorch_fastNLP.svg)
20. [nghuyong/ERNIE-Pytorch](https://github.com/nghuyong/ERNIE-Pytorch), ERNIE Pytorch Version,
![](https://img.shields.io/github/stars/nghuyong/ERNIE-Pytorch.svg)
</details>
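All of the ports listed above re-implement the same core operation, scaled dot-product attention, softmax(QKᵀ/√d_k)·V. A dependency-free Python sketch (our own naming, plain nested lists standing in for tensors; real implementations vectorize this and add multiple heads):

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of scores."""
    m = max(xs)
    es = [math.exp(x - m) for x in xs]
    s = sum(es)
    return [e / s for e in es]

def attention(Q, K, V):
    """Scaled dot-product attention for row-major matrices:
    each query row attends over all key rows and returns a
    weighted average of the corresponding value rows."""
    d_k = len(K[0])
    out = []
    for q in Q:
        # similarity of this query to every key, scaled by sqrt(d_k)
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k) for k in K]
        w = softmax(scores)
        # convex combination of value rows, weighted by attention
        out.append([sum(wi * v[j] for wi, v in zip(w, V)) for j in range(len(V[0]))])
    return out
```

With a zero query the weights are uniform, so the output is just the mean of the value rows; a query strongly aligned with one key recovers (almost exactly) that key's value row.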
## Pretrained BERT weights:
1. [brightmart/roberta_zh](https://github.com/brightmart/roberta_zh), RoBERTa for Chinese, 中文预训练RoBERTa模型,
![](https://img.shields.io/github/stars/brightmart/roberta_zh.svg)
2. [ymcui/Chinese-BERT-wwm](https://github.com/ymcui/Chinese-BERT-wwm), Pre-Training with Whole Word Masking for Chinese BERT(中文BERT-wwm预训练模型) https://arxiv.org/abs/1906.08101,
![](https://img.shields.io/github/stars/ymcui/Chinese-BERT-wwm.svg)
3. [thunlp/OpenCLaP](https://github.com/thunlp/OpenCLaP), Open Chinese Language Pre-trained Model Zoo (OpenCLaP: 多领域开源中文预训练语言模型仓库),
![](https://img.shields.io/github/stars/thunlp/OpenCLaP.svg)
4. [ymcui/Chinese-PreTrained-XLNet](https://github.com/ymcui/Chinese-PreTrained-XLNet), Pre-Trained Chinese XLNet(中文XLNet预训练模型),
![](https://img.shields.io/github/stars/ymcui/Chinese-PreTrained-XLNet.svg)
5. [brightmart/xlnet_zh](https://github.com/brightmart/xlnet_zh), 中文预训练XLNet模型: Pre-Trained Chinese XLNet_Large,
![](https://img.shields.io/github/stars/brightmart/xlnet_zh.svg)
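Each pretrained checkpoint above ships with a vocabulary file that must match the tokenizer. BERT-style models use WordPiece tokenization; below is a simplified sketch of its greedy longest-match rule (our own code, not the reference implementation, which also handles max word length, unicode cleanup, and basic pre-tokenization):

```python
def wordpiece(word, vocab, unk="[UNK]"):
    """Greedy longest-match WordPiece tokenization of a single word.
    Subword continuations carry a '##' prefix, as in BERT vocab files;
    if no prefix of the remainder is in the vocab, the whole word maps
    to the unknown token."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        cur = None
        # try the longest remaining substring first, then shrink
        while start < end:
            sub = word[start:end]
            if start > 0:
                sub = "##" + sub  # mark as a continuation piece
            if sub in vocab:
                cur = sub
                break
            end -= 1
        if cur is None:
            return [unk]
        pieces.append(cur)
        start = end
    return pieces
```

For example, with a vocabulary containing `un`, `##aff`, and `##able`, the word `unaffable` splits into `["un", "##aff", "##able"]` (the example used in the official BERT repository).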
## Improvements over BERT:
1. [thunlp/ERNIE](https://github.com/thunlp/ERNIE), source code for the ACL 2019 paper "ERNIE: Enhanced Language Representation with Informative Entities",
![](https://img.shields.io/github/stars/thunlp/ERNIE.svg)