# InternLM
<div align="center">
<img src="./doc/imgs/logo.svg" width="200"/>
<div>&nbsp;</div>
<div align="center">
<b><font size="5">InternLM</font></b>
<sup>
<a href="https://internlm.intern-ai.org.cn/">
<i><font size="4">HOT</font></i>
</a>
</sup>
<div>&nbsp;</div>
</div>
[![license](./doc/imgs/license.svg)](./LICENSE)
[![evaluation](./doc/imgs/compass_support.svg)](https://github.com/internLM/OpenCompass/)
[![Documentation Status](https://readthedocs.org/projects/internlm/badge/?version=latest)](https://internlm.readthedocs.io/zh_CN/latest/?badge=latest)
[📘Usage](./doc/en/usage.md) |
[🛠️Installation](./doc/en/install.md) |
[📊Train Performance](./doc/en/train_performance.md) |
[👀Model](#model-zoo) |
[🤗HuggingFace](https://huggingface.co/internlm) |
[🆕Update News](./CHANGE_LOG.md) |
[🤔Reporting Issues](https://github.com/InternLM/InternLM/issues/new)
[English](./README.md) |
[简体中文](./README-zh-Hans.md) |
[日本語](./README-ja-JP.md)
</div>
<p align="center">
👋 join us on <a href="https://discord.gg/xa29JuW87d" target="_blank">Discord</a> and <a href="https://github.com/InternLM/InternLM/assets/25839884/a6aad896-7232-4220-ac84-9e070c2633ce" target="_blank">WeChat</a>
</p>
## Introduction
InternLM is an open-source, lightweight training framework that aims to support model pre-training without requiring extensive dependencies. With a single codebase, it supports pre-training on large-scale clusters with thousands of GPUs as well as fine-tuning on a single GPU, while achieving remarkable performance optimizations. InternLM achieves nearly 90% acceleration efficiency when training on 1024 GPUs.
Based on the InternLM training framework, we have released two open-source pretrained models: InternLM-7B and InternLM-20B.
## News
[20230920] InternLM-20B is released with base and chat versions.
[20230822] InternLM-7B-Chat v1.1 is released with code interpreter and function calling capability. You can try it with [Lagent](https://github.com/InternLM/lagent).
## Model Zoo
Our models are released on three platforms: Hugging Face (Transformers), ModelScope, and OpenXLab. A minimal loading sketch is shown after the table below.
| Model | Transformers | ModelScope | OpenXLab | Release Date |
|---------------------------|------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------|
| **InternLM Chat 20B** | [🤗internlm/internlm-chat-20b](https://huggingface.co/internlm/internlm-20b-chat) | [<img src="./doc/imgs/modelscope_logo.png" width="20px" /> Shanghai_AI_Laboratory/internlm-chat-20b](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm-20b-chat/summary) | [![Open in OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/OpenLMLab/InternLM-chat-20b) | 2023-09-20 |
| **InternLM 20B** | [🤗internlm/internlm-20b](https://huggingface.co/internlm/internlm-20b) | [<img src="./doc/imgs/modelscope_logo.png" width="20px" /> Shanghai_AI_Laboratory/internlm-20b](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm-20b/summary) | [![Open in OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/OpenLMLab/InternLM-20b) | 2023-09-20 |
| **InternLM Chat 7B v1.1** | [🤗internlm/internlm-chat-7b-v1.1](https://huggingface.co/internlm/internlm-chat-7b-v1.1) | [<img src="./doc/imgs/modelscope_logo.png" width="20px" /> Shanghai_AI_Laboratory/internlm-chat-7b-v1_1](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm-chat-7b-v1_1/summary) | [![Open in OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/OpenLMLab/InternLM-chat-7b-v1.1) | 2023-08-22 |
| **InternLM 7B** | [🤗internlm/internlm-7b](https://huggingface.co/internlm/internlm-7b) | [<img src="./doc/imgs/modelscope_logo.png" width="20px" /> Shanghai_AI_Laboratory/internlm-7b](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm-7b/summary) | [![Open in OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/OpenLMLab/InternLM-7b) | 2023-07-06 |
| **InternLM Chat 7B** | [🤗internlm/internlm-chat-7b](https://huggingface.co/internlm/internlm-chat-7b) | [<img src="./doc/imgs/modelscope_logo.png" width="20px" /> Shanghai_AI_Laboratory/internlm-chat-7b](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm-chat-7b/summary) | [![Open in OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/OpenLMLab/InternLM-chat-7b) | 2023-07-06 |
| **InternLM Chat 7B 8k** | [🤗internlm/internlm-chat-7b-8k](https://huggingface.co/internlm/internlm-chat-7b-8k) | [<img src="./doc/imgs/modelscope_logo.png" width="20px" /> Shanghai_AI_Laboratory/internlm-chat-7b-8k](https://modelscope.cn/models/Shanghai_AI_Laboratory/internlm-chat-7b-8k/summary) | [![Open in OpenXLab](https://cdn-static.openxlab.org.cn/header/openxlab_models.svg)](https://openxlab.org.cn/models/detail/OpenLMLab/InternLM-chat-7b-8k) | 2023-07-06 |
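
For quick experimentation, the released weights can be loaded through the Hugging Face `transformers` library. The snippet below is a minimal sketch, assuming the `internlm/internlm-chat-7b` repository from the table above and its custom remote code (hence `trust_remote_code=True`); the `chat()` helper is provided by that remote code, so treat its exact signature as illustrative and check the model card for the current interface.

```python
# Minimal sketch: load InternLM Chat 7B from the Hugging Face Hub.
# Assumes `transformers` and `torch` are installed and a CUDA GPU is available.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "internlm/internlm-chat-7b"
tokenizer = AutoTokenizer.from_pretrained(model_name, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.float16,   # half precision so the 7B model fits on a single GPU
    trust_remote_code=True,
).cuda()
model.eval()

# The model's remote code exposes a chat() convenience method for multi-turn dialogue
# (signature assumed here; it may change between releases).
response, history = model.chat(tokenizer, "Hello! Please introduce yourself.", history=[])
print(response)
```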
#### Introduction
InternLM-20B was pre-trained on over **2.3T** tokens of high-quality English, Chinese, and code data. Additionally, the Chat version has undergone SFT and RLHF training, enabling it to better and more safely meet users' needs.
In terms of model structure, InternLM-20B opted for a deeper architecture, with a depth of 60 layers. This surpasses the conventional 7B and 13B models, which use 32 or 40 layers. When the parameter budget is limited, increasing the number of layers can enhance the model's overall capability. Furthermore, compared to InternLM-7B, the pre-training data used for InternLM-20B underwent higher-quality cleansing and was supplemented with data rich in knowledge and designed to reinforce understanding and reasoning capabilities. As a result, it exhibits significant improvements in understanding, reasoning, mathematical, and programming abilities, all of which test the technical proficiency of language models. Overall, InternLM-20B has the following characteristics (a minimal loading sketch follows the list below):
- Outstanding overall performance
- Strong utility invocation capability
- Supports a 16k context length (through inference extrapolation)
- Better value alignment
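
Since the 20B chat model is also hosted on ModelScope, the referenced sketch below shows one way to fetch the weights there and load them with `transformers`. The model ID is taken from the Model Zoo table above and the `chat()` call is assumed to mirror the 7B example; both are illustrative rather than a confirmed API contract.

```python
# Minimal sketch: download InternLM Chat 20B via ModelScope, then load it with transformers.
# Assumes `modelscope`, `transformers`, `accelerate`, and `torch` are installed.
import torch
from modelscope import snapshot_download
from transformers import AutoModelForCausalLM, AutoTokenizer

# Download (or reuse a cached copy of) the model repository and get its local path.
model_dir = snapshot_download("Shanghai_AI_Laboratory/internlm-chat-20b")

tokenizer = AutoTokenizer.from_pretrained(model_dir, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_dir,
    torch_dtype=torch.bfloat16,   # 20B parameters: use bf16 to roughly halve memory
    device_map="auto",            # spread layers across the available GPUs
    trust_remote_code=True,
)
model.eval()

# chat() is provided by the model's remote code; signature assumed, check the model card.
response, history = model.chat(tokenizer, "Summarize the design choices behind InternLM-20B.", history=[])
print(response)
```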
#### Performance Evaluation
On the 5 capability dimensions proposed by OpenCompass, InternLM-20B has achieved excellent results (the bolded scores represent the best performances within the 13B-33B parameter range).
| Capability | Llama-13B | Llama2-13B | Baichuan2-13B | InternLM-20B | Llama-33B | Llama-65B | Llama2-70B |
|----------|-----------|------------|---------------|--------------|-----------|-----------|------------|
| Language | 42.5 | 47 | 47.5 | **55** | 44.6 | 47.1 | 51.6 |
| Knowledge | 58.2 | 58.3 | 48.9 | 60.1 | **64** | 66 | 67.7 |
| Understanding | 45.5 | 50.9 | 58.1 | **67.3** | 50.6 | 54.2 | 60.8 |
| Reasoning | 42.7 | 43.6 | 44.2 | **54.9** | 46.4 | 49.8 | 55 |
| Examination | 37.3 | 45.2 | 51.8 | | | | |