<div align="center"><img src="assets/logo.png" width="350"></div>
<img src="assets/demo.png" >
## Introduction
YOLOX is an anchor-free version of YOLO, with a simpler design but better performance! It aims to bridge the gap between research and industrial communities.
For more details, please refer to our [report on Arxiv](https://arxiv.org/abs/2107.08430).
This repo is a PyTorch implementation of YOLOX; there is also a [MegEngine implementation](https://github.com/MegEngine/YOLOX).
<img src="assets/git_fig.png" width="1000" >
## Updates!!
* 【2023/02/28】 We support an assignment visualization tool; see the doc [here](./docs/assignment_visualization.md).
* 【2022/04/14】 We support JIT compilation of ops.
* 【2021/08/19】 We optimize the training process with **2x** faster training and **~1%** higher performance! See [notes](docs/updates_note.md) for more details.
* 【2021/08/05】 We release [MegEngine version YOLOX](https://github.com/MegEngine/YOLOX).
* 【2021/07/28】 We fix a fatal [memory leak](https://github.com/Megvii-BaseDetection/YOLOX/issues/103).
* 【2021/07/26】 We now support [MegEngine](https://github.com/Megvii-BaseDetection/YOLOX/tree/main/demo/MegEngine) deployment.
* 【2021/07/20】 We have released our technical report on [Arxiv](https://arxiv.org/abs/2107.08430).
## Coming soon
- [ ] YOLOX-P6 and larger models.
- [ ] Objects365 pretraining.
- [ ] Transformer modules.
- [ ] More requested features.
## Benchmark
#### Standard Models.
|Model |size |mAP<sup>val<br>0.5:0.95 |mAP<sup>test<br>0.5:0.95 | Speed V100<br>(ms) | Params<br>(M) |FLOPs<br>(G)| weights |
| ------ |:---: | :---: | :---: |:---: |:---: | :---: | :----: |
|[YOLOX-s](./exps/default/yolox_s.py) |640 |40.5 |40.5 |9.8 |9.0 | 26.8 | [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_s.pth) |
|[YOLOX-m](./exps/default/yolox_m.py) |640 |46.9 |47.2 |12.3 |25.3 |73.8| [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_m.pth) |
|[YOLOX-l](./exps/default/yolox_l.py) |640 |49.7 |50.1 |14.5 |54.2| 155.6 | [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_l.pth) |
|[YOLOX-x](./exps/default/yolox_x.py) |640 |51.1 |**51.5** | 17.3 |99.1 |281.9 | [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_x.pth) |
|[YOLOX-Darknet53](./exps/default/yolov3.py) |640 | 47.7 | 48.0 | 11.1 |63.7 | 185.3 | [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_darknet.pth) |
<details>
<summary>Legacy models</summary>
|Model |size |mAP<sup>test<br>0.5:0.95 | Speed V100<br>(ms) | Params<br>(M) |FLOPs<br>(G)| weights |
| ------ |:---: | :---: |:---: |:---: | :---: | :----: |
|[YOLOX-s](./exps/default/yolox_s.py) |640 |39.6 |9.8 |9.0 | 26.8 | [onedrive](https://megvii-my.sharepoint.cn/:u:/g/personal/gezheng_megvii_com/EW62gmO2vnNNs5npxjzunVwB9p307qqygaCkXdTO88BLUg?e=NMTQYw)/[github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_s.pth) |
|[YOLOX-m](./exps/default/yolox_m.py) |640 |46.4 |12.3 |25.3 |73.8| [onedrive](https://megvii-my.sharepoint.cn/:u:/g/personal/gezheng_megvii_com/ERMTP7VFqrVBrXKMU7Vl4TcBQs0SUeCT7kvc-JdIbej4tQ?e=1MDo9y)/[github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_m.pth) |
|[YOLOX-l](./exps/default/yolox_l.py) |640 |50.0 |14.5 |54.2| 155.6 | [onedrive](https://megvii-my.sharepoint.cn/:u:/g/personal/gezheng_megvii_com/EWA8w_IEOzBKvuueBqfaZh0BeoG5sVzR-XYbOJO4YlOkRw?e=wHWOBE)/[github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_l.pth) |
|[YOLOX-x](./exps/default/yolox_x.py) |640 |**51.2** | 17.3 |99.1 |281.9 | [onedrive](https://megvii-my.sharepoint.cn/:u:/g/personal/gezheng_megvii_com/EdgVPHBziOVBtGAXHfeHI5kBza0q9yyueMGdT0wXZfI1rQ?e=tABO5u)/[github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_x.pth) |
|[YOLOX-Darknet53](./exps/default/yolov3.py) |640 | 47.4 | 11.1 |63.7 | 185.3 | [onedrive](https://megvii-my.sharepoint.cn/:u:/g/personal/gezheng_megvii_com/EZ-MV1r_fMFPkPrNjvbJEMoBLOLAnXH-XKEB77w8LhXL6Q?e=mf6wOc)/[github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_darknet53.pth) |
</details>
#### Light Models.
|Model |size |mAP<sup>val<br>0.5:0.95 | Params<br>(M) |FLOPs<br>(G)| weights |
| ------ |:---: | :---: |:---: |:---: | :---: |
|[YOLOX-Nano](./exps/default/yolox_nano.py) |416 |25.8 | 0.91 |1.08 | [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_nano.pth) |
|[YOLOX-Tiny](./exps/default/yolox_tiny.py) |416 |32.8 | 5.06 |6.45 | [github](https://github.com/Megvii-BaseDetection/YOLOX/releases/download/0.1.1rc0/yolox_tiny.pth) |
<details>
<summary>Legacy models</summary>
|Model |size |mAP<sup>val<br>0.5:0.95 | Params<br>(M) |FLOPs<br>(G)| weights |
| ------ |:---: | :---: |:---: |:---: | :---: |
|[YOLOX-Nano](./exps/default/yolox_nano.py) |416 |25.3 | 0.91 |1.08 | [github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_nano.pth) |
|[YOLOX-Tiny](./exps/default/yolox_tiny.py) |416 |32.8 | 5.06 |6.45 | [github](https://github.com/Megvii-BaseDetection/storage/releases/download/0.0.1/yolox_tiny_32dot8.pth) |
</details>
## Quick Start
<details>
<summary>Installation</summary>
Step1. Install YOLOX from source.
```shell
git clone git@github.com:Megvii-BaseDetection/YOLOX.git
cd YOLOX
pip3 install -v -e . # or python3 setup.py develop
```
</details>
<details>
<summary>Demo</summary>
Step1. Download a pretrained model from the benchmark table.
Step2. Use either -n or -f to specify your detector's config. For example:
```shell
python tools/demo.py image -n yolox-s -c /path/to/your/yolox_s.pth --path assets/dog.jpg --conf 0.25 --nms 0.45 --tsize 640 --save_result --device [cpu/gpu]
```
or
```shell
python tools/demo.py image -f exps/default/yolox_s.py -c /path/to/your/yolox_s.pth --path assets/dog.jpg --conf 0.25 --nms 0.45 --tsize 640 --save_result --device [cpu/gpu]
```
Demo for video:
```shell
python tools/demo.py video -n yolox-s -c /path/to/your/yolox_s.pth --path /path/to/your/video --conf 0.25 --nms 0.45 --tsize 640 --save_result --device [cpu/gpu]
```
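The `--conf` and `--nms` flags above set the score threshold and the NMS IoU threshold used in post-processing. As a rough illustration, here is a plain-Python sketch of generic confidence filtering plus greedy NMS; the helper names are hypothetical and this is not YOLOX's actual implementation:

```python
def iou(a, b):
    """IoU of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def postprocess(dets, conf_thr=0.25, nms_thr=0.45):
    """dets: list of (box, score). Drop low scores, then greedy NMS."""
    dets = [d for d in dets if d[1] >= conf_thr]      # --conf: score filter
    dets.sort(key=lambda d: d[1], reverse=True)
    keep = []
    for box, score in dets:
        # --nms: suppress boxes overlapping an already-kept box too much
        if all(iou(box, k[0]) <= nms_thr for k in keep):
            keep.append((box, score))
    return keep
```

Raising `--conf` trades recall for fewer false positives; lowering `--nms` suppresses more overlapping duplicates.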
</details>
<details>
<summary>Reproduce our results on COCO</summary>
Step1. Prepare COCO dataset
```shell
cd <YOLOX_HOME>
ln -s /path/to/your/COCO ./datasets/COCO
```
Step2. Reproduce our results on COCO by specifying -n:
```shell
python -m yolox.tools.train -n yolox-s -d 8 -b 64 --fp16 -o [--cache]
yolox-m
yolox-l
yolox-x
```
* -d: number of GPU devices
* -b: total batch size; the recommended value for -b is num-gpu * 8
* --fp16: mixed precision training
* --cache: cache images into RAM to accelerate training (requires a large amount of system RAM)
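The batch-size rule of thumb above (8 images per GPU) can be sketched as a tiny helper; this function is hypothetical, purely for illustration:

```python
def recommended_batch_size(num_gpus, imgs_per_gpu=8):
    """README rule of thumb: -b should be num-gpu * 8."""
    return num_gpus * imgs_per_gpu
```

For example, `-d 8` gives a recommended `-b 64`, matching the command above.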
When using -f, the above commands are equivalent to:
```shell
python -m yolox.tools.train -f exps/default/yolox_s.py -d 8 -b 64 --fp16 -o [--cache]
exps/default/yolox_m.py
exps/default/yolox_l.py
exps/default/yolox_x.py
```
**Multi Machine Training**
We also support multi-node training. Just add the following args:
* --num\_machines: total number of training nodes
* --machine\_rank: the rank of each node
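A common convention in multi-node data-parallel setups (not necessarily YOLOX's exact internals) is to derive each process's global rank from the node rank and the local GPU index; a minimal sketch:

```python
def world_size(num_machines, gpus_per_machine):
    """Total number of training processes across all nodes."""
    return num_machines * gpus_per_machine

def global_rank(machine_rank, local_rank, gpus_per_machine):
    """Each node contributes gpus_per_machine consecutive ranks."""
    return machine_rank * gpus_per_machine + local_rank
```

With 2 machines of 8 GPUs each, the world size is 16, and GPU 3 on the second machine (machine_rank 1) gets global rank 11.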
Suppose you want to train YOLOX on 2 machines, your master machine's IP is 123.123.123.123, and it uses TCP port 12312.
On master machine, run
```shell
python tools/train.py -n yolox-s -b 128 --dist-url tcp://123.123.123.123:12312 --num_machines 2 --machine_rank 0
```
On the second machine, run
```shell
python tools/train.py -n yolox-s -b 128 --dist-url tcp://123.123.123.123:12312 --num_machines 2 --machine_rank 1
```
</details>