# Deep-Stream-ONNX
## Setup
### Step 1: Setting up Jetson Nano and DeepStream.
- Set up your Jetson Nano and install the DeepStream SDK.
### Step 2: Download the project.
- Clone or download this repository onto your Jetson Nano.
### Step 3: Download the Tiny YOLOv2 ONNX model.
- Download the Tiny YOLOv2 ONNX model from the [ONNX Model Zoo](https://github.com/onnx/models). We used this [model](https://onnxzoo.blob.core.windows.net/models/opset_8/tiny_yolov2/tiny_yolov2.tar.gz) in our experiments.
### Step 4: Compiling the custom bounding box parser.
- A custom bounding box parser function is written in `nvdsparsebbox_tiny_yolo.cpp` inside the `custom_bbox_parser` folder.
- A `Makefile`, available inside the same folder, compiles the custom bounding box parsing function into a shared library (`.so`) file.
- You may need to set the following variables in the `Makefile` before compiling:
```makefile
# Set the CUDA version.
CUDA_VER:=10
# Name of the file with the custom bounding box parser function.
SRCFILES:=nvdsparsebbox_tiny_yolo.cpp
# Name of the shared library file to be created after compilation.
TARGET_LIB:=libnvdsinfer_custom_bbox_tiny_yolo.so
# Path to the DeepStream SDK. REPLACE /path/to with the location in your Jetson Nano.
DEEPSTREAM_PATH:=/path/to/deepstream_sdk_v4.0_jetson
```
> Note: If you have not modified the code and followed the blog to set up the Jetson Nano and DeepStream, only the **DEEPSTREAM_PATH** variable needs to be set before compilation; the other three variables can keep their default values.
- Once the variables are set, save the `Makefile`. Compile the custom bounding box parsing function using: `make -C custom_bbox_parser`.
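For reference, the core decoding that a Tiny YOLOv2 parser like `nvdsparsebbox_tiny_yolo.cpp` must perform can be sketched as follows. This is a simplified, self-contained illustration with a stand-in struct (the real parser uses the `NvDsInfer*` types from DeepStream's `nvdsinfer_custom_impl.h`, and the function below is not the actual code from this repository); the anchor values are the standard Tiny YOLOv2 (VOC) anchors.

```cpp
#include <cassert>
#include <cmath>

// Stand-in for DeepStream's NvDsInferObjectDetectionInfo.
struct DetectionInfo {
    unsigned int classId;
    float left, top, width, height;
    float confidence;
};

static float sigmoidf(float x) { return 1.0f / (1.0f + std::exp(-x)); }

// Decode one anchor of one 13x13 grid cell from Tiny YOLOv2's output tensor.
// `p` points at 25 floats: [tx, ty, tw, th, to, 20 VOC class scores].
DetectionInfo decodeCell(const float* p, int col, int row, int anchorIdx,
                         int netW, int netH) {
    // Standard Tiny YOLOv2 (VOC) anchor boxes, in grid-cell units.
    static const float anchors[5][2] = {
        {1.08f, 1.19f}, {3.42f, 4.41f}, {6.63f, 11.38f},
        {9.42f, 5.11f}, {16.62f, 10.52f}};
    const float cellW = netW / 13.0f;
    const float cellH = netH / 13.0f;

    float cx = (col + sigmoidf(p[0])) * cellW;                 // box center x
    float cy = (row + sigmoidf(p[1])) * cellH;                 // box center y
    float w  = std::exp(p[2]) * anchors[anchorIdx][0] * cellW; // box width
    float h  = std::exp(p[3]) * anchors[anchorIdx][1] * cellH; // box height

    // Pick the highest-scoring of the 20 VOC classes.
    unsigned best = 0;
    for (unsigned c = 1; c < 20; ++c)
        if (p[5 + c] > p[5 + best]) best = c;

    DetectionInfo d;
    d.classId    = best;
    d.confidence = sigmoidf(p[4]); // objectness; a full parser would also
                                   // fold in the softmaxed class probability
    d.left   = cx - w / 2.0f;
    d.top    = cy - h / 2.0f;
    d.width  = w;
    d.height = h;
    return d;
}
```

The actual parser iterates this decoding over all 13x13 cells and 5 anchors, filters by a confidence threshold, and fills the object list that DeepStream hands to the tracker and on-screen display.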
### Step 5: Launching DeepStream.
- Download the `sample.tar.gz` from this [drive link](https://drive.google.com/open?id=1kZERLw2y9ig9nVwvTPrFOrI5VOTri3d7). Extract the `vids` directory into the `Deep-Stream-ONNX` directory.
- You can launch DeepStream using the following command:
```bash
deepstream-app -c ./config/deepstream_app_custom_yolo.txt
```
- You can edit the config files inside the `config` directory to alter various settings. Refer to the [blog](https://towardsdatascience.com/how-to-deploy-onnx-models-on-nvidia-jetson-nano-using-deepstream-b2872b99a031) for resources on understanding the various properties inside the config files.
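As an illustration of how the pieces fit together, the inference config (`config_infer_custom_yolo.txt`) is what ties the ONNX model, the label file, and the compiled parser library to the pipeline. The snippet below is a sketch using standard `nvinfer` property names; the exact paths, values, and parser function name are assumptions and may differ from the shipped config:

```ini
[property]
# Path to the Tiny YOLOv2 ONNX model (TensorRT builds an engine from it).
onnx-file=../tiny_yolov2/Model.onnx
# Labels for the 20 VOC classes.
labelfile-path=../custom_bbox_parser/labels.txt
# Compiled custom parser library and its exported parse function
# (function name here is hypothetical).
custom-lib-path=../custom_bbox_parser/libnvdsinfer_custom_bbox_tiny_yolo.so
parse-bbox-func-name=NvDsInferParseCustomYoloV2Tiny
```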