# Antipodal Robotic Grasping
We present a novel generative residual convolutional neural network based model architecture that detects objects in the camera's field of view and predicts suitable antipodal grasp configurations for the objects in the image.
This repository contains the implementation of the Generative Residual Convolutional Neural Network (GR-ConvNet) from the paper:
#### Antipodal Robotic Grasping using Generative Residual Convolutional Neural Network
Sulabh Kumra, Shirin Joshi, Ferat Sahin
[arxiv](https://arxiv.org/abs/1909.04810) | [video](https://youtu.be/cwlEhdoxY4U)
[![PWC](https://img.shields.io/endpoint.svg?url=https://paperswithcode.com/badge/antipodal-robotic-grasping-using-generative/robotic-grasping-on-cornell-grasp-dataset)](https://paperswithcode.com/sota/robotic-grasping-on-cornell-grasp-dataset?p=antipodal-robotic-grasping-using-generative)
If you use this project in your research or wish to refer to the baseline results published in the paper, please use the following BibTeX entry:
```bibtex
@inproceedings{kumra2020antipodal,
  author={Kumra, Sulabh and Joshi, Shirin and Sahin, Ferat},
  booktitle={2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)},
  title={Antipodal Robotic Grasping using Generative Residual Convolutional Neural Network},
  year={2020},
  pages={9626-9633},
  doi={10.1109/IROS45743.2020.9340777}
}
```
## Requirements
- numpy
- opencv-python
- matplotlib
- scikit-image
- imageio
- torch
- torchvision
- torchsummary
- tensorboardX
- pyrealsense2
- Pillow
## Installation
- Check out the robotic grasping package
```bash
$ git clone https://github.com/skumra/robotic-grasping.git
```
- Create a virtual environment
```bash
$ python3.6 -m venv --system-site-packages venv
```
- Source the virtual environment
```bash
$ source venv/bin/activate
```
- Install the requirements
```bash
$ cd robotic-grasping
$ pip install -r requirements.txt
```
## Datasets
This repository supports both the [Cornell Grasping Dataset](https://www.kaggle.com/oneoneliu/cornell-grasp) and
[Jacquard Dataset](https://jacquard.liris.cnrs.fr/).
#### Cornell Grasping Dataset
1. Download and extract the [Cornell Grasping Dataset](https://www.kaggle.com/oneoneliu/cornell-grasp).
2. Convert the PCD files to depth images by running `python -m utils.dataset_processing.generate_cornell_depth <Path To Dataset>`
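The idea behind that conversion can be sketched in a few lines. Cornell PCD files are ASCII point clouds whose `index` field encodes the pixel position (`index = row * 640 + col` for the 640×480 images). The sketch below is illustrative, not the repository's converter (which lives in `utils/dataset_processing/generate_cornell_depth.py`); the function name `pcd_to_depth` is hypothetical, and using `z` directly as the depth value is a simplifying assumption.

```python
import numpy as np

def pcd_to_depth(pcd_path, shape=(480, 640)):
    """Sketch: convert a Cornell-style ASCII PCD file to a depth image.

    Assumes one 'x y z rgb index' line per point after the DATA header line,
    with index = row * 640 + col. Pixels with no point stay 0; z is used as
    the depth value here for simplicity.
    """
    depth = np.zeros(shape, dtype=np.float32)
    with open(pcd_path) as f:
        in_data = False
        for line in f:
            if in_data:
                x, y, z, rgb, index = line.split()
                row, col = divmod(int(float(index)), shape[1])
                depth[row, col] = float(z)
            elif line.startswith('DATA'):  # header ends at the DATA line
                in_data = True
    return depth
```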
#### Jacquard Dataset
1. Download and extract the [Jacquard Dataset](https://jacquard.liris.cnrs.fr/).
## Model Training
A model can be trained using the `train_network.py` script. Run `train_network.py --help` to see a full list of options.
Example for Cornell dataset:
```bash
python train_network.py --dataset cornell --dataset-path <Path To Dataset> --description training_cornell
```
Example for Jacquard dataset:
```bash
python train_network.py --dataset jacquard --dataset-path <Path To Dataset> --description training_jacquard --use-dropout 0 --input-size 300
```
## Model Evaluation
The trained network can be evaluated using the `evaluate.py` script. Run `evaluate.py --help` for a full set of options.
Example for Cornell dataset:
```bash
python evaluate.py --network <Path to Trained Network> --dataset cornell --dataset-path <Path to Dataset> --iou-eval
```
Example for Jacquard dataset:
```bash
python evaluate.py --network <Path to Trained Network> --dataset jacquard --dataset-path <Path to Dataset> --iou-eval --use-dropout 0 --input-size 300
```
## Run Tasks
A task can be executed using the relevant run script. All task scripts are named `run_<task name>.py`. For example, to run the grasp generator:
```bash
python run_grasp_generator.py
```
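Under the hood, the network emits per-pixel output maps — grasp quality, cos(2θ), sin(2θ), and gripper width — which the generator decodes into a discrete grasp pose. The sketch below is a simplified illustration of that decoding step (the repository's actual post-processing is in `inference/post_process.py` and `inference/grasp_generator.py`); the function name `decode_grasp` is hypothetical.

```python
import numpy as np

def decode_grasp(q_img, cos_img, sin_img, width_img):
    """Sketch: pick one grasp from the four network output maps.

    Recovers the angle map from the (cos 2θ, sin 2θ) maps, then reads the
    angle and width at the pixel with the highest grasp quality.
    """
    ang_img = 0.5 * np.arctan2(sin_img, cos_img)  # θ from (cos 2θ, sin 2θ)
    row, col = np.unravel_index(np.argmax(q_img), q_img.shape)
    return (row, col), float(ang_img[row, col]), float(width_img[row, col])
```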
## Run on a Robot
To run the grasp generator with a robot, please use our ROS implementation for the Baxter robot, available at: https://github.com/skumra/baxter-pnp