# Deep Robotic Grasping using Generative Residual Convolutional Neural Network

A convolutional neural network (GR-ConvNet) that generates robust antipodal grasps from RGB-D input images in real time.

## Setup

Clone this repository and install the required libraries:

```shell
git clone https://github.com/Loahit5101/GR-ConvNet-grasping.git
cd GR-ConvNet-grasping
pip install -r requirements.txt
```

## Architecture

*(Figure: GR-ConvNet architecture diagram)*

## Dataset

Download and extract the Cornell Grasping Dataset, then generate depth images by running:

```shell
python -m utils.dataset_processing.generate_cornell_depth <Path To Dataset>
```
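The Cornell dataset annotates each grasp as a rectangle given by four image-plane corners. A minimal sketch of converting one such rectangle to a (center, angle, width) grasp, assuming the first edge runs along the gripper-closing axis (the repo's loader may use a different corner convention; `rect_to_grasp` is an illustrative name, not the repo's API):

```python
import numpy as np

def rect_to_grasp(corners):
    """Convert a 4x2 array of rectangle corners (x, y) to (center, angle, width).

    Sketch only: corner ordering conventions vary between loaders, so the
    edge chosen as the gripper axis below is an assumption.
    """
    corners = np.asarray(corners, dtype=float)
    center = corners.mean(axis=0)
    # Assume corners[0] -> corners[1] runs along the gripper axis.
    dx, dy = corners[1] - corners[0]
    angle = np.arctan2(dy, dx) % np.pi               # orientation in [0, pi)
    width = np.linalg.norm(corners[2] - corners[1])  # jaw opening
    return center, angle, width
```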

## Pretrained models

Trained models are available here

## Training

```shell
python train.py
```
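Training supervises per-pixel grasp quality, angle, and width maps; at test time a grasp is read off at the quality maximum. A minimal sketch of that decoding step (the released GR-ConvNet code may represent the angle via sin/cos output maps and smooth the quality map first; `decode_grasp` is an illustrative name):

```python
import numpy as np

def decode_grasp(q_map, ang_map, width_map):
    """Pick the best grasp from GR-ConvNet-style output maps.

    The pixel with the highest predicted grasp quality gives the grasp
    center; the angle and width are read from the other maps at that pixel.
    """
    idx = np.unravel_index(np.argmax(q_map), q_map.shape)
    return idx, ang_map[idx], width_map[idx]
```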

## Testing

```shell
python test.py
```

## Model Optimization using TensorRT

- Post-training quantization:

  ```shell
  python ptq.py
  ```

- Benchmark grasp inference time of the optimized and unoptimized models:

  ```shell
  python trt_benchmark.py
  ```

## Results


### Average Grasp Inference Time

| Model    | Time (ms) | Accuracy (%) |
|----------|-----------|--------------|
| Baseline | 4.59      | 95.5         |
| FP-32    | 3.71      | 94.2         |
| FP-16    | 1.45      | 93.16        |

Post-processing takes 14 ms on average.

## References

1. Antipodal Robotic Grasping using GR-ConvNet
2. PyTorch-TensorRT tutorials
