Evaluating Gaussian Grasp Maps for Generative Grasping Models

Prew, W.; Breckon, T.P.; Bordewich, M.J.R.; Beierholm, U.

Authors

William Prew william.t.prew@durham.ac.uk
PGR Student, Doctor of Philosophy

Professor Toby Breckon toby.breckon@durham.ac.uk
Professor

Professor Magnus Bordewich m.j.r.bordewich@durham.ac.uk
Professor

Dr Ulrik Beierholm ulrik.beierholm@durham.ac.uk
Associate Professor
Abstract
Generalising robotic grasping to previously unseen objects is a key task in general robotic manipulation. The current method for training many antipodal generative grasping models relies on a binary ground truth grasp map generated from the centre thirds of correctly labelled grasp rectangles. However, these binary maps do not accurately reflect the positions at which a robotic arm can correctly grasp a given object. We propose a continuous Gaussian representation of annotated grasps to generate ground truth training data, which achieves a higher success rate on a simulated robotic grasping benchmark. Three modern generative grasping networks are trained with either binary or Gaussian grasp maps, along with recent advancements from the robotic grasping literature, such as discretisation of grasp angles into bins and an attentional loss function. Despite negligible difference according to the standard rectangle metric, Gaussian maps better reproduce the training data and therefore improve success rates when tested on the same simulated robot arm by avoiding collisions with the object, achieving 87.94% accuracy. Furthermore, the best performing model is shown to operate with a high success rate when transferred to a real robotic arm, at high inference speeds, without the need for transfer learning. The system is then shown to be capable of performing grasps on an antagonistic physical object dataset benchmark.
Citation
Prew, W., Breckon, T., Bordewich, M., & Beierholm, U. (2022, July). Evaluating Gaussian Grasp Maps for Generative Grasping Models. Presented at Proc. Int. Joint Conf. Neural Networks, Padova, Italy
| Presentation Conference Type | Conference Paper (published) |
| --- | --- |
| Conference Name | Proc. Int. Joint Conf. Neural Networks |
| Start Date | Jul 18, 2022 |
| End Date | Jul 23, 2022 |
| Acceptance Date | Apr 26, 2022 |
| Online Publication Date | Jul 18, 2022 |
| Publication Date | 2022-07 |
| Deposit Date | May 31, 2022 |
| Publicly Available Date | Jun 6, 2022 |
| Publisher | Institute of Electrical and Electronics Engineers |
| Public URL | https://durham-repository.worktribe.com/output/1136847 |
| Publisher URL | https://ieeexplore.ieee.org/xpl/conhome/1000500/all-proceedings |
Files

Accepted Conference Proceeding (3.2 MB, PDF)
Publisher Licence URL: http://creativecommons.org/licenses/by/4.0/
Copyright Statement
This work was funded by UKRI EPSRC. For the purpose of open access, the authors have applied a Creative Commons Attribution (CC BY) license to the Accepted Manuscript version arising.