More Than a Feeling: Learning to Grasp and Regrasp Using Vision and Touch
- Published 4 July 2018
- Research article
- IEEE Robotics and Automation Letters, Vol. 3 (4), pp. 3300-3307
- Published by the Institute of Electrical and Electronics Engineers (IEEE)
- https://doi.org/10.1109/lra.2018.2852779
Abstract
For humans, the process of grasping an object relies heavily on rich tactile feedback. Most recent robotic grasping work, however, has been based only on visual input, and thus cannot easily benefit from feedback after initiating contact. In this letter, we investigate how a robot can learn to use tactile information to iteratively and efficiently adjust its grasp. To this end, we propose an end-to-end action-conditional model that learns regrasping policies from raw visuo-tactile data. This model, a deep, multimodal convolutional network, predicts the outcome of a candidate grasp adjustment, and then executes a grasp by iteratively selecting the most promising actions. Our approach requires neither calibration of the tactile sensors nor any analytical modeling of contact forces, thus reducing the engineering effort required to obtain efficient grasping policies. We train our model with data from about 6450 grasping trials on a two-finger gripper equipped with GelSight high-resolution tactile sensors on each finger. Across extensive experiments, our approach outperforms a variety of baselines at 1) estimating grasp adjustment outcomes, 2) selecting efficient grasp adjustments for quick grasping, and 3) reducing the amount of force applied at the fingers, while maintaining competitive performance. Finally, we study the choices made by our model and show that it has successfully acquired useful and interpretable grasping behaviors.
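The regrasping strategy the abstract describes, scoring candidate grasp adjustments with a learned outcome-prediction model and greedily executing the most promising one, can be sketched as follows. This is a minimal illustration, not the paper's implementation: `predict_success` here is a toy stand-in for the deep multimodal network (it just favors small adjustments), and the function names and action representation are assumptions for the example.

```python
import numpy as np

def predict_success(features, action):
    """Toy stand-in for the learned action-conditional outcome model.

    The real model is a deep multimodal CNN over raw visuo-tactile input;
    here we simply score smaller adjustments as more promising so the
    selection loop below is runnable.
    """
    return 1.0 / (1.0 + np.linalg.norm(action))

def select_regrasp(features, candidate_actions):
    """Greedy action selection: score every candidate grasp adjustment
    with the outcome model and return the most promising one."""
    scores = [predict_success(features, a) for a in candidate_actions]
    best = int(np.argmax(scores))
    return candidate_actions[best], scores[best]

def regrasp_loop(features, candidate_actions, threshold=0.8, max_attempts=5):
    """Iteratively pick and 'execute' adjustments until the predicted
    success probability clears a threshold or attempts run out."""
    for _ in range(max_attempts):
        action, score = select_regrasp(features, candidate_actions)
        if score >= threshold:
            return action, score  # confident enough: commit to this grasp
        # In a real system, the chosen adjustment would be executed and
        # fresh visuo-tactile features captured before re-scoring.
    return action, score
```

The key design point mirrored here is that the policy is implicit: there is no separate policy network, only an outcome predictor queried over candidate actions at decision time.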
Funding Information
- Berkeley DeepDrive
- Nvidia
- Amazon
- Toyota Research Institute
- MIT Lincoln Labs