Reactive Human-to-Robot Handovers of Arbitrary Objects

W. Yang, C. Paxton, A. Mousavian, Y.-W. Chao, M. Cakmak, and D. Fox, “Reactive Human-to-Robot Handovers of Arbitrary Objects,” in Proc. IEEE International Conference on Robotics and Automation (ICRA), 2021, doi: 10.1109/ICRA48506.2021.9561170.

Abstract

Human-robot object handovers have been an actively studied area of robotics over the past decade; however, very few techniques and systems have addressed the challenge of handing over diverse objects with arbitrary appearance, size, shape, and deformability. In this paper, we present a vision-based system that enables reactive human-to-robot handovers of unknown objects. Our approach combines closed-loop motion planning with real-time, temporally consistent grasp generation to ensure reactivity and motion smoothness. Our system is robust to different object positions and orientations, and can grasp both rigid and non-rigid objects. We demonstrate the generalizability, usability, and robustness of our approach on a novel benchmark set of 26 diverse household objects, a user study with six participants handing over a subset of 15 objects, and a systematic evaluation examining different ways of handing objects.
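The abstract names two ingredients: real-time grasp generation kept temporally consistent across frames, and closed-loop motion toward the selected grasp. As a rough illustration of how those pieces fit together in a control loop, here is a minimal Python sketch. It is not the authors' implementation; the names (Grasp, detect_grasps, select_consistent, servo_step) and all parameter values are hypothetical scaffolding, and the stubs stand in for the paper's learned grasp generator and motion planner.

"""Minimal sketch of a reactive handover loop: re-detect grasps each
frame, keep the selection temporally consistent, and servo toward it."""
import numpy as np
from dataclasses import dataclass

@dataclass
class Grasp:
    position: np.ndarray  # grasp point in the robot base frame (meters)
    score: float          # detector confidence in [0, 1]

def detect_grasps(frame: np.ndarray) -> list[Grasp]:
    # Placeholder for a real-time grasp generator; here we fake a few
    # candidates jittering around a nominal handover location.
    rng = np.random.default_rng()
    return [Grasp(np.array([0.5, 0.0, 0.4]) + rng.normal(0, 0.02, 3),
                  float(rng.uniform(0.5, 1.0))) for _ in range(8)]

def select_consistent(candidates, previous, stickiness=0.05):
    # Temporal-consistency heuristic: stay with a candidate near the
    # previously selected grasp so the target does not jump between
    # frames; fall back to the best-scoring grasp otherwise.
    if previous is not None:
        near = [g for g in candidates
                if np.linalg.norm(g.position - previous.position) < stickiness]
        if near:
            return max(near, key=lambda g: g.score)
    return max(candidates, key=lambda g: g.score)

def servo_step(ee_pos, target_pos, gain=0.3):
    # One closed-loop step toward the current grasp target; a real
    # system would run a full reactive motion planner here.
    return ee_pos + gain * (target_pos - ee_pos)

ee = np.array([0.2, -0.3, 0.5])   # current end-effector position
selected = None
for _ in range(50):               # control loop at camera frame rate
    frame = np.zeros((480, 640))  # stand-in for an RGB-D frame
    selected = select_consistent(detect_grasps(frame), selected)
    ee = servo_step(ee, selected.position)
    if np.linalg.norm(ee - selected.position) < 0.02:
        break                     # close gripper and retract (omitted)

Because the grasp target is re-estimated every frame, the loop tolerates the human moving the object mid-handover; the stickiness threshold is what trades reactivity against jitter in the selected grasp.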

BibTeX Entry

@inproceedings{yang2021icra,
  title = {Reactive Human-to-Robot Handovers of Arbitrary Objects},
  author = {Yang, Wei and Paxton, Chris and Mousavian, Arsalan and Chao, Yu-Wei and Cakmak, Maya and Fox, Dieter},
  year = {2021},
  booktitle = {IEEE International Conference on Robotics and Automation (ICRA)},
  note = {Best HRI Paper Award Winner},
  doi = {10.1109/ICRA48506.2021.9561170}
}