
dc.contributor.author: Bauza Villalonga, Maria
dc.contributor.author: Alet, Ferran
dc.contributor.author: Yen-Chen, Lin
dc.contributor.author: Lozano-Pérez, Tomás
dc.contributor.author: Kaelbling, Leslie P
dc.contributor.author: Isola, Phillip John
dc.contributor.author: Rodriguez, Alberto
dc.date.accessioned: 2021-02-16T20:11:22Z
dc.date.available: 2021-02-16T20:11:22Z
dc.date.issued: 2019-11
dc.date.submitted: 2019-10
dc.identifier.isbn: 9781728140049
dc.identifier.uri: https://hdl.handle.net/1721.1/129775
dc.description.abstract: Pushing is a fundamental robotic skill. Existing work has shown how to exploit models of pushing to achieve a variety of tasks, including grasping under uncertainty, in-hand manipulation, and clearing clutter. Such models, however, are approximate, which limits their applicability. Learning-based methods can reason accurately and directly from raw sensory data, and have the potential to generalize to a wider diversity of scenarios. However, developing and testing such methods requires rich enough datasets. In this paper we introduce Omnipush, a dataset with a high variety of planar pushing behavior. In particular, we provide 250 pushes for each of 250 objects, all recorded with RGB-D and a high-precision tracking system. The objects are constructed so as to systematically explore key factors that affect pushing, namely the shape of the object and its mass distribution, which have not been broadly explored in previous datasets and which allow the study of generalization in model learning. Omnipush includes a benchmark for meta-learning dynamic models, which requires algorithms that make good predictions and estimate their own uncertainty. We also provide an RGB video prediction benchmark and propose other relevant tasks that are well suited to this dataset. Data and code are available at https://web.mit.edu/mcube/omnipush-dataset/.
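The abstract's meta-learning benchmark scores algorithms on both prediction accuracy and calibrated uncertainty. A minimal sketch of that scoring idea, using synthetic stand-in data (the features, the linear model, and the constant-variance fit are all illustrative assumptions, not the Omnipush format or any baseline from the paper); models are compared by Gaussian negative log-likelihood, which penalizes both inaccurate means and over- or under-confident variances:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for one pushing dataset: each push maps input
# features (e.g. contact point and push direction) to a planar object
# displacement (dx, dy, dtheta). Shapes echo the 250-pushes-per-object scale.
X = rng.normal(size=(250, 3))                      # 250 pushes, 3 features
true_W = rng.normal(size=(3, 3))
Y = X @ true_W + 0.05 * rng.normal(size=(250, 3))  # noisy displacements

# A probabilistic predictor must output a mean AND a variance. Here the
# mean is a least-squares fit and the variance is fit on its residuals.
W, *_ = np.linalg.lstsq(X, Y, rcond=None)
pred = X @ W
var = np.var(Y - pred)

# Gaussian negative log-likelihood per output dimension: low only when
# predictions are accurate and the reported uncertainty is calibrated.
nll = 0.5 * np.mean((Y - pred) ** 2 / var + np.log(2 * np.pi * var))
print(f"per-dimension NLL: {nll:.3f}")
```

Under this metric, a model that predicts well but reports a wildly wrong variance still scores poorly, which is the property the benchmark is after.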
dc.language.iso: en
dc.publisher: IEEE
dc.relation.isversionof: 10.1109/IROS40897.2019.8967920
dc.rights: Creative Commons Attribution-Noncommercial-Share Alike
dc.rights.uri: http://creativecommons.org/licenses/by-nc-sa/4.0/
dc.source: arXiv
dc.title: Omnipush: accurate, diverse, real-world dataset of pushing dynamics with RGB-D video
dc.type: Article
dc.identifier.citation: Bauza, Maria et al. "Omnipush: accurate, diverse, real-world dataset of pushing dynamics with RGB-D video." IEEE International Conference on Intelligent Robots and Systems, November 2019, Macau, China, Institute of Electrical and Electronics Engineers © 2019 IEEE.
dc.contributor.department: Massachusetts Institute of Technology. Department of Mechanical Engineering
dc.contributor.department: Massachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratory
dc.relation.journal: IEEE International Conference on Intelligent Robots and Systems
dc.eprint.version: Original manuscript
dc.type.uri: http://purl.org/eprint/type/ConferencePaper
eprint.status: http://purl.org/eprint/status/NonPeerReviewed
dc.date.updated: 2020-08-03T13:45:09Z
dspace.date.submission: 2020-08-03T13:45:11Z
mit.metadata.status: Complete
