dc.contributor.authorStraub, Julian
dc.contributor.authorBhandari, Nishchal
dc.contributor.authorLeonard, John J
dc.contributor.authorFisher, John W
dc.date.accessioned2017-03-15T20:53:57Z
dc.date.available2017-03-15T20:53:57Z
dc.date.issued2016-01
dc.date.submitted2015-09
dc.identifier.isbn978-1-4799-9994-1
dc.identifier.urihttp://hdl.handle.net/1721.1/107428
dc.description.abstractDrift of the rotation estimate is a well-known problem in visual odometry systems, as it is the main source of positioning inaccuracy. We propose three novel algorithms to estimate the full 3D rotation to the surrounding Manhattan World (MW) in as little as 20 ms using surface normals derived from the depth channel of an RGB-D camera. Importantly, this rotation estimate acts as a structure compass that can be used to estimate the bias of an odometry system, such as an inertial measurement unit (IMU), and thus remove its angular drift. We evaluate the run-time as well as the accuracy of the proposed algorithms on ground-truth data. They achieve zero-drift rotation estimation with RMSEs below 3.4° by themselves and below 2.8° when integrated with an IMU in a standard extended Kalman filter (EKF). Additional qualitative results show the accuracy in a large-scale indoor environment as well as the ability to handle fast motion. Selected segmentations of scenes from the NYU depth dataset demonstrate the robustness of the inference algorithms to clutter and hint at the usefulness of the segmentation for further processing.en_US
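
The abstract above describes aligning depth-derived surface normals to the axes of a Manhattan World frame. The sketch below illustrates that general idea only; it is not one of the paper's proposed real-time inference algorithms. It assumes unit surface normals are already available as an N x 3 NumPy array and alternates between assigning each normal to its nearest signed axis and re-solving for the rotation via the orthogonal Procrustes problem. All names here are hypothetical.

# Illustrative only: a generic Manhattan-frame alignment by coordinate descent.
# NOT the algorithm proposed in the paper; names and structure are hypothetical.
import numpy as np

AXES = np.vstack([np.eye(3), -np.eye(3)])  # the six signed Manhattan directions


def estimate_manhattan_rotation(normals, iters=10):
    """Estimate a rotation R mapping camera-frame unit normals (N x 3) onto
    the canonical Manhattan axes, up to the 24-fold Manhattan symmetry."""
    R = np.eye(3)
    for _ in range(iters):
        # Assign each normal to the closest signed axis under the current R.
        rotated = normals @ R.T                  # rows are R @ n_i
        targets = AXES[np.argmax(rotated @ AXES.T, axis=1)]
        # Orthogonal Procrustes: maximize sum_i a_i^T R n_i = tr(R M),
        # where M = sum_i n_i a_i^T; solved by R = V U^T with M = U S V^T.
        U, _, Vt = np.linalg.svd(normals.T @ targets)
        R = (U @ Vt).T
        if np.linalg.det(R) < 0:                 # enforce a proper rotation
            U[:, -1] *= -1
            R = (U @ Vt).T
    return R


if __name__ == "__main__":
    # Synthetic sanity check: normals scattered around a rotated Manhattan frame.
    rng = np.random.default_rng(0)
    a = np.deg2rad(25.0)
    R_true = np.array([[np.cos(a), -np.sin(a), 0.0],
                       [np.sin(a),  np.cos(a), 0.0],
                       [0.0,        0.0,       1.0]])
    picks = AXES[rng.integers(0, 6, size=2000)]
    normals = picks @ R_true + 0.05 * rng.standard_normal((2000, 3))
    normals /= np.linalg.norm(normals, axis=1, keepdims=True)
    R_est = estimate_manhattan_rotation(normals)
    # Close to the identity (or another signed permutation) if alignment worked.
    print(np.round(R_est @ R_true.T, 2))
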
dc.description.sponsorshipUnited States. Office of Naval Research. Multidisciplinary University Research Initiative (Awards N00014-11-1-0688 and N00014-10-1-0936)en_US
dc.description.sponsorshipNational Science Foundation (U.S.) (Award IIS-1318392)en_US
dc.language.isoen_US
dc.publisherInstitute of Electrical and Electronics Engineers (IEEE)en_US
dc.relation.isversionofhttp://dx.doi.org/10.1109/IROS.2015.7353628en_US
dc.rightsCreative Commons Attribution-Noncommercial-Share Alikeen_US
dc.rights.urihttp://creativecommons.org/licenses/by-nc-sa/4.0/en_US
dc.sourceMIT Web Domainen_US
dc.titleReal-time Manhattan World rotation estimation in 3Den_US
dc.typeArticleen_US
dc.identifier.citationStraub, Julian, et al. “Real-Time Manhattan World Rotation Estimation in 3D.” Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), IEEE, 2015, pp. 1913–1920.en_US
dc.contributor.departmentMassachusetts Institute of Technology. Computer Science and Artificial Intelligence Laboratoryen_US
dc.contributor.departmentMassachusetts Institute of Technology. Department of Electrical Engineering and Computer Scienceen_US
dc.contributor.departmentMassachusetts Institute of Technology. Department of Mechanical Engineeringen_US
dc.contributor.mitauthorStraub, Julian
dc.contributor.mitauthorBhandari, Nishchal
dc.contributor.mitauthorLeonard, John J
dc.contributor.mitauthorFisher, John W
dc.relation.journalProceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)en_US
dc.eprint.versionAuthor's final manuscripten_US
dc.type.urihttp://purl.org/eprint/type/ConferencePaperen_US
eprint.statushttp://purl.org/eprint/status/NonPeerRevieweden_US
dspace.orderedauthorsStraub, Julian; Bhandari, Nishchal; Leonard, John J.; Fisher, John W.en_US
dspace.embargo.termsNen_US
dc.identifier.orcidhttps://orcid.org/0000-0003-2339-1262
dc.identifier.orcidhttps://orcid.org/0000-0002-8863-6550
dc.identifier.orcidhttps://orcid.org/0000-0003-4844-3495
mit.licenseOPEN_ACCESS_POLICYen_US
mit.metadata.statusComplete

