Now showing items 1-8 of 8
Deep compositional robotic planners that follow natural language commands
(Center for Brains, Minds and Machines (CBMM), Computation and Systems Neuroscience (Cosyne), 2020-05-31)
We demonstrate how a sampling-based robotic planner can be augmented to learn to understand a sequence of natural language commands in a continuous configuration space to move and manipulate objects. Our approach combines ...
Encoding formulas as deep networks: Reinforcement learning for zero-shot execution of LTL formulas
(Center for Brains, Minds and Machines (CBMM), The Ninth International Conference on Learning Representations (ICLR), 2020-10-25)
We demonstrate a reinforcement learning agent which uses a compositional recurrent neural network that takes as input an LTL formula and determines satisfying actions. The input LTL formulas have never been seen before, ...
Social Interactions as Recursive MDPs
(Center for Brains, Minds and Machines (CBMM), Conference on Robot Learning (CoRL), 2021-11-08)
While machines and robots must interact with humans, providing them with social skills has been a largely overlooked topic. This is mostly a consequence of the fact that tasks such as navigation, command following, and ...
Compositional Networks Enable Systematic Generalization for Grounded Language Understanding
(Center for Brains, Minds and Machines (CBMM), Conference on Empirical Methods in Natural Language Processing (EMNLP), 2021-11-07)
Humans are remarkably flexible when understanding new sentences that include combinations of concepts they have never encountered before. Recent work has shown that while deep networks can mimic some human language ...
Incorporating Rich Social Interactions Into MDPs
(Center for Brains, Minds and Machines (CBMM), International Conference on Robotics and Automation (ICRA), 2022-02-07)
Much of what we do as humans is engage socially with other agents, a skill that robots must also eventually possess. We demonstrate that a rich theory of social interactions originating from microsociology and economics ...
Learning a natural-language to LTL executable semantic parser for grounded robotics
(Center for Brains, Minds and Machines (CBMM), Conference on Robot Learning (CoRL), 2020-11-16)
Children acquire their native language with apparent ease by observing how language is used in context and attempting to use it themselves. They do so without laborious annotations, negative examples, or even direct ...
Compositional RL Agents That Follow Language Commands in Temporal Logic
(Center for Brains, Minds and Machines (CBMM), Frontiers in Robotics and AI, 2021-07-19)
We demonstrate how a reinforcement learning agent can use compositional recurrent neural networks to learn to carry out commands specified in linear temporal logic (LTL). Our approach takes as input an LTL formula, ...
Trajectory Prediction with Linguistic Representations
(Center for Brains, Minds and Machines (CBMM), International Conference on Robotics and Automation (ICRA), 2022-03-09)
Language allows humans to build mental models that interpret what is happening around them, resulting in more accurate long-term predictions. We present a novel trajectory prediction model that uses linguistic intermediate ...