Language Grounding with Robots

Talk
Jesse Thomason
Time: 04.01.2020 11:00 to 12:00

We use language to refer to objects like “toast”, “plate”, and “table”, and to communicate requests such as “Could you make breakfast?” In this talk, I will present work on computational methods that tie language to physical, grounded meaning. Robots are an ideal platform for such work because they can perceive and interact with the world. I will discuss dialog and learning strategies I have developed to enable robots to learn from their human partners, much as people learn from one another through interaction. I will present methods that enable robots to understand referring expressions like “the heavy, metallic mug”, the first work showing that it is possible to learn to connect words to their perceptual properties in the visual, tactile, and auditory senses of a physical robot. I will also present benchmarks and models for translating high-level human language like “put the toast on the table”, which implies latent, intermediate goals, into executable sequences of agent actions with the help of low-level, step-by-step language instructions. Finally, I will discuss how my work in grounded language contributes to NLP, robotics, and the broader goals of the AI community.