Moving Towards Natural Human-Machine Collaborations
Advances in miniaturizing computational devices and in input and interaction technology are revolutionizing the way users interact with technology through new natural user interfaces (NUIs). These interfaces support natural modes of interaction such as touch, speech, gestures, handwriting, and vision. Their goal is to lower the barriers to discovery and use imposed by traditional interfaces, so that computing technology acts as a natural, dynamic partner rather than a tool. However, interaction with NUIs continues to mimic the traditional point-and-click paradigm of desktop computing and thus reinforces the idea that technology is a tool, not a partner. In this talk, I will present my vision of the future of human-machine interaction and then discuss my lab’s research projects that aim to make this vision a reality. In particular, I will discuss our research into how human-human nonverbal communication (e.g., gesture, gaze, and facial expressions) can be leveraged to create natural multimodal interfaces that foster collaboration between humans and technology. Lastly, I will describe a new DARPA-funded project that aims to develop an Augmented Reality computational partner to assist with various sequential tasks.