PhD Defense: Toward Symbiotic Human-AI Interaction Focusing On Programming By Example

Talk
Tak Yeon Lee
Time: 04.05.2017, 14:00 to 16:00
Location: AVW 4424

Recent advancements in Artificial Intelligence (AI) allow humans to perform a wide range of tasks in collaboration with automated systems. A canonical scenario is programming-by-example (PBE), where human users provide input and output examples to teach computers how to perform certain tasks. Until recently, progress in PBE has been driven mostly by advances in learning algorithms; however, a growing community of researchers at the intersection of AI and Human-Computer Interaction (HCI) is realizing that many challenges and open research questions remain on the human side. For instance, humans tend to provide insufficient and/or inconsistent examples that prevent the PBE engine from learning the correct solution. When teaching has failed, users want to know why it failed and how to fix it. Understanding how humans actually interact with AI is critical to designing usable systems. However, because most AI systems have no mechanism for communicating their capabilities and limitations, users often lose trust in them.
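To make the PBE setting concrete, here is a minimal, hypothetical sketch (not the actual engine studied in the dissertation): the user supplies input/output pairs, and the engine searches a small fixed space of candidate string transformations for one consistent with every example. All names here are illustrative assumptions.

```python
# Minimal programming-by-example (PBE) sketch: search a hypothetical
# candidate-program space for one that matches all given examples.
CANDIDATES = {
    "uppercase": str.upper,
    "lowercase": str.lower,
    "first_word": lambda s: s.split()[0],
    "reverse": lambda s: s[::-1],
}

def synthesize(examples):
    """Return the name of a candidate consistent with all examples, or None."""
    for name, prog in CANDIDATES.items():
        if all(prog(inp) == out for inp, out in examples):
            return name
    return None  # teaching failed: no candidate explains the examples

# Two examples suffice to pick out a single transformation here:
print(synthesize([("tak yeon lee", "TAK YEON LEE"), ("pbe", "PBE")]))  # uppercase
```

Real PBE engines search vastly larger program spaces, which is precisely why the quality of the user-provided examples matters so much.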

As a step toward symbiotic interaction between humans and AI, this dissertation comprises four research threads. First, we begin with two formative studies to establish a better understanding of inexperienced end-users’ needs and mental models. These studies uncovered a number of patterns in user behavior and suggested design implications for future PBE systems. Second, I developed an end-user programming environment, called VESPY, that interleaves visual programming and PBE techniques.

Using VESPY, end-users can create interactive web components by (1) decomposing tasks into small modules, and (2) completing each module by providing input and output examples. Despite its versatility, we observed that usability issues of PBE persist. For example, because users have little knowledge of the PBE engine, they make a wide range of mistakes. In response, we conducted an online user study investigating to what extent inexperienced users can perform two core activities of PBE: problem decomposition and disambiguation. The results confirmed that both activities are challenging, and identified seven common types of mistakes. Finally, we explored the design space of feedback mechanisms to find the best feedback design. The results suggest that providing both system information and detailed instructions is beneficial, but there is a risk of overloading users with too much information.
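The disambiguation difficulty can be illustrated with a toy, hypothetical candidate set (not taken from the dissertation): a single input/output example may be consistent with several candidate programs, and the user must supply further examples to single one out.

```python
# Toy illustration of ambiguity in PBE: one example, several fitting programs.
CANDIDATES = {
    "first_char": lambda s: s[0],
    "last_char": lambda s: s[-1],
}

def consistent(examples):
    """Names of all candidate programs that agree with every example."""
    return [name for name, prog in CANDIDATES.items()
            if all(prog(i) == o for i, o in examples)]

# One example is ambiguous: "ABBA" both starts and ends with "A".
print(consistent([("ABBA", "A")]))                 # ['first_char', 'last_char']

# Adding "Bob" -> "B" rules out last_char and disambiguates.
print(consistent([("ABBA", "A"), ("Bob", "B")]))   # ['first_char']
```

Choosing examples that cut down the candidate set in this way is exactly the skill that inexperienced users were found to struggle with.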

My dissertation contributes to the AI and HCI communities: (i) identification of unmet needs of end-users of the Web; (ii) characterization of non-programmers’ mental models; (iii) a design process interleaving visual programming and PBE; (iv) identification of mistakes people make while using PBE; and (v) design and assessment of feedback for PBE users.

Examining Committee:

Chair: Dr. Ben Bederson

Dean's rep: Dr. Jennifer Golbeck

Members: Dr. Jeff Foster

Dr. Jon Froehlich

Dr. Leah Findlater