ProtoKinetic: A Multimodal Design Tool for Rapid and Real-Time Prototyping of Human-Robot Interactions

PI: Sean Follmer


In the coming era of ubiquitous robotics, we envision the need for the effortless design of contextually-aware interactions with robots. Existing tools for robot programming are designed for expert programmers and do not leverage design methodologies such as iterative design, Wizard of Oz (WoZ) prototyping, and user testing. Ubiquitous robots create a number of challenges for tool designers. First, because these systems are dynamic, prototyping them requires skillful programming and is often time-consuming. Second, WoZ methods are hard to apply because a wizard must control multiple simultaneously moving components. Third, these devices are often context-aware, so their behavior is affected by the people, objects, and environment around them. To address these challenges, we investigate the use of a multimodal interface (augmented reality, voice, and direct manipulation) for situated WoZ prototyping. These WoZ prototypes provide an opportunity to help designers iteratively transition to fully autonomous systems by learning from interactions performed during user evaluations. We aim to build a tool that allows iterative replacement of WoZ elements with automation and enables designers to better log and analyze situated interactions. Finally, we plan to evaluate the tool through a series of workshops with interaction designers to understand the benefits of these methods.