Projection mapping of behavioral expressions onto manufactured figures for speech interaction


Natural language user interfaces, such as Apple Siri and Google Voice Search, have been embedded in consumer devices; however, speaking to an object can feel awkward. Such interfaces should feel natural, like speaking to a real listener. This paper proposes a method for manufactured objects, such as anime figures, to exhibit highly realistic behavioral expressions that improve speech interaction between a user and an object. Using a projection mapping technique, an anime figure provides back-channel feedback to the user by appearing to nod or shake its head. We developed a listener agent based on the anime figure that listens while a user gives directions to a specific location. We then conducted experiments comparing users' impressions of the speech interaction across four conditions. The results suggest that the anime figure with projection mapping made the agent seem more realistic.


