Rethinking Language Interaction With Pervasive Applications and Devices
Abstract
Language-based interaction with pervasive devices today mostly follows a basic command-and-control paradigm, reminiscent of 1960s Star Trek, which is often cumbersome and inadequate. Key causes include an often deceptive portrayal of the system's language comprehension capabilities and, in many cases, a problematic impersonation of human characters by computers. We argue here that a major challenge of pervasive computing is to rethink language-based interaction with applications and devices. Inspired by the imaginary pervasive conversations of Sci-Fi movies, we propose a new design principle whereby machines utter only statements that they can comprehend, which we call What You Hear Is What You Say (WYHIWYS). We provide examples of WYHIWYS in language interactions with pervasive applications, including a deployment in an art exhibit, and discuss key research challenges it poses to HCI, AI, and pervasive computing.