A command-line-style gestural language
for the internet of things
The internet of things is taking shape, and the need for platform-independent personal control structures becomes more apparent every day. We amass collections of remotes, press buttons, manipulate GUIs, and use gestures to pull, pinch and sort information; still, we have no direct remote-control gestural language with which to command the physical reality of our world. Many systems have attempted to limit the number of gestures in their interfaces on the grounds that users have trouble remembering the movements. I believe this is a symptom of designing gestural systems only to mimic physical manipulation. These issues can be overcome, however, by creating an interaction model that draws on the language structure inherent in the Unix command line and keyboard shortcuts. Through collaborative efforts, we can move beyond individual gestures to an applied gestural language that could serve as a powerful query and control interface for the internet of things.
I believe it is possible to extend the reach and precision of the gestural interface beyond current models by investigating the parallels between signed-language processes and those of computer language systems. This idea was the foundation for my Fab Academy final project, currently titled "Puppet Master". The project began as an experiment with a universal remote control system and has since developed into a strong interest in modeling a signed language that can be used to query and manipulate physical objects.
The Puppet Master project was inspired by the anime series Naruto, in which a specific class of characters uses handsigns to control a variety of marionette-like objects. This fictional interaction model led me to explore how specific gestural commands could be used as control shortcuts. The keyboard shortcut model, applied to a full gestural language, could provide a controlled vocabulary that uses signed language for computer control, resulting in platform-independent "physical shortcuts". These physical shortcuts could also be quicker and more succinct than the gestures used by the majority of current gestural interfaces.
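To make the shortcut analogy concrete, the following is a minimal sketch, in Python, of how recognized handsign sequences might be bound to device commands the way key chords are bound to editor actions. Every gesture token, device, and command name here is a hypothetical placeholder, and the recognizer that would produce the tokens is assumed to exist upstream.

    from typing import Callable, Dict, Tuple

    # A "physical shortcut" is a short, fixed chord of recognized handsign tokens.
    GestureSequence = Tuple[str, ...]

    def lamp_on() -> None:
        print("lamp: power on")

    def lamp_off() -> None:
        print("lamp: power off")

    def speaker_mute() -> None:
        print("speaker: muted")

    # The controlled vocabulary: each handsign chord maps to one command.
    SHORTCUTS: Dict[GestureSequence, Callable[[], None]] = {
        ("point", "lamp", "spread"): lamp_on,
        ("point", "lamp", "pinch"): lamp_off,
        ("point", "speaker", "fist"): speaker_mute,
    }

    def dispatch(sequence: GestureSequence) -> None:
        # Fire the command bound to a recognized handsign sequence, if any.
        command = SHORTCUTS.get(sequence)
        if command is None:
            print("unrecognized shortcut:", sequence)
        else:
            command()

    dispatch(("point", "lamp", "pinch"))   # -> lamp: power off
    dispatch(("wave", "wave"))             # -> unrecognized shortcut

Because the vocabulary is a small, fixed table rather than free-form manipulation, the same chord produces the same command on any platform that carries the table.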
Over time, the same muscle memory that makes touch typing and keyboard shortcuts efficient will begin to take over, and the more the muscles remember, the less the user thinks about handsign production. This style of interaction could be analogous to a command line interface, much like a gestural Unix for the internet of things, making it possible to query, inventory, and control bounded environments. Many Unix concepts could apply to this interaction model, such as redirection of content from one networked physical object to another and user customization for specific contextual needs.
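As a rough illustration of what such a gestural Unix might look like underneath, here is a minimal sketch, again in Python, that parses a handsign token stream into a verb, a target object, and an optional redirection to a second object, loosely mirroring the shell pattern of a command whose output is redirected to a destination. The room inventory, object names, and the "redirect" token are all hypothetical placeholders chosen for the example, not part of any existing system.

    from dataclasses import dataclass
    from typing import List, Optional

    @dataclass
    class GestureCommand:
        verb: str                   # e.g. "show", "query", "play"
        target: str                 # the networked object being addressed
        redirect_to: Optional[str]  # optional destination object, like > in a shell

    def parse(tokens: List[str]) -> GestureCommand:
        # Expected handsign stream: VERB TARGET [ "redirect" DESTINATION ]
        if len(tokens) < 2:
            raise ValueError("expected at least a verb and a target")
        redirect_to = tokens[3] if len(tokens) >= 4 and tokens[2] == "redirect" else None
        return GestureCommand(tokens[0], tokens[1], redirect_to)

    # Toy inventory of a bounded environment: networked objects and their content.
    ROOM = {
        "photo-frame": "slideshow:vacation",
        "speaker": "playlist:evening",
        "wall-display": "idle",
    }

    def execute(cmd: GestureCommand) -> None:
        # Query the target object, then optionally redirect its content elsewhere.
        content = ROOM.get(cmd.target, "unknown object")
        print(cmd.target, "->", content)
        if cmd.redirect_to is not None:
            ROOM[cmd.redirect_to] = content
            print("redirected", content, "to", cmd.redirect_to)

    # Gestural equivalent of: show photo-frame > wall-display
    execute(parse(["show", "photo-frame", "redirect", "wall-display"]))

The grammar stays deliberately small, in the spirit of composing a few well-known verbs and objects rather than memorizing a large set of one-off gestures.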