
Theory and Practice of Tangible User Interfaces

Using Our Movement in Space for Implicit Communication

Submitted by andy on Thu, 10/30/2008 - 22:05

Assignment: Final Project 1: Progress Report


PROGRESS REPORT

The feedback on the SmrtClips idea was generally positive, and we have been exploring how we could actually implement it.  We have sketched out the possibility of using RFID tags and LEDs in the clips, with an RFID reader mounted to the computer or tray, but the small form factor is proving tricky.  We have also been considering larger objects, such as a clipboard, that have more surface area and thus more interaction points.  This changes our concept somewhat, since a clipboard is typically used for one document at a time.  For example, because SmrtClips were small and plentiful, one could put them on all of one's offline documents, making it easier to find a given document in a messy pile on a desk or in a file cabinet.  That use case wouldn't be feasible with a clipboard.  But a larger form factor like a clipboard again has more interaction points and could offer more opportunity for communication through those interactions, which we have realized is the aspect of the clip idea we are most interested in.  We considered handling and fidgeting with clips to capture implicit interaction - letting others know that you're reading a document - and this works just as well with a clipboard, whether it's tapping one's fingers against it, rubbing one corner, or opening and closing the clip at the top.

We felt like we really hit on something with capturing unintentional, fidgeting behavior to send a message or affect something.  So in addition to the clip/clipboard concept, we have spent a lot of time thinking about fidgeting in general.  We did some ad hoc observational studies at the Free Speech Movement cafe and in our classes, and fidgeting was all around us: people bouncing their legs, twirling their hair, tapping pens, etc.  So how can we use that behavior to communicate something to or from the person fidgeting?  Or from a group's collective fidgeting?  Could it trigger something to soothe the person?  Or could we harness it to create something positive, say, using the kinetic energy to generate power?  What if the behavior influenced the environment in a more ambient, subtle way, by creating art or music?

We all latched onto the latter thought and have been exploring it further.  We observed people in the cafe using their small tables in many different ways... bouncing their feet against them, tapping a pen, leaning on them, setting a book or coffee cup down... sometimes there was just one person on one side of the table, sometimes there were two people, one on either side. Sometimes they were engaged in conversation and leaning toward each other with their elbows on the table to create a more intimate environment, whereas others were hunched over computers or curled over a book, maybe leaning on one elbow or just resting the book on the table. Some had items splayed territorially all over the table, whereas others kept a minimally controlled space with one book or coffee cup in front of them.  These observations led us to consider what meaning there might be in the placement of things on the table and the pressure each exerts, and we were encouraged to hear Wendy Ju's example of using space for implicit communication (like the placement of coffee cups at a diner).

With some ideas of possible inputs, we're now looking at possible outputs, including sound and light. We might be able to use this implicit communication to control music: a heavy book might have a lower or longer sound, whereas someone tapping their pen or fingers could create short, repetitive notes or affect the beat. Moving things around on the table could alter the sound being produced as well.  While much work has been done with users actively engaging specific objects on a table to compose music (see below), we think our idea has a broader metaphor. We envision more passive music composition and less of a performance orientation. In Wendy Ju's implicit interaction framework, our system would be background rather than foreground (unlike the related work mentioned below) and peripheral rather than focused performance.  We've also tried to take Hayes Raffle's advice that our system shouldn't rely on strict mappings between input variables and music variables, but should leave some flexibility for the user.  The music produced would be ambient and ephemeral, so that people could subtly sense the existence and energy of that particular moment.
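As a thought experiment, a loose input-to-sound mapping along these lines might look like the sketch below. All of the sensor values, pressure ranges, and musical constants here are our own illustrative assumptions (nothing is implemented yet), and the small random offset is one simple way to keep the mapping flexible rather than strictly deterministic, per Hayes Raffle's advice.

```python
import random

# Hypothetical inputs: the pressure an object exerts on the table
# (in grams) and a tap rate (taps per second). These are assumed
# sensor readings, not a real sensor API.

def note_for_object(pressure_g, lo_pitch=36, hi_pitch=84, max_pressure=2000):
    """Heavier objects map to lower, longer notes; lighter touches to
    higher, shorter ones. A small random offset keeps the mapping
    loose rather than strict."""
    pressure_g = min(pressure_g, max_pressure)
    frac = pressure_g / max_pressure            # 0.0 (light) .. 1.0 (heavy)
    pitch = round(hi_pitch - frac * (hi_pitch - lo_pitch))
    pitch += random.randint(-2, 2)              # leave some flexibility
    duration_s = 0.2 + frac * 1.8               # 0.2s .. 2.0s
    return pitch, duration_s

def tempo_from_taps(taps_per_second, base_bpm=60):
    """Finger or pen tapping nudges the ambient tempo."""
    return base_bpm + 20 * taps_per_second
```

Under these made-up constants, a heavy book resting on the table would yield a low note lasting about two seconds, while a light pen would produce a short, high note, and tapping would push the tempo up.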

Instead of, or perhaps complementing, sound output, we're also looking at light as another output. The placement and position of objects on the table would provide visual feedback on the table itself. What if lighted outlines, like halos, appeared around the objects on the table and reflected their force or weight, creating a glow around whatever makes contact with the surface?  These outlines/rings could persist for some amount of time to show a history of the person at the table before you. The lighting output could also move beyond the single table and alter the lighting in the larger environment: interacting with the table would subtly change the lighting in the room, or, combined with input from other tables, it could create a light-based composition on the wall or ceiling. This composition could be generated in real time as people interact with the space, or saved for display later in the day. Combining music and light across multiple tables, you could see the energy in the room reflected in each moment as the music and lighting changed. We could see installing such a system not only in a cafe, but in a museum, bar, lounge/club, airplane boarding area, or even at your kitchen table.
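To make the "history" idea concrete, here is a minimal sketch of how a halo's brightness might track an object's weight and then fade after the object is lifted, leaving a trace of the previous occupant. The weight ceiling and fade time are placeholder guesses, not measured or designed values.

```python
def halo_brightness(weight_g, seconds_since_lift, max_weight=2000.0,
                    fade_seconds=120.0):
    """Glow proportional to an object's weight while it rests on the
    table, fading linearly after it is lifted so the next person at
    the table sees a short 'history' of the one before. All constants
    are illustrative assumptions."""
    base = min(weight_g, max_weight) / max_weight        # 0.0 .. 1.0
    if seconds_since_lift <= 0:
        return base                                      # still on the table
    remaining = max(0.0, 1.0 - seconds_since_lift / fade_seconds)
    return base * remaining
```

So a heavy book would glow at full brightness while it sits on the table, dim steadily for two minutes after being picked up, and then vanish.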

We are still fleshing out this idea, and are also still considering the larger-form clip idea.  We hope to meet with Kimiko early next week to talk through the idea(s) in more depth and get some feedback on a path forward.

One thing to note:  After seeing Hayes Raffle's sound board project, we were really inspired to approach our project ideas in a more abstract, artistic way and see if we can come up with something "outside the (our) box".  Whereas before we took a more business/engineering approach - what finite, specific problem can we solve? - we have now been exploring how to represent behaviors through music or art.

RELATED WORK

(much sourced from Reactable - http://reactable.iua.upf.edu/?related)

AudioPad (shown in class)

http://www.jamespatten.com/audiopad/

From the site: "Audiopad is a composition and performance instrument for electronic music which tracks the positions of objects on a tabletop surface and converts their motion into music. One can pull sounds from a giant set of samples, juxtapose archived recordings against warm synthetic melodies, cut between drum loops to create new beats, and apply digital processing all at the same time on the same table. Audiopad not only allows for spontaneous reinterpretation of musical compositions, but also creates a visual and tactile dialogue between itself, the performer, and the audience. "

 

Squeezables

http://xenia.media.mit.edu/~gan/Gan/Education/MIT/MediaLab/Research/Squeezables/

From the site: "The Squeezables is a computer music instrument that allows a group of players to perform and improvise musical compositions by using a set of squeezing and pulling gestures. The instrument, comprised of six squeezable and retractable gel balls mounted on a small podium, addresses a number of hardware and software challenges in electronic music interface design. It is designed to provide an alternative to asynchronous and discursive interactions with discrete musical controllers by allowing multiple channels of high-level simultaneous input. The instrument also addresses new challenges in interconnected group playing by providing an infrastructure for the development of interdependent, yet coherent, multi-player interactions. As a test case for a particular high level control and interdependent mapping scheme, a short musical composition was written for the instrument and was performed by three players."

 

Reactable

http://reactable.iua.upf.edu/

From the site: "The reactable is a collaborative electronic music instrument with a tabletop tangible multi-touch interface. Several simultaneous performers share complete control over the instrument by moving and rotating physical objects on a luminous round table surface. By moving and relating these objects, representing components of a classic modular synthesizer, users can create complex and dynamic sonic topologies, with generators, filters and modulators, in a kind of tangible modular synthesizer or graspable flow-controlled programming language. "