Team Members
Shubham Goel, Brian Murphy, Raymon Sutedjo-The, Kristine Yoshihara
Project Description
Most Tangible User Interfaces appeal to our hands, but what about our feet?
We propose a type of smart shoe that tracks your path through your everyday life. These smart shoes could potentially integrate input from sensors, accelerometers, and GPS in order to provide feedback to the user. The user, in turn, can provide feedback to control settings on the device through foot gestures, such as tapping, pivoting, or sliding the foot.
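As a minimal sketch of how a foot gesture like tapping might be recognized, the snippet below looks for sharp spikes in accelerometer magnitude. The threshold and debounce window are illustrative assumptions, not measured values.

```python
import math

TAP_THRESHOLD_G = 2.5   # assumed spike magnitude (in g) that counts as a tap
DEBOUNCE_SAMPLES = 10   # assumed window to ignore after a detected tap

def detect_taps(samples):
    """Return indices of samples where a tap spike occurs.

    `samples` is a list of (x, y, z) accelerometer readings in g.
    """
    taps = []
    cooldown = 0
    for i, (x, y, z) in enumerate(samples):
        magnitude = math.sqrt(x * x + y * y + z * z)
        if cooldown > 0:
            cooldown -= 1
        elif magnitude > TAP_THRESHOLD_G:
            taps.append(i)
            cooldown = DEBOUNCE_SAMPLES
    return taps

# A quiet signal (~1 g of gravity) with two sharp spikes yields two taps.
quiet = [(0.0, 0.0, 1.0)] * 20
spike = [(0.5, 0.5, 3.5)]
print(detect_taps(quiet + spike + quiet + spike))  # → [20, 41]
```

A real implementation would likely need per-user calibration and a classifier to separate deliberate taps from ordinary walking impacts.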
There are a number of possible features and affordances that could be included in this device. In an advanced version of the device, we envision users integrating the shoes into their daily lives. A user would receive tactile feedback while walking around: vibration patterns in the shoes might warn of local dangers, such as uneven terrain, or alert the user upon entering a neighborhood that is potentially less safe. Users might walk around town and save paths as a visual representation on a companion device, either a laptop or a phone, tapping their feet at key spots to save important locations. In another possible interaction loop, the shoes might take in sensor data about the user's walking tempo and pace and transform it into a musical pattern, providing environmental embodiment. Finally, this device could be particularly useful in instances where the user is carrying something and cannot use their hands, for example, a foot swipe to turn on their mp3 player or to open an automatic door.
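The walking-to-music loop above could start from something as simple as estimating tempo from step timestamps. This is a hypothetical sketch; the step times are assumed to come from the shoe's sensors, and the mapping to beats per minute is illustrative.

```python
def walking_bpm(step_times):
    """Estimate a musical tempo (beats per minute) from step timestamps.

    `step_times` is a list of step times in seconds, one entry per footfall.
    """
    if len(step_times) < 2:
        return 0.0
    # Average interval between consecutive steps.
    intervals = [b - a for a, b in zip(step_times, step_times[1:])]
    mean_interval = sum(intervals) / len(intervals)
    return 60.0 / mean_interval

# Steps every 0.5 s correspond to a brisk 120 BPM rhythm.
print(walking_bpm([0.0, 0.5, 1.0, 1.5, 2.0]))  # → 120.0
```

The resulting tempo could then drive a sequencer or adjust playback speed, keeping the music in step with the walker.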
There are a number of key differences between the hands and feet. The hands are more dexterous and more sensitive. There is a higher density of touch receptors on the hands, which enables us to distinguish subtle differences in pressure and intensity when we touch various objects. From a design standpoint, this necessitates careful planning to ensure that the user can differentiate among the different signals transmitted by our smart shoes. Currently, we believe that each foot may be divided into at least six zones, in which input to one zone may be distinguished from each of the other five. Between the two feet, this presents twelve possible zones that can receive stimuli from our device.
Additionally, there are a number of different types of signals which may be used to give the user feedback from the device. Mechanical signals include vibration (which can be transmitted in differing patterns and intensities for different signals) and changes in pressure (which could be administered through an ankle cuff). Harsher vibrations are more intrusive and would correspond to significant events, such as if the user were about to trip and fall. If feasible, thermal signals could also be used: for example, the toe of the device might warm, based on sensor data, to warn of a change in the weather.
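Combining the twelve zones described above with severity-graded vibration patterns might look like the following sketch. The zone names, pattern timings, and severity levels are assumptions for demonstration, not a specification.

```python
# Six hypothetical zones per foot, twelve in total.
ZONES = [f"{foot}_{zone}" for foot in ("left", "right")
         for zone in ("toe", "ball", "inner", "outer", "arch", "heel")]

# Pattern: (pulse duration ms, pause ms, repeats). Harsher events get
# longer, more insistent pulses, matching the intrusiveness idea above.
PATTERNS = {
    "info":    (100, 400, 1),   # e.g., a navigation hint
    "warning": (250, 250, 3),   # e.g., uneven terrain ahead
    "danger":  (500, 100, 6),   # e.g., about to trip and fall
}

def encode_signal(zone, severity):
    """Build a command dict for a hypothetical actuator controller."""
    if zone not in ZONES:
        raise ValueError(f"unknown zone: {zone}")
    pulse, pause, repeats = PATTERNS[severity]
    return {"zone": zone, "pulse_ms": pulse, "pause_ms": pause,
            "repeats": repeats}

print(encode_signal("left_heel", "warning"))
```

Keeping patterns distinct in both duration and repetition should help users tell signals apart despite the feet's lower receptor density.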
The possibilities expand when we consider multiple users. For instance, one user might connect his or her shoes to another user's to share walking directions. If users make their locations public to one another, the shoes might notify the user when friends are in the same area.
This device seeks to take something we do every day, walking, and transduce it into a smart way of planning your day, finding your way, and interacting with the world when your hands are tied.
Possible areas of expansion
- Marking presence or checking into places by kicking the door or some other gesture.
- Prevent the user from falling and help him/her navigate when he/she might be visually distracted (e.g., texting or looking up directions on Yelp).
- Communicate to the user the condition of the surface below the shoe sole.
- Help the user establish better standing posture by measuring weight balance between the two feet and between the toe and heel of each shoe.
- Provide navigation on a walking route without requiring the user to look at a phone, map, etc.
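The posture idea in the list above could be prototyped with four pressure sensors, one at the toe and heel of each shoe. The sensor layout and the 60/40 imbalance threshold below are assumptions for illustration.

```python
def balance_report(left_toe, left_heel, right_toe, right_heel):
    """Summarize weight balance from four hypothetical pressure readings.

    Returns left/right and toe/heel weight fractions plus posture hints,
    or None if no weight is detected.
    """
    total = left_toe + left_heel + right_toe + right_heel
    if total == 0:
        return None
    left_frac = (left_toe + left_heel) / total
    toe_frac = (left_toe + right_toe) / total
    hints = []
    # Assumed threshold: flag any distribution more uneven than 60/40.
    if not 0.4 <= left_frac <= 0.6:
        hints.append("shift weight between feet")
    if not 0.4 <= toe_frac <= 0.6:
        hints.append("shift weight between toe and heel")
    return {"left": left_frac, "toe": toe_frac, "hints": hints}

# Leaning heavily on the left foot triggers a left/right hint.
print(balance_report(30.0, 40.0, 10.0, 20.0))
```

Each hint could then be delivered through the vibration zones described earlier, nudging the user back toward an even stance.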