Team Members
Shubham Goel, Brian Murphy, Raymon Sutedjo-The, Kristine Yoshihara
Summary
Walking is an experience that is often directional/navigational, i.e. trying to get from point A to point B. Footlert is a type of smart shoe that encourages exploratory walking behavior through virtual tagging and physical feedback. In modern life, our hands are frequently occupied, while our feet remain underutilized. We treat handheld devices (such as smartphones) as extensions of ourselves, yet they are not a fully integrated part of our body. Almost everyone has fumbled in their bag for a camera, only to miss the moment they wished to capture. A smart shoe, on the other hand, could become a more natural extension of the body: always present and easily activated with a gesture.
These smart shoes could integrate input from sensors, accelerometers, and GPS in order to provide vibro-tactile feedback to the user. The user, in turn, can provide feedback to control settings on the device through foot gestures, such as tapping, pivoting, or sliding the foot to create electronic tags and mark up the space around them. These tags could be made accessible through an online system, so that users may upload, share, and like location tags, creating a user-curated augmented reality.
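As a rough illustration of how the three foot gestures above might be told apart, the sketch below maps sensor deltas to a gesture. The function name, inputs, and all thresholds are assumptions for illustration; the real values would come out of tuning during bodystorming.

```cpp
#include <cmath>

// Hypothetical gesture classifier for the tagging interaction.
// All thresholds are placeholder values to be tuned, not measurements.
enum class FootGesture { None, Tap, Pivot, Slide };

// pressureDelta: change in FSR reading (0-1023 ADC counts)
// yawDeltaDeg:   heading change from the IMU while the heel stays planted
// slideCm:       horizontal foot displacement estimated from the accelerometer
FootGesture classifyGesture(int pressureDelta, float yawDeltaDeg, float slideCm) {
    if (std::fabs(yawDeltaDeg) > 30.0f) return FootGesture::Pivot;  // assumed threshold
    if (slideCm > 5.0f)                 return FootGesture::Slide;  // assumed threshold
    if (pressureDelta > 400)            return FootGesture::Tap;    // assumed threshold
    return FootGesture::None;
}
```

Each recognized gesture would then create or interact with a tag at the current GPS fix.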
We have looked into a number of past works that deal with walking and vibro-tactile feedback. Their purpose is frequently "navigational", e.g. providing directions around a city or guiding movement in a TV studio setting. Our idea is more concerned with discovery and exploration of the environment. The use of a "foot camera", for instance, opens up interesting possibilities as the user captures the world around them from a ground-level perspective.
Use Case
We expect our prototype to possess both a "tagging" function for identifying and finding specific locations of interest and a "camera" function that allows a user to take photos from their feet. In combination, these two functions allow a user to give and receive spatial feedback in places where other devices are not ideal (e.g., a walk in a park) and to capture and contribute photographic information related to these places. The expected output of this prototype is both functional and artistic. It is intended to evoke an alternate perspective by imparting digital information on pristine physical objects while also capturing images of the world from a different angle.
The main input/output mechanism for the user to communicate with the prototype will involve physical feedback to and from the feet. Using force-sensitive resistors and, possibly, vibration and motion sensors, there will be a variety of options for creating gestures and alerts that work harmoniously with normal foot activity. For example, we envision a user activating the photo functionality by clicking the heels together a few times, a gesture that is familiar but abnormal in average foot behavior. Once the camera is "on," the user might then use pressure on the toe and heel to control its angle and capture images.
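One way the heel-click activation could work is to watch for a run of lateral acceleration spikes arriving close together in time. The sketch below is a minimal, hypothetical detector; the click count and window length are assumptions to be tuned so that ordinary walking does not trigger it.

```cpp
#include <vector>
#include <cstddef>

// Hypothetical heel-click detector: the camera activates once
// `requiredClicks` acceleration spikes (heel taps) land within `windowMs`.
// Spike extraction from raw IMU data would happen upstream of this.
bool heelClickActivated(const std::vector<long>& spikeTimesMs,
                        std::size_t requiredClicks = 3,
                        long windowMs = 1500) {
    if (spikeTimesMs.size() < requiredClicks) return false;
    // Look for any run of `requiredClicks` consecutive spikes inside the window.
    for (std::size_t i = 0; i + requiredClicks <= spikeTimesMs.size(); ++i) {
        if (spikeTimesMs[i + requiredClicks - 1] - spikeTimesMs[i] <= windowMs)
            return true;
    }
    return false;
}
```

Requiring several clicks in a short window is what makes the gesture deliberate: a single bump or a normal stride produces isolated spikes that never satisfy the run condition.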
We’re also considering how the camera might use motion to orient itself automatically. If we attach it to a servo motor, that motor might rotate the camera forward when a user is walking forward and then aim it skyward when a user is standing or walking up stairs. Playing with angles of photographic interpretation is an important component of this project. We’ve found that the world looks quite a bit different – and very intriguing – when viewed from the ground up. Exposing that alternative perspective to oneself and then others is a primary function of this prototype. We’ve done some preliminary testing to better understand what the end result might look like – please see the attached photos taken at the I School from ground level.
Our key next steps include additional bodystorming to identify foot gestures that are definitive but not uncomfortable, and development of the camera mount on the shoe. We expect our existing TUI kits to cover parts of the necessary interactivity; we will be looking into additional components, such as cameras, to complete our prototype. We will also explore how motion and vibration can be incorporated with other potential devices.
Materials
- Arduino board
- Camera (compatible with Arduino)
- GPS shield
- GSM shield
- Wi-Fi shield
- Accelerometer (IMU)
- Two-stage buttons
- Motors for haptic feedback
- Servo motor (for controlling camera)
- Shoe
- FSR (force-sensitive resistor)
- Battery pack
Implementation
We are planning to mount all of the electronics on the left shoe for the first prototype. The shoe will be self-contained and battery-powered. It will record location using the GPS shield and take photographs using the camera. It will connect to the internet through the GSM shield and upload the captured photographs to a remote server. The two-stage button and the FSR will be embedded in the heel of the shoe and will act as the shutter button for the camera. The accelerometer-based IMU will be used for detecting gestures such as tapping the shoe.
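Because the shutter controls live in the heel, one design question is how to keep ordinary steps from taking photos. A minimal sketch of a guard condition, under the assumption that a normal stride produces high FSR pressure while a deliberate shutter press does not:

```cpp
// Hypothetical shutter guard combining the two-stage button and the FSR.
// Fire only when the button is fully depressed AND heel pressure is below
// a walking-pressure ceiling, so a normal step (high FSR reading) cannot
// trigger the camera. The threshold is an assumed ADC value to be tuned.
bool shouldTriggerShutter(bool firstStage, bool secondStage, int fsrReading) {
    const int walkingPressureCeiling = 300;  // assumed threshold (0-1023 ADC)
    return firstStage && secondStage && fsrReading < walkingPressureCeiling;
}
```

The two-stage button mirrors a real camera shutter: half-press to arm, full press to capture.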