Midterm Project Proposal: Remote Fingers

Posted by raghavchandra



Info 262: Midterm Project Proposal


Members:
  • Apoorva Sachdev
  • Raghav Chandra
  • Karthik Lakshmanan
Problem:
As devices become more advanced, they are also becoming smaller and more complicated. Many devices today, including cell phones and TV remotes, have very small, closely spaced buttons. This makes them hard to use for the elderly and for people with limited motor skills. Remote Fingers aims to solve this problem with an intermediate device that mediates between the person and the device they want to operate.

Concept:
Imagine an array of mechanical fingers that fits over the keys of any device. The fingers are controlled from an external, large screen that provides a magnified view of the device underneath. Remote Fingers maps interaction on this larger screen onto the much smaller device itself. The magnified view lends itself naturally to reading, with the added power of being able to control the device.
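The core of this mapping can be sketched as a simple coordinate scaling: a touch at (x, y) on the magnified view is scaled down to the row and column of the key under the corresponding finger. This is a minimal illustration; the function name, screen dimensions, and keypad grid below are assumptions for the example, not part of the actual design.

```python
def touch_to_key(x, y, screen_w, screen_h, rows, cols):
    """Scale a touch point on the magnified view down to a key index."""
    col = int(x / screen_w * cols)
    row = int(y / screen_h * rows)
    # Clamp in case the touch lands exactly on the far edge of the screen.
    col = min(col, cols - 1)
    row = min(row, rows - 1)
    return row, col

# A touch near the centre of an 800x600 view of a 4x3 phone keypad
# selects the key in row 2, column 1 (the "5" key):
print(touch_to_key(400, 300, 800, 600, rows=4, cols=3))  # (2, 1)
```

The same mapping works unchanged whether the large surface is a tablet screen or a projected image, since only the surface dimensions differ.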

A user might also be able to customize the input interface to the mechanical fingers. For example, on a phone, instead of the phone keypad a user could input through a computer-keyboard-like configuration. This layer of abstraction is a powerful tool: the input layout the user sees is decoupled from the physical layout of the device.
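The remapping layer amounts to a lookup table between the user's chosen layout and the physical buttons the fingers sit over. The QWERTY-to-keypad table below is an illustrative assumption, not a proposed layout:

```python
# Hypothetical mapping: the user types on familiar keyboard keys, and each
# one is translated to the phone-keypad button a finger should press.
QWERTY_TO_KEYPAD = {
    "u": "1", "i": "2", "o": "3",
    "j": "4", "k": "5", "l": "6",
    "m": "7", ",": "8", ".": "9",
}

def remap(user_key):
    """Return the physical keypad button to press, or None if unmapped."""
    return QWERTY_TO_KEYPAD.get(user_key)

print(remap("k"))  # 5
```

Swapping in a different table changes the user-facing layout without touching the finger hardware at all, which is what makes the abstraction useful.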

Target audience:
  1. People with limited motor skills.
  2. Elderly people with poor vision.
  3. People with limited mobility who might find it hard to reach remotes and switches.
Implementation:
Our project is composed of two components that communicate with each other, ideally through a wireless connection.
The first is an array of mechanical ‘fingers’, which are electronically actuated. On an electrical signal, each finger presses down and releases, as a human finger would on a keyboard. A camera captures the real-world device below the fingers and feeds it to the second component in real time.

For the second component (the ‘remote’ interface), we would like to explore two possibilities:
  1. Use an existing touch screen device like a tablet to track finger presses. This allows for mobility, as the user need not even be near the actual device to operate it.
  2. Use a small-scale projector to display a scaled-up version of the keys below, with a camera tracking finger presses on the projection surface. This would let us use simpler, more portable, and potentially cheaper surfaces such as a piece of paper or a tabletop.
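Either way, the two components need only exchange small command messages. A minimal sketch of that protocol, assuming a JSON message format and callback-style actuator hooks (both our own assumptions; real hardware would replace the callbacks with actuator control):

```python
import json

def handle_message(raw, press, release):
    """Decode a command like {"finger": 3, "action": "press"} and act on it."""
    msg = json.loads(raw)
    if msg["action"] == "press":
        press(msg["finger"])
    elif msg["action"] == "release":
        release(msg["finger"])

# Stand-in for the finger array: track which fingers are currently down.
pressed = []
handle_message('{"finger": 3, "action": "press"}',
               press=pressed.append, release=pressed.remove)
print(pressed)  # [3]
```

Because the messages are this small, the wireless link between the remote and the finger array needs very little bandwidth; only the camera feed in the other direction is demanding.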
Areas to Explore:
  1. Enable the device to read out the options below it, similar to ABISee’s Eye-Pal Solo. This could prove useful even to blind users.
  2. Interface multiple “mechanical fingers” components so that Remote Fingers becomes one remote that can easily control several finger arrays. This would be immensely useful to users with impaired mobility (wheelchair users, bedridden people, etc.).
  3. Explore input methods other than touch, including joysticks, head-controlled gestures, etc.


[Figure: Sample usage examples]
[Figure: Dataflow diagram]