Exoskeleton Interfacing
Exoskeleton User Interface
Group: Erich, Alex, Walter
Abstract:
The goal of the system is to create an interface between an exoskeleton for paraplegics and its user. The system should integrate discreetly into the current architecture of the device or its crutches while still providing full functionality. Just as someone with full motor control of their lower extremities can decide to sit, stand, walk, speed up, or slow down, the user of the exoskeleton should be able to do the same.
Current System:
The exoskeleton the group will be using is in the UC Berkeley Human Engineering Lab. It is a powered system attached to the user via a backpack, with straps running down the legs to the feet. Each leg is powered independently of the other, depending on where the user is in the stride. The user also has a walker or crutches for support. Currently there are buttons on the walker that allow the user to stand, sit, move forward at different speeds, slow down, and stop. Because the buttons are on the handholds, the user must strain to push them, and when the user is in an unstable position, quick movements to correct him or herself may cause the user to bump the controls, which is highly dangerous.
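One way to mitigate accidental bumps is to accept a command only when its button is deliberately held, and only when the requested transition makes sense from the current state. The sketch below is purely illustrative; the class names, the hold threshold, and the allowed-transition table are assumptions, not the actual control logic of the device.

```python
# Hypothetical guard for button-driven gait commands. GaitState, ALLOWED,
# and CommandGuard are illustrative names, not taken from the real device.
from enum import Enum

class GaitState(Enum):
    SITTING = "sitting"
    STANDING = "standing"
    WALKING = "walking"

# Only sensible transitions are permitted; a bump requesting an invalid
# transition (e.g. SITTING -> WALKING) is simply ignored.
ALLOWED = {
    (GaitState.SITTING, GaitState.STANDING),
    (GaitState.STANDING, GaitState.SITTING),
    (GaitState.STANDING, GaitState.WALKING),
    (GaitState.WALKING, GaitState.STANDING),
}

class CommandGuard:
    """Accepts a state request only if the button was held long enough
    and the transition from the current state is allowed."""

    def __init__(self, hold_ms: int = 500):
        self.state = GaitState.SITTING
        self.hold_ms = hold_ms  # assumed threshold; would need tuning

    def request(self, target: GaitState, held_ms: int) -> bool:
        if held_ms < self.hold_ms:                 # brief bump: reject
            return False
        if (self.state, target) not in ALLOWED:    # nonsensical: reject
            return False
        self.state = target
        return True

guard = CommandGuard()
guard.request(GaitState.STANDING, held_ms=100)  # bump -> rejected
guard.request(GaitState.STANDING, held_ms=600)  # deliberate -> accepted
```

A hold threshold alone would not catch a sustained lean on the handhold, so a real system would likely combine this with the relocated or redesigned controls this proposal argues for.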
The device currently integrates encoders (to determine the relative position of the legs) and accelerometers and gyroscopes (to determine position relative to the ground).
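Accelerometers and gyroscopes are complementary: the gyroscope gives a smooth angular rate but drifts when integrated, while the accelerometer gives a drift-free tilt angle that is noisy. One common way to fuse them (offered here as a generic illustration, not the lab's actual algorithm) is a complementary filter:

```python
# Generic complementary filter for tilt estimation; alpha is an assumed
# blending constant, not a value from the actual exoskeleton.
def complementary_filter(angle, gyro_rate, accel_angle, dt, alpha=0.98):
    """Blend the gyro-integrated angle (smooth but drifting) with the
    accelerometer-derived angle (noisy but drift-free). alpha near 1
    trusts the gyro on short timescales."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * accel_angle

# Simulated standstill at 10 degrees of tilt: the gyro reads ~0 deg/s and
# the accelerometer reads 10 degrees; the estimate converges toward 10.
angle = 0.0
for _ in range(500):
    angle = complementary_filter(angle, gyro_rate=0.0,
                                 accel_angle=10.0, dt=0.01)
```

In practice a Kalman filter is often used instead; the complementary filter is shown only because it makes the encoder/IMU division of labor easy to see.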
Proposal:
By integrating the existing and additional encoders and accelerometer/gyro units, this project aims to create an interface between the paraplegic and his/her exoskeleton. The user interface must give the user full control, as if the legs were his/her own. It must also avoid placing excessive physical or mental strain on the user. This includes the need for discreetness, to avoid embarrassing movements or requirements. By offering a feedback system using LEDs and/or sound, the user could have a seamless interface while still ensuring correct actions by the device.
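The feedback loop could, for example, announce a pending command with a cue and execute it only on confirmation, so the device never acts on a single ambiguous input. The sketch below is a minimal illustration of that idea under assumed names; the cue strings stand in for a real LED or sound driver.

```python
# Hypothetical confirm-before-act feedback loop. FeedbackInterface and the
# "beep:"/"led:" cue strings are illustrative stand-ins for real hardware.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class FeedbackInterface:
    pending: Optional[str] = None
    cues: List[str] = field(default_factory=list)      # LED/sound events
    executed: List[str] = field(default_factory=list)  # commands acted on

    def press(self, command: str) -> None:
        if self.pending == command:
            # Second, matching press: confirm and execute.
            self.cues.append("led:green")
            self.executed.append(command)
            self.pending = None
        else:
            # First press (or a different command): announce, don't act.
            self.pending = command
            self.cues.append("beep:" + command)

ui = FeedbackInterface()
ui.press("stand")  # first press: cue only, nothing executes
ui.press("stand")  # matching second press: command executes
```

The cost of this scheme is an extra press per action, so whether it is worth the added safety margin would have to be evaluated with the user.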
Further:
In addition to the buttons, encoders, and accelerometers/gyros, other possible interfacing systems include a non-intrusive brain-machine interface (to which we have access) and voice activation. Anything is possible, though.