Low-cost 3D-printed prosthetic hand to be tested on amputees in Ecuador
November 5, 2014
The electromyographic (EMG) prosthetic hand uses a machine-learning algorithm and pattern recognition to extend its functionality beyond mere open and close actions. An initial training period of one to two minutes takes patients through each of five mathematically modeled gestures: a hand at rest, open-faced, closed, performing a three-finger grasp, and performing a fine pinch. The machine learning then kicks in, letting the microcontroller in the prosthetic figure out which grip the patient is trying to make and replicate it.
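The calibration-then-classification approach described above can be sketched in a few lines. This is an illustrative baseline, not the team's actual algorithm: it uses mean-absolute-value (MAV) features per EMG channel and a nearest-centroid rule, both invented here as a minimal stand-in for the pattern recognizer; the gesture names follow the article.

```python
import numpy as np

# Hypothetical sketch: classify a window of multi-channel EMG into one of
# the five trained gestures using mean-absolute-value (MAV) features and
# a nearest-centroid rule fit during the short calibration session.

GESTURES = ["rest", "open", "closed", "three_finger_grasp", "fine_pinch"]

def mav_features(window):
    """Mean absolute value per channel; window is (samples, channels)."""
    return np.mean(np.abs(window), axis=0)

def train_centroids(training_windows):
    """training_windows maps gesture name -> list of EMG windows."""
    return {g: np.mean([mav_features(w) for w in ws], axis=0)
            for g, ws in training_windows.items()}

def classify(window, centroids):
    """Pick the gesture whose centroid is nearest in feature space."""
    feats = mav_features(window)
    return min(centroids, key=lambda g: np.linalg.norm(feats - centroids[g]))
```

With synthetic low-amplitude "rest" windows and high-amplitude "closed" windows, the classifier separates the two reliably, which is the essence of the one-to-two-minute training step.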
This is possible because muscle contractions produce an electrical signal. That signal is first sent to an EMG board, then on to a microprocessor that converts it into commands for the motor drivers that make the hand move. The EMG board is currently the size of an audio mixing board, but the team plans to eventually shrink it down to fit into the socket of a patient’s residual limb.
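A hedged sketch of that signal path: raw EMG samples are rectified and smoothed into an activation envelope, which is then mapped to a motor-driver command. The smoothing constant, threshold, and command names are illustrative placeholders, not the team's firmware.

```python
# Minimal sketch of the pipeline: raw EMG -> rectified/smoothed envelope
# -> motor-driver command. All constants here are invented for illustration.

def emg_envelope(samples, alpha=0.05):
    """Rectify and exponentially smooth a stream of raw EMG samples."""
    env = 0.0
    out = []
    for s in samples:
        env = (1 - alpha) * env + alpha * abs(s)
        out.append(env)
    return out

def motor_command(envelope_value, close_threshold=0.5):
    """Map the smoothed muscle-activation level to a driver command."""
    return "CLOSE" if envelope_value >= close_threshold else "OPEN"
```

In a real controller this loop would run continuously on the microprocessor, with the envelope (or richer features, as in the pattern-recognition step) driving the motor drivers.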
A meeting with Range of Motion Project co-founder David Krupa led to the students spending two weeks in the Ecuadorian capital Quito in August, where they managed to get the hand working on Juan Suquillo, who lost his left hand in a war with Peru some 33 years ago. They plan to return in early January with a new iteration of the prosthetic, which they will leave behind with a patient.
This new version will replace the motors with linear actuators for added strength, energy efficiency, and battery life. It will also switch to a four-bar linkage so that the tendons can bend more than one joint at a time. “The four-bar linkage will allow the joints to bend more smoothly and naturally,” said team member Patrick Slade. “It will also be more robust and simpler to maintain.”
Perhaps more intriguingly, the next prototype will provide sensory feedback – something that is also being worked on at DARPA and several other institutions, but has not yet reached commercial application.
“We’re going to put sensors in the fingers,” said team leader Aadeel Akhtar. “Based on the amount of force that the fingertips are detecting, we are going to send a proportional amount of electrical current across your skin to stimulate your sensory nerves. By stimulating your sensory nerves in different ways with different amounts of current, we can make it feel like vibration, tingling, pain, or pressure.”
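The proportional mapping Akhtar describes can be sketched as a simple clamped linear function from fingertip force to stimulation current. The force and current limits below are invented placeholders, not the team's actual stimulation parameters.

```python
# Illustrative sketch: fingertip force maps linearly onto a stimulation
# current, clamped to a safe range. Limits are hypothetical placeholders.

MAX_FORCE_N = 10.0     # assumed full-scale fingertip force (newtons)
MAX_CURRENT_MA = 5.0   # assumed upper bound on stimulation current (mA)

def stimulation_current(force_n):
    """Return a current proportional to the measured fingertip force."""
    force_n = max(0.0, min(force_n, MAX_FORCE_N))
    return MAX_CURRENT_MA * force_n / MAX_FORCE_N
```

Varying the waveform and amplitude of this current is what, per Akhtar, lets the stimulation feel like vibration, tingling, pain, or pressure.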
Moreover, the new hand will make a mechanical connection – what the team describes as a passive linear skin-stretch device – from one of the artificial fingers directly to the skin, so that users can distinguish between different grips without looking at their prosthetic hand. In lab testing of this connection, subjects classified six different grips with 88 percent accuracy, as outlined in this paper from the EuroHaptics 2014 conference.
The hand is neither the first 3D-printed prosthetic in general nor the first 3D-printed prosthetic hand in particular. Previous efforts have replaced a cancer patient’s upper jaw, provided a $50 alternative for a man with a $42,000 myoelectric prosthesis, and just recently transformed knee replacement surgery, while an online network is gaining steam in bringing 3D-printed prosthetic hands to the world that are rather simpler than commercial offerings or even the UIUC team’s work.