March 31, 2014

A 15-year-old high school sophomore has designed and built a low-cost modular prosthetic arm and hand controlled by the wearer’s brainwaves.

Shiva Nathan, of Westford, MA, says he designed and built his Arduino Prosthesis after a cousin in India lost both her arms in an explosion. The prosthetics she was using were expensive and not very good, he explained to Design News. So he set out to build something better.

Nathan’s prototype uses an off-the-shelf NeuroSky MindWave Mobile headset, which uses an EEG pickup to capture the wearer’s brainwaves. The headset sends data over Bluetooth to an open-source Arduino microcontroller board, which actuates the servos that move the hand. The MindWave device sends two channels of data, which it terms “attention” and “meditation.” Nathan has used these to control finger flexing and elbow movement. Aside from the electronics, most of his prosthetic is made of acrylic.

Nathan entered his design in the 2013 National microMedic Contest and won the top prize in the student division. Entrants were challenged to use microcontroller and sensor systems to create medical applications and products for possible use in the healthcare industry, medical simulation training, and the battlefield. The contest was hosted and sponsored by the U.S. Army Telemedicine and Advanced Technology Research Center (TATRC), Carnegie Mellon Entertainment Technology Center, and Parallax Inc.

Shiva Nathan’s Arduino Prosthesis. (Courtesy learn.parallax.com)

He also won the top student prize in the 2014 Bluetooth Breakthrough Awards. These awards recognize the most innovative uses of Bluetooth technology in products, applications, prototypes under development, and student-led projects in the concept phase.

Nathan also writes iPhone apps and takes pre-college classes at the Massachusetts Institute of Technology (MIT). According to an article in the Boston Globe, he is now using Pupil, an eye-tracking technology developed at MIT, in a more advanced arm. He hopes to convert the eye-tracking signals into impulses that move individual fingers simply by looking at them.

Nathan told the Globe that he hopes to someday work on robots at Google or iRobot Corp. in Bedford, MA, or perhaps to launch a prosthetics business.

Stephen Levy is a contributor to Qmed and MPMN.

