Inspiration

One of my friends has a prosthetic arm. I wanted to build something impactful for him, but alone (I'm a neuroscience and biology major), I didn't have the skillset to do so. However, at nwHacks, I met a partner with a skillset that complemented mine (computer science and mathematics), and together we embarked on a wonderful journey to make sensory perception possible in prosthetic arms.

One of us had also recently been awarded a Neosensory Buzz haptic feedback device at another hackathon (and it conveniently comes with an Arduino SDK), so we knew right away what we had to do to make our project a reality. 24 hours and lots of hot glue burns and fried circuits later, we're excited to announce that we successfully developed a microcontroller-based device that uses haptic feedback to allow people with prosthetic arms to feel human touch again.

What it does

Our device, SensiGlove, uses capacitive touch sensors and an Adafruit nRF52840-based microcontroller core, built into a glove (compatible with any prosthetic arm), to detect human touch. It then sends corresponding signals over Bluetooth Low Energy (through the Adafruit Bluefruit library) to the Neosensory Buzz haptic feedback device, which lets the wearer feel those touch signals as vibrations produced by linear resonant actuators (fancy motors) that stimulate Pacinian corpuscles (fancy pressure receptors in the skin) at various intensities. Over time, people with prosthetic arms can learn to assimilate these signals as an additional sensory input, just like touch.
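
For a concrete sense of that data flow, here's a minimal Arduino sketch of the touch-to-vibration path. The SDK calls (NeoBluefruit.begin(), startScan(), isConnected(), isAuthorized(), vibrateMotor()) follow our memory of the Neosensory SDK for Arduino examples and should be double-checked against the library, and readTouchLevel() is a hypothetical stand-in for the glove's actual sensing circuit, with made-up pin and scaling values:

```cpp
#include <neosensory_bluefruit.h>

NeosensoryBluefruit NeoBluefruit;

const int TOUCH_PIN = A0;  // aluminum touch pad wired to an analog pin (assumed wiring)

// Hypothetical helper: turn a raw pad reading into a 0.0-1.0 touch level.
// The real glove uses capacitive sensing; the exact circuit isn't shown here.
float readTouchLevel(int pin) {
  int raw = analogRead(pin);                 // 0-1023, assuming the default 10-bit ADC
  return constrain(raw / 1023.0, 0.0, 1.0);
}

void setup() {
  NeoBluefruit.begin();      // bring up BLE on the nRF52840
  NeoBluefruit.startScan();  // look for the Neosensory Buzz
  // (The SDK's connected callback also needs to authorize developer mode;
  // see the handshake sketch under "How we built it".)
}

void loop() {
  if (NeoBluefruit.isConnected() && NeoBluefruit.isAuthorized()) {
    float level = readTouchLevel(TOUCH_PIN);
    NeoBluefruit.vibrateMotor(0, level);  // stronger touch -> stronger vibration
  }
  delay(50);  // roughly 20 updates per second
}
```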

SensiGlove also has two additional modes that give the wearer extra sensory capabilities. The user can activate and cycle between these modes simply by sliding the switch onboard the microcontroller core. Temperature sense mode provides thermal perception in the arm (accurate to within 1 degree Celsius) via both haptic feedback and LED visualization. Light sense mode lets users feel light intensity as a sense, effectively allowing them to deduce the brightness of their environment while completely blindfolded.
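
Here's a rough sketch of how the mode cycling could look in code. The pin assignments, mapping ranges, and sensor-reading helpers are illustrative assumptions, not the project's actual values:

```cpp
enum Mode { TOUCH_SENSE, TEMP_SENSE, LIGHT_SENSE };
Mode mode = TOUCH_SENSE;

const int SWITCH_PIN = 7;         // slide switch on the microcontroller core (assumed pin)
int lastSwitchState = HIGH;

// Placeholder sensor reads; the real conversions depend on the exact parts used.
float readTouchLevel()   { return analogRead(A0) / 1023.0; }
float readLightLevel()   { return analogRead(A2) / 1023.0; }
float readTemperatureC() { return analogRead(A1) / 1023.0 * 100.0; }  // assumed analog thermometer

void setup() {
  pinMode(SWITCH_PIN, INPUT_PULLUP);
  lastSwitchState = digitalRead(SWITCH_PIN);
}

void loop() {
  // Each flip of the slide switch advances to the next mode.
  int switchState = digitalRead(SWITCH_PIN);
  if (switchState != lastSwitchState) {
    mode = static_cast<Mode>((mode + 1) % 3);
    lastSwitchState = switchState;
    delay(20);                    // crude debounce
  }

  float intensity = 0.0;          // 0.0-1.0 haptic intensity for the Buzz
  switch (mode) {
    case TEMP_SENSE:
      // Map roughly 15-45 degrees C onto the full vibration range.
      // (Temperature mode also drives an LED readout; omitted here.)
      intensity = constrain((readTemperatureC() - 15.0) / 30.0, 0.0, 1.0);
      break;
    case LIGHT_SENSE:
      intensity = readLightLevel();   // brighter room -> stronger vibration
      break;
    default:
      intensity = readTouchLevel();   // touch mode, as in the sketch above
  }
  // Send `intensity` to the Buzz here, as in the touch sketch above.
  delay(50);
}
```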

How we built it

We used the Arduino IDE, along with a number of Adafruit libraries (such as Bluefruit for BLE communication) and the Neosensory SDK for Arduino, for the software component of our project.
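
Beyond reading sensors, most of the software work is the BLE handshake with the Buzz. The sketch below mirrors the structure of the Neosensory SDK's connect-and-vibrate example as we remember it; treat the callback and method names as assumptions to verify against the library:

```cpp
#include <neosensory_bluefruit.h>

NeosensoryBluefruit NeoBluefruit;

// Runs once the nRF52840 connects to the Buzz over BLE.
void onConnected(bool success) {
  if (!success) return;
  // Take manual control of the Buzz so the glove's data drives the motors
  // instead of the Buzz's built-in sound-to-touch algorithm.
  NeoBluefruit.authorizeDeveloper();
  NeoBluefruit.acceptTermsAndConditions();
  NeoBluefruit.stopAlgorithm();
}

void setup() {
  NeoBluefruit.begin();                          // Bluefruit BLE stack under the hood
  NeoBluefruit.setConnectedCallback(onConnected);
  NeoBluefruit.startScan();                      // find and connect to the Buzz
}

void loop() {
  // Sensor reads and vibrateMotor() calls go here once isConnected()
  // and isAuthorized() both return true.
}
```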

Our hardware was built using an Adafruit nRF52840 with capacitive touch sensors, a light sensor, a thermometer, thin conductive aluminum sheets, wires, and a cut-up glove.

Challenges we ran into

One of the biggest challenges we ran into was the fact that my arm is not a prosthetic arm, which caused the capacitive touch sensors at each finger to trigger unintentionally, even through the glove I wore to mimic a prosthetic hand. To compensate, we had to switch from our well-designed, high-mobility conductive finger-cap system to a less mobile and less aesthetic metal-pad system (solely for demonstration purposes, not for practical use). You can see a comparison of the two systems in the image gallery. However, this version of the design could actually be perfect for individuals who still have their arm but have lost the sense of touch in it.

Accomplishments that we're proud of

We're proud that we were able to build something that could be of real use to our friend. We're also proud that we not only finished a fully functional model of our device, but also empowered the wearer with extra senses through the switchable modes. We think that with SensiGlove, we're one step closer to a prosthetic arm feeling like a set of sensory superpowers rather than a hindrance. We honestly didn't think we would finish a working prototype in these 24 hours, and we're super surprised at the overall result. Thank you, nwHacks, for the opportunity to grow and learn!

What we learned

We're both JavaScript/Python type guys, so using Arduino and C++ was a big step away from our usual areas of focus. We learned a lot about the overall process of designing perceivable senses for humans through haptic feedback, including the strategies best suited to each type of input data. We also learned how to work with Bluetooth Low Energy on microcontrollers and got some much-needed experience with Arduino and C++.

What's next for SensiGlove

We created some pretty neat senses in these 24 hours, but given enough time, the possibilities with our device are endless. We could add a haptic feedback system for proprioception, or use an ESP WiFi module to let SensiGlove wearers feel stock market prices in real time through haptic feedback. We're really hoping to keep building projects like this in the future, and hopefully take our current one to the next level by adding force-sensitive resistors.
