Inspiration

There are nearly 2 million people living with limb loss in the United States. Approximately 60 to 80% of amputees experience phantom sensations in their amputated limb, and the majority of these sensations are painful (a condition known as "phantom limb pain" (PLP)). Pharmacotherapy, surgery, and traditional adjuvant therapies (e.g. physiotherapy, massage, and ultrasound) are not consistently effective. Clinical research has shown that mirror therapy, in which mirrors are positioned to visually convince the amputee that the missing limb is still present, is effective for some patients. We believe that an AR system that enables myoelectric control of a virtual limb and provides a visual simulation of the missing limb will be an effective and useful tool in the arsenal of treatments against PLP. Clinical studies have demonstrated that AR can be highly effective in reducing PLP (see, e.g., http://journal.frontiersin.org/article/10.3389/fnins.2014.00024/full, https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3935120/, https://www.ncbi.nlm.nih.gov/pubmed/19191061), but there are currently no commercial, public implementations. Moreover, we wanted to leverage the strengths of computational rehabilitation systems: tracking patient progress, adjusting task difficulty, and engaging patients.

In addition to visualizing the lost limb, we also want to enable the virtual arm to interact with the real world in useful applications. We believe this will be more convincing and effective for patients suffering from PLP. In the long run, the AR system may even be able to transcend therapy and enhance the day-to-day functionality of people who are missing limbs.

What it does

Our HoloLens application uses myoelectric input from the iMotions Shimmer3 ECG/EMG sensor to predict the amputee's intended muscle movements. We use this input to create an AR visualization of a hand that moves and interacts with the real world. The resulting experience gives amputees the illusion that they have "regained" their lost limb. This responsive visual feedback helps patients slowly reshape their cortical map (the internal brain map that keeps track of the body's limbs) and reduce PLP.
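
To give a sense of how myoelectric input can drive the hand, here is a minimal sketch of the kind of activation detection involved, assuming a stream of raw EMG samples. The class name, window size, and threshold are illustrative placeholders, not our exact implementation:

```csharp
// Illustrative only: a sliding-window envelope detector for raw EMG samples.
// The window size and threshold are placeholders, not our tuned values.
using System;
using System.Collections.Generic;

public class EmgActivationDetector
{
    private readonly Queue<double> _window = new Queue<double>();
    private readonly int _windowSize;
    private readonly double _threshold;
    private double _sum;

    public EmgActivationDetector(int windowSize, double threshold)
    {
        _windowSize = windowSize;
        _threshold = threshold;
    }

    // Feed one raw EMG sample; returns true while the muscle reads as active.
    public bool AddSample(double rawSample)
    {
        double rectified = Math.Abs(rawSample);      // full-wave rectification
        _window.Enqueue(rectified);
        _sum += rectified;
        if (_window.Count > _windowSize)
            _sum -= _window.Dequeue();               // keep a sliding window
        double envelope = _sum / _window.Count;      // smoothed signal envelope
        return envelope > _threshold;                // crude activation test
    }
}
```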

We also enabled the virtual limb to affect the real world. For example, the virtual limb can start music playing on a computer in the real world.
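
Conceptually, such an interaction boils down to pushing an event from the app to a machine in the room. A rough sketch, in which the endpoint address and JSON payload are assumptions rather than our actual protocol:

```csharp
// Illustrative only: pushing a "play music" event from the app to a computer
// in the room. The endpoint address and JSON payload are assumptions.
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

public static class RealWorldActions
{
    private static readonly HttpClient Client = new HttpClient();

    public static async Task PlayMusicAsync()
    {
        var payload = new StringContent(
            "{\"action\":\"play_music\"}", Encoding.UTF8, "application/json");
        await Client.PostAsync("http://192.168.0.10:5000/events", payload);
    }
}
```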

How we built it

Our project is open source. We created a backend for the iMotions platform that sends data to the HoloLens. This backend works with any sensor supported by the iMotions platform and is used to push events to the HoloLens, so other developers can reuse it to build HoloLens projects featuring iMotions biometric sensors. We also created an algorithm that removes noise and detects changes in muscle movements from the iMotions Shimmer3 ECG/EMG. Other developers can likewise use this algorithm to create AR experiences linked to myoelectric signals.
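
In spirit, the backend is a small relay: it receives processed sensor events on the local machine and pushes them on to the HoloLens. A minimal sketch, assuming UDP transport; the ports and the HoloLens address are placeholders:

```csharp
// Illustrative only: relay events from the local sensor pipeline to the
// HoloLens over UDP. Ports and the HoloLens IP address are placeholders.
using System.Net;
using System.Net.Sockets;

public class HoloLensRelay
{
    public static void Main()
    {
        var listener = new UdpClient(9000);   // events from the sensor pipeline
        var sender = new UdpClient();
        var holoLens = new IPEndPoint(IPAddress.Parse("192.168.0.42"), 9001);

        while (true)
        {
            IPEndPoint source = null;
            byte[] datagram = listener.Receive(ref source);     // blocking receive
            sender.Send(datagram, datagram.Length, holoLens);   // forward unchanged
        }
    }
}
```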

We wanted the graphics to be immersive, so we used a realistic-looking hand model. Our HoloLens application is built with Unity and can both pull events from and push events to the real world. We have a complete event-management chain that relies on a local backend, a local server, and an external server.
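
On the Unity side, a component along these lines can receive those events and drive the hand model's animation. This is a sketch, not our exact code: the port, message strings, and Animator trigger names are assumptions:

```csharp
// Illustrative only: a Unity component that listens for backend events and
// drives the hand model's Animator. Port, message strings, and trigger
// names are assumptions, not our exact implementation.
using System.Net;
using System.Net.Sockets;
using System.Text;
using UnityEngine;

public class HandEventReceiver : MonoBehaviour
{
    public Animator handAnimator;       // assign the hand's Animator in the Inspector
    private UdpClient _client;

    void Start()
    {
        _client = new UdpClient(9001);  // same port the backend pushes events to
    }

    void Update()
    {
        while (_client.Available > 0)   // drain pending events without blocking the frame
        {
            IPEndPoint source = null;
            string message = Encoding.UTF8.GetString(_client.Receive(ref source));
            if (message == "muscle_active")
                handAnimator.SetTrigger("CloseHand");
            else if (message == "muscle_relaxed")
                handAnimator.SetTrigger("OpenHand");
        }
    }

    void OnDestroy()
    {
        _client.Close();
    }
}
```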

Challenges we ran into

The iMotions Shimmer3 did not work at the beginning; it took a lot of tinkering to get it running. Connecting all the application modules (the Shimmer, the iMotions platform, the HoloLens, the backends, and the servers) was inherently complex. We were not experts in Unity or the HoloLens, so we faced a steep learning curve. Finally, the HoloLens SDK runs only on Windows, and we had only one Windows computer.

Accomplishments that we're proud of

We built end-to-end connectivity from the iMotions sensor to the HoloLens, and we have a working demo!

What we learned

We started off knowing very little about Unity and the HoloLens, and over the past couple of days we learned a lot about both! We also learned how to integrate iMotions sensors with the HoloLens.

What's next for HelpingHand

We did not have time to build all the features that we wanted. We currently have simple actions and only one interaction between the virtual arm and a real object. We want to add (1) a full range of motion and (2) more interactions between the virtual arm and real objects. Ultimately, it would be cool for the virtual limb to interact with many real-world smart objects.

Notes

Team Lead: Julien Bouvier

Phone: 781-298-8165

Location: Room E14-674, Table 38

Vertical Category: Human Well-Being (Education/Health/Wellness/Activism)

GitHub: https://github.com/bouviervj/RealityVirtually_HelpingHand
