Inspiration

The inspiration behind this project comes from a desire to bridge the gap between physical rehabilitation and immersive technology for individuals with limb differences, such as amputees. Traditional prosthetic training can be slow, frustrating, and disconnected from meaningful progress. We envisioned a future where amputees could train, game, and rehabilitate through real-time, intuitive interactions, helping them rebuild confidence and motor skills in a way that feels both rewarding and empowering.

By combining low-cost hardware like an Arduino and resistive strain sensors with the creative flexibility of Unity, we sought to make advanced neuroadaptive rehabilitation tools more accessible and engaging. Our goal was to show that with a few key components and an ambitious idea, we can reshape how people with disabilities interact with the digital world and, eventually, with their own prosthetic devices.

Ultimately, we believe that empowering users to "move beyond their limits" isn't just a technical challenge but a human one, worth the time and effort. We all deserve to feel empowered and accommodated, especially in a world where life-changing technologies are emerging.

What it does

This project creates an interactive virtual rehabilitation system by linking real-world robotic feedback to a virtual environment. A robotic hand, programmed through the Arduino IDE and equipped with resistive strain sensors, is used to grasp a physical ball. When contact and grasping are detected via the sensors, a corresponding virtual hand in Unity mirrors the action in real time, recognizing when the ball has been successfully picked up.

The system is designed not only as a tool for immersive gameplay but as a research platform to study and compare the neuromuscular signals generated by an amputee’s residual limb (nub) versus those from a healthy human forearm and hand. By capturing real-world interaction data and synchronizing it with a virtual model, this technology provides valuable insight into improving prosthetic control systems and rehabilitation strategies.

Ultimately, this project aims to empower user movements and shape the future of rehabilitation by bridging physical and virtual motor control in an accessible and measurable way.

How we built it

First, a robotic hand was assembled and programmed using the Arduino IDE. A resistive strain sensor is placed on a finger of the hand to detect pressure and touch, specifically identifying when the hand makes contact with the ball.

The Arduino collects the sensor data and interprets changes in pressure as a "grasping event." Once a grasp is detected, a signal is sent to a Unity application through a serial connection. In Unity, a virtual hand model mirrors the real-world hand's actions. When the system recognizes that the real-world ball has been picked up, it triggers the virtual ball to be grasped and manipulated in the game environment.

Special care was taken to calibrate the sensors to accurately detect different levels of force, ensuring that light touches wouldn't falsely trigger a grasp event. The system architecture allows researchers to input data either from an amputee's residual limb (nub) or from a healthy human forearm, and to analyze the differences in force feedback and control strategies between the two.

Through this setup, the project enables simultaneous real-world robotic interaction and virtual environment feedback, creating a foundation for future rehabilitation training and prosthetic control research.

Challenges we ran into

We had some trouble coding our GUI to create graphs from the three columns of data we collected from the simulation; however, with the aid of GitHub Copilot, we were able to identify the issues and fix them accordingly.

Accomplishments that we're proud of

We successfully hooked up the in-game hand to reflect what the real-world sensor's signals read. Because of this, the sensor can be placed on an amputee's "nub" or on a typical human finger or other body part, and either option results in the in-game hand moving appropriately.

What we learned

  • How to connect real-world sensors to virtual environments in Unity
  • How to process and clean noisy sensor signals
  • The differences between amputee and healthy muscle signals, and what they mean for potential prosthetics applications
  • How to build a prototype under pressure
  • How to pivot when the initial hardware or code plan didn't work!

What's next for Cybathlon VR

We plan to use this prototype as a research tool to study how amputees interact with prosthetic devices. By integrating machine learning and expanding to EMG signal analysis, we aim to better understand motor control strategies and improve future prosthetic designs. Our ultimate goal is to help shape the next generation of more intuitive, responsive prosthetic technologies.

Built With

  • api
  • arduinoide
  • piezoelectricsensors
  • unity