The National Institutes of Health reports that up to 63% of recent amputees experience clinical depression. Going through an amputation is a frightening and stressful process, and we created Prosthetic AR to help ease that stress. With Prosthetic AR, recent amputees can see what a prosthetic limb looks like on their body using nothing more than their laptop camera.
What it does
Prosthetic AR allows people to see what they would look like with a prosthetic limb before purchasing one. We use machine learning to determine where the joints of the body are, then use that data to attach models of prosthetic limbs at the appropriate joints.
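The joint-detection step can be sketched as follows. This is a minimal illustration, assuming PoseNet-style keypoints (`{ part, score, position: { x, y } }`) like those produced by TensorFlow.js pose models; the helper name `findAnchorJoint` is hypothetical, not the project's actual code.

```javascript
// Pick the anchor joint for a prosthetic overlay from a list of
// detected keypoints. Keypoints follow the PoseNet-style shape:
// { part: "leftElbow", score: 0.92, position: { x, y } }.
// Hypothetical helper; the real pipeline's names may differ.
function findAnchorJoint(keypoints, part, minScore = 0.5) {
  const kp = keypoints.find((k) => k.part === part);
  if (!kp || kp.score < minScore) return null; // joint not detected reliably
  return { x: kp.position.x, y: kp.position.y };
}

// Example: anchor a forearm prosthetic at the left elbow;
// the low-confidence wrist detection is rejected.
const keypoints = [
  { part: "leftElbow", score: 0.92, position: { x: 120, y: 200 } },
  { part: "leftWrist", score: 0.21, position: { x: 150, y: 260 } },
];
const anchor = findAnchorJoint(keypoints, "leftElbow");
// anchor → { x: 120, y: 200 }
```

Filtering on the keypoint score keeps the overlay from jumping to spurious detections when a joint is occluded.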
How I built it
We built the website using ReactJS and TensorFlow.js. The frontend user interface is simple but highly functional; the bulk of the work went into the backend logic that consumes the data produced by our ML models. We used algebraic triangulation to find the positions of the limbs and estimate where the overlay image should go, and quadratic regression to calculate the angle of the image so that it attaches seamlessly to the body.
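The angle step can be illustrated with a simplified stand-in: with just two adjacent joints, the limb orientation reduces to a single `atan2`. This is an assumption-laden sketch, not the quadratic-regression code the project actually uses, and the function name is hypothetical.

```javascript
// Estimate the rotation (in degrees) needed to align a prosthetic
// overlay image with the limb, given two adjacent joint positions
// in image coordinates. Simplified stand-in for the quadratic-
// regression step described above: with only two joints, the
// orientation reduces to atan2 of the joint-to-joint vector.
function overlayAngleDeg(jointA, jointB) {
  const dx = jointB.x - jointA.x;
  const dy = jointB.y - jointA.y;
  return (Math.atan2(dy, dx) * 180) / Math.PI;
}

// Example: elbow at (0, 0), wrist at (100, 100) → 45 degrees,
// so the overlay image is rotated 45° before being drawn.
const angle = overlayAngleDeg({ x: 0, y: 0 }, { x: 100, y: 100 });
```

Rotating the overlay by this angle (for instance via a CSS `transform: rotate(...)` or a canvas rotation) is what keeps the prosthetic visually attached to the limb as it moves.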
Challenges I ran into
This was our first time using TensorFlow, so learning the API took some time, though it was not as difficult as we had expected. There was also a surprising amount of math involved in calculating the estimated locations of the prosthetic limbs. Additionally, our program sometimes runs slowly, and the visual lag is noticeable.
Accomplishments that I'm proud of
We are proud that we finished a working proof of concept that does its job. We're also proud that we were able to realize our vision and execute it in a way that none of us would have expected.
What I learned
We learned how to use the TensorFlow API and how to incorporate sophisticated mathematical calculations into our application.
What's next for Prosthetic AR
We hope to improve our current limb-tracking system by making it more accurate through further training of the ML models. We also hope to expand on our vision and eventually incorporate a 3D tracking system so that users can view their virtual prosthetic limb from all angles.