Inspiration

Doctors, students, and upcoming surgeons are striving to better themselves, whether in private practice or in general surgical work. Whether for academic or industrial reasons, upcoming surgeons have barely any material to consult when learning how to perform a specific surgery. We made Project Open Heart to offer a better understanding of which techniques and methods to use in a surgery, and ultimately to lock in the concepts and surgical steps through VR.

What it does

We provide a full system for a legitimate surgery. The first part of the system is our "Nurse," Alexa. Alexa helps us retrieve the specific tools needed for our surgery: through simple requests such as "bring me the scalpel," Alexa responds by adding a scalpel to your hand in the virtual reality environment. The second part is connecting with the Oculus and being hands-on with the entire surgery experience. We provide recordings of the surgery so other people can analyze mistakes and see how a specific surgery is conducted. You can also record your own actions and save them to a file, which can be played back later so you or others can review how you performed and what you did. Lastly, you can stream your surgery to anyone who wants to watch with our setup. Using Firebase, we sync data between the host and the clients with minimal effort.
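The record-and-replay feature described above boils down to logging timestamped events to a file and reading them back in order. The actual app does this with Unity hand and tool transforms; the sketch below is a minimal Python illustration of the same idea, and all class and method names in it are hypothetical.

```python
import json
import time

class SurgeryRecorder:
    """Minimal sketch of record/replay: log timestamped events
    (tool pickups, gestures) and save them for later playback.
    The real app serializes Unity transforms; this is illustrative."""

    def __init__(self):
        self.events = []
        self.start = time.monotonic()

    def log(self, action, detail):
        # Store the elapsed time so playback can reproduce pacing.
        self.events.append({
            "t": round(time.monotonic() - self.start, 3),
            "action": action,
            "detail": detail,
        })

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.events, f)

    @staticmethod
    def load(path):
        with open(path) as f:
            return json.load(f)

# Record a short session, then replay it from disk.
rec = SurgeryRecorder()
rec.log("tool", "scalpel")
rec.log("gesture", "incision")
rec.save("session.json")

for event in SurgeryRecorder.load("session.json"):
    print(event["t"], event["action"], event["detail"])
```

Because each event carries its own timestamp, a playback client (or a remote viewer receiving the same events through Firebase) can replay the session at the original speed or scrub through it.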

How we built it

The system has two parts: an Alexa app and a Unity app. The Alexa app leverages the Firebase API, AWS Lambda, and the Alexa API. When a certain task is invoked, a function in our Alexa app sends an update to our Firebase database, allowing live communication between the Nurse and the Surgeon. On the surgeon side, we use the Oculus Rift to deliver a fully hands-on open heart surgery experience. We used 3D models to represent a human body, including skin, flesh, bone, and organs. We attached a Leap Motion to the Oculus so we can track hand movements when the surgeon performs a gesture during the surgery. Once synced with Firebase, the performed surgery becomes more widely accessible, and that syncing is also handled through the Unity application.
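The Alexa-to-Unity path above can be sketched end to end: an Alexa intent triggers a Lambda handler, which writes the requested tool to Firebase; the Unity client, listening on that database path, then spawns the tool in the scene. Below is a hedged Python sketch of the Lambda side, using Firebase's REST API (a PUT of JSON at `<db-url>/<path>.json`) rather than any SDK. The database URL, the `/requests/current` path, and the slot name `tool` are assumptions for illustration, not our exact configuration.

```python
import json
import urllib.request

# Hypothetical database URL; the real one comes from the Firebase console.
FIREBASE_URL = "https://example-project.firebaseio.com"

def build_tool_update(tool):
    """Payload written under /requests; the Unity client listening
    on that path reacts by adding the tool to the surgeon's hand."""
    return {"tool": tool, "status": "requested"}

def push_tool_request(tool):
    """Write the request via Firebase's REST API: a PUT of JSON
    at <db-url>/<path>.json. (Network call; needs a live database.)"""
    data = json.dumps(build_tool_update(tool)).encode()
    req = urllib.request.Request(
        f"{FIREBASE_URL}/requests/current.json",
        data=data,
        method="PUT",
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)

def lambda_handler(event, context):
    """Entry point Alexa invokes; the slot value names the tool."""
    tool = event["request"]["intent"]["slots"]["tool"]["value"]
    push_tool_request(tool)
    # Standard Alexa skill response envelope.
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {
                "type": "PlainText",
                "text": f"Bringing you the {tool}.",
            }
        },
    }
```

Using Firebase as the middle layer means the Lambda function and the Unity client never talk to each other directly; both just read and write a shared path, which is what makes the live Nurse-to-Surgeon communication low-effort.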

Challenges we ran into

Sending HTTP requests through Alexa was a pain; huge shout-out to the Google employees who helped us through that. Acquiring 3D models was not an easy task either, as most of them are extremely expensive.

Accomplishments that we're proud of

We were excited to realize we had an entire cohesive system, from recordings to surgical simulations, rather than just one part. We were also proud of creating something we both know will be used one day. For centuries, surgical knowledge has failed to flow between hospitals and between surgeons. Project Open Heart bridges that gap and truly lets others learn about a surgery and practice it hands-on.

What we learned

We learned how to program for Alexa, though it was a complete pain. We also had to learn how to use threads efficiently in the Unity app. Pushing data from Alexa to Firebase was another great thing we learned, and we debugged it until we got it right.

What's next for Project Open Heart

We want to implement a machine learning layer on top of the Unity application to allow for robust completion of the surgery without humans. We also want to leverage how unique this application is: since the project applies to such a broad market, we would love to take advantage of that and present the application accordingly. We also want to provide more robust Alexa features such as "What's the next step?" or "How does _____ surgeon do this?"
