Inspiration
Vital signs are literally vital parameters in medicine that need to be assessed frequently and diligently. Often, caregivers have to disturb sleeping patients and apply a device to measure the heart rate. This is not only time-consuming for caregivers but also disruptive to a patient's recovery.
What it does
Simply by looking at the patient through your HoloLens, you get a heart rate estimate, and a real-time visualization of the heartbeat is displayed.
How we built it
We built our own Unity application that aggregates the heart rate data into a graph and displays a synchronized, on-beat heart visualization on the HoloLens. The backend that produces the BPM estimates is based on an open-source neural-network library (pyVHR).
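At its core, remote photoplethysmography (rPPG) of the kind pyVHR performs recovers the pulse from tiny periodic color changes in facial skin. As a rough illustration of the idea (not pyVHR's actual API), the sketch below estimates BPM from a per-frame mean skin-intensity trace by picking the dominant frequency in a plausible heart-rate band; the function name and parameters are our own.

```python
import numpy as np

def estimate_bpm(signal, fps, min_bpm=40, max_bpm=180):
    """Estimate heart rate from an rPPG trace via the dominant FFT frequency.

    `signal` is the per-frame mean skin-pixel intensity (e.g. the green
    channel of a face region); `fps` is the camera frame rate.
    """
    sig = np.asarray(signal, dtype=float)
    sig = sig - sig.mean()                    # remove the DC component
    freqs = np.fft.rfftfreq(len(sig), d=1.0 / fps)
    power = np.abs(np.fft.rfft(sig)) ** 2
    # Restrict the search to a plausible heart-rate band (40-180 BPM).
    band = (freqs >= min_bpm / 60.0) & (freqs <= max_bpm / 60.0)
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0                        # Hz -> beats per minute

# Synthetic check: a 72 BPM sinusoid sampled at 30 fps for 10 s.
fps = 30
t = np.arange(0, 10, 1.0 / fps)
trace = 0.5 * np.sin(2 * np.pi * (72 / 60.0) * t) + 0.01 * np.random.randn(len(t))
bpm = estimate_bpm(trace, fps)
print(round(bpm))  # close to 72
```

pyVHR wraps this kind of pipeline (face detection, skin segmentation, signal extraction, spectral or learned BPM estimation) behind its own interfaces; the snippet only shows the underlying signal-processing step.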
Challenges we ran into
- Getting low-latency image streams
- Interacting with the existing library
- Integrating the calculated values into our Unity setup
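The last challenge, getting the calculated values from the Python backend into Unity, can be bridged in many ways. A minimal sketch of one option, assuming a local UDP link (the write-up does not specify the transport, and the host, port, and message format here are our own):

```python
import json
import socket
import time

# Hypothetical glue: the Python backend pushes each BPM estimate to the
# Unity app as a small JSON datagram, which a Unity script could read with
# a UdpClient. The address is an assumption for illustration only.
UNITY_ADDR = ("127.0.0.1", 5005)

def send_bpm(sock, bpm, timestamp):
    """Serialize one BPM estimate and send it to the Unity listener."""
    payload = json.dumps({"bpm": round(bpm, 1), "t": timestamp}).encode("utf-8")
    sock.sendto(payload, UNITY_ADDR)

# Stand-in for the Unity side, so the sketch is runnable end to end.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(UNITY_ADDR)

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_bpm(sender, 72.4, time.time())

data, _ = receiver.recvfrom(1024)
print(json.loads(data)["bpm"])  # 72.4
```

UDP keeps per-message latency low and a lost estimate simply gets replaced by the next one, which suits a continuously updated visualization better than a blocking TCP stream would.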
Accomplishments that we're proud of
- That in general it works!
What's next for Touchless Heartrate
- Reduce latency for images
- Try different estimation algorithms & perform measurements in real life
Team Contribution
- Trishia: deploying neural network model, Python & Unity pipeline
- Marc: streaming RGB video from HoloLens & heart animation
- Fabian: researching NN models, trying different libraries
Note
_The demo video we uploaded is a capture of Unity on the desktop, not the HoloLens application, as we are using the HoloLens to stream and cannot simultaneously record. We will, however, do the live demo on the HoloLens._