This project was inspired by the concept of using biomedical optics to enhance quality of life and improve healthcare. Optical technology can do this through less invasive methods for monitoring, diagnosing, and treating patients. In the United States, sudden infant death syndrome (SIDS) affects thousands of babies every year. In a world where technology grows more capable every day, we wanted to tackle this issue with a more powerful baby monitoring system. Rather than having parents wake up in the middle of the night to check on their child, we strove to create a system that is smarter, non-invasive, and life-changing.

What it does

Cuddle pairs a video feed of a baby from a mobile device or web camera with an intelligent image processing engine to alert parents if their child needs medical attention. Rather than just relaying video and sound to the parent, Cuddle analyzes the baby's vitals and alerts the parents only if the baby is potentially at risk. An iOS application simplifies the experience by providing visual confirmation of the baby, paired with an animated indicator showing the baby's pulse rate. There are no heart rate monitors or wires involved, just a camera and a powerful algorithm.

How we built it

Cuddle's image processing engine, termed Montre, uses a Eulerian Video Magnification algorithm inspired by Dr. Michael Rubinstein's research at MIT. Montre uses OpenCV to detect and analyze a user's face, then isolates the user's forehead. This region is locked and analyzed for its optical intensity over time; our algorithm isolates the green channel for analysis. The physiological signal is then interpreted using photoplethysmography and the optical absorption characteristics of oxygenated hemoglobin. The Montre engine is exposed as a RESTful API for the Cuddle app, which was built on iOS using Swift. One device can be registered as the "Cuddle" device, which sends the video feed to the API, while another device can be registered as the "Cuddler" device, which receives the interpreted video feed, along with the baby's vitals, from the API.
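To illustrate the final stage of the pipeline, the sketch below estimates a pulse rate from a series of mean green-channel intensities. It is a minimal stand-in, not Montre's actual code: `estimate_bpm`, the scan resolution, and the synthetic signal are our own illustrative choices, and the scanned band (45–180 bpm) is an assumed physiological range.

```python
import math

def estimate_bpm(samples, fps, lo_hz=0.75, hi_hz=3.0):
    """Estimate heart rate from mean green-channel intensities by
    scanning a discrete Fourier transform over the physiologically
    plausible band (0.75-3 Hz, i.e. 45-180 bpm)."""
    n = len(samples)
    mean = sum(samples) / n
    x = [s - mean for s in samples]  # remove the DC offset
    best_hz, best_power = lo_hz, 0.0
    hz = lo_hz
    while hz <= hi_hz:
        re = sum(x[t] * math.cos(2 * math.pi * hz * t / fps) for t in range(n))
        im = sum(x[t] * math.sin(2 * math.pi * hz * t / fps) for t in range(n))
        power = re * re + im * im
        if power > best_power:
            best_hz, best_power = hz, power
        hz += 0.05  # step of ~3 bpm
    return best_hz * 60.0

# Synthetic 10-second clip at 30 fps: a 1.2 Hz (72 bpm) pulse
# riding on a slow illumination drift.
fps = 30
series = [0.5 * math.sin(2 * math.pi * 1.2 * t / fps) + 0.01 * t
          for t in range(10 * fps)]
bpm = estimate_bpm(series, fps)  # close to 72 bpm
```

A real engine would use a proper FFT and band-pass filtering, but the frequency-peak idea is the same.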

Challenges we ran into

One of the most difficult parts of this project was building out the Montre engine. The algorithms involved were highly complex and theoretical, and making sense of the available open source code accounted for a huge portion of the development effort. After retrieving a heart rate, we compared the results with a commercial heart rate monitor and found some issues with the algorithm's accuracy. After careful calibration and data manipulation, we were able to achieve an accuracy of approximately 94%, which is physiologically relevant enough to determine whether a baby is at risk.
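The style of comparison against the reference monitor can be sketched as follows. The paired readings here are made up for illustration (our real calibration data is not reproduced); the point is just the mean-absolute-percentage-error form of the accuracy figure.

```python
# Hypothetical paired readings (bpm): the engine's estimate vs. a
# commercial reference monitor. Illustrative numbers only.
engine_bpm    = [68, 75, 81, 72, 90]
reference_bpm = [70, 74, 86, 75, 88]

# Accuracy as 100% minus the mean absolute percentage error.
errors = [abs(m - r) / r for m, r in zip(engine_bpm, reference_bpm)]
accuracy = 100 * (1 - sum(errors) / len(errors))
```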

Another challenge was our attempt at providing a live video feed, which would have been difficult to handle client-side between two devices. Our strategy instead turned to syncing a recording every 5 seconds, which, although it introduced a small lag, reduced development time without sacrificing a working, physiologically relevant prototype.
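The 5-second sync strategy amounts to chunking the captured frames into short clips, each of which is then uploaded to the API as one recording. The sketch below assumes a 30 fps capture rate; `chunk_frames` is a hypothetical helper, not the app's actual code.

```python
def chunk_frames(frames, fps=30, seconds=5):
    """Split a captured frame sequence into fixed-length recordings,
    mirroring the every-5-seconds sync strategy."""
    size = fps * seconds
    return [frames[i:i + size] for i in range(0, len(frames), size)]

frames = list(range(12 * 30))  # 12 seconds of placeholder frames at 30 fps
chunks = chunk_frames(frames)
# 12 s of video yields two full 150-frame chunks plus a 60-frame remainder.
```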

Accomplishments that we're proud of

We successfully built an engine that can analyze your heart rate just from a video! Achieving such high accuracy with limited data was an incredible feat that the team is proud of.

What we learned

We learned a great deal about image processing capabilities and algorithms, which involve extensive filtering, signal manipulation, and peak detection. Biomedical optics holds a lot of promise, and it has inspired continued projects in the field.

What's next for Cuddle

We hope to provide smarter and more insightful alerts in the application. These could include, but are not limited to, respiration rate, a crying alert, and a discoloration alert (if the baby turns blue). Respiration rate can be assessed with a similar Eulerian Video Magnification algorithm that exaggerates the baby's chest motion while breathing, since that motion is often hard to see in the raw video feed. We would also like to bring the algorithm to the mobile device itself, so the app could communicate with parents directly through a text message alert without the lag of a round trip to the API. That would largely depend on the mobile device's computational capabilities.
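The temporal core of Eulerian magnification can be sketched in one dimension: band-pass each pixel's intensity over time, amplify the band, and add it back. This is a toy version under stated assumptions: a difference of two boxcar filters stands in for the ideal temporal band-pass, and real EVM also decomposes each frame spatially before amplifying.

```python
import math

def moving_average(x, w):
    # Boxcar smoother; the window shrinks at the edges so the
    # output length matches the input length.
    out = []
    for i in range(len(x)):
        lo, hi = max(0, i - w // 2), min(len(x), i + w // 2 + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def magnify(signal, alpha=10.0, narrow=5, wide=31):
    # Temporal band-pass as a difference of two boxcar filters,
    # then amplify the band and add it back to the original.
    band = [a - b for a, b in
            zip(moving_average(signal, narrow), moving_average(signal, wide))]
    return [s + alpha * b for s, b in zip(signal, band)]

# A barely visible 0.3 Hz "breathing" oscillation at 30 fps...
fps = 30
signal = [100 + 0.01 * math.sin(2 * math.pi * 0.3 * t / fps)
          for t in range(10 * fps)]
magnified = magnify(signal)
# ...has a noticeably larger swing after magnification.
```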
