Ultrasound technique and sample scans
- Sample 3: no IJ present; nothing detected
- Sample 7: edge case; the algorithm is close but guesses wrong
Central venous lines, medical devices that deliver drugs to the central vasculature of the body, cannot be placed by anyone other than a highly skilled physician. This is for a very good reason: one slip-up in placing a device into the central venous system and a patient can exsanguinate (bleed out). This project, in its current state, gives medical experts a way to confirm the location of the internal jugular vein when they are preparing to place a central line. More importantly, it is a key stepping stone to a larger goal: automating the central line placement process.
Voyage Biomedical is currently developing a breakthrough medical device for cooling the brain during cardiac arrest and stroke. The Boreas central line is the first device to enable deep brain cooling during a cardiac arrest, keeping the brain alive while blood flow has stopped. However, there is one major limitation on the patients Boreas can help: it must be in place prior to the arrest to save the brain, and therefore can only be used in-hospital, as there are no doctors on ambulances for out-of-hospital events. This computer vision/machine learning work is the foundation on which we can build a robotic solution for placing central lines outside the hospital. If an algorithm can accurately detect the location of the internal jugular vein on ultrasound, then a robot can be programmed to place the line correctly. This would allow the Boreas central line to be deployed at the scene of an out-of-hospital cardiac arrest, extending our reach to over a million patients per year and addressing a $25B/year problem in first response medicine.
What it does
This product makes use of machine learning and computer vision to recognize the location of the internal jugular vein on an ultrasound scan.
How I built it
This software was built using the deep learning and computer vision toolkits in Matlab. A 15-layer convolutional neural network was first trained on 50,000 stock images from a University of Toronto dataset available online to develop its ability to recognize objects in an input picture. After this pretraining, the top layers of the network were stripped off and retrained using data from 50 ultrasound images of the neck. The resulting model proved remarkably effective at identifying the IJ on new, never-before-seen ultrasound images.
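The transfer-learning workflow above can be sketched with MATLAB's Deep Learning and Computer Vision Toolboxes. This is an illustrative outline, not the exact project code: the layer sizes, variable names (`ijGroundTruth`, `stockImages`, `newNeckScan.png`), and training options are assumptions.

```matlab
% 1) Pretrain a small CNN on a large generic image set.
imds = imageDatastore('stockImages', 'IncludeSubfolders', true, ...
    'LabelSource', 'foldernames');

layers = [
    imageInputLayer([32 32 3])
    convolution2dLayer(5, 32, 'Padding', 2)
    reluLayer
    maxPooling2dLayer(3, 'Stride', 2)
    convolution2dLayer(5, 32, 'Padding', 2)
    reluLayer
    maxPooling2dLayer(3, 'Stride', 2)
    fullyConnectedLayer(64)
    reluLayer
    fullyConnectedLayer(10)   % 10 generic object classes (assumed)
    softmaxLayer
    classificationLayer];

opts = trainingOptions('sgdm', 'InitialLearnRate', 1e-3, 'MaxEpochs', 20);
pretrainedNet = trainNetwork(imds, layers, opts);

% 2) Fine-tune on labeled neck ultrasounds: the pretrained network's top
% layers are replaced and retrained as an R-CNN object detector.
% ijGroundTruth is assumed to be a table with an imageFilename column
% and a column of IJ bounding boxes.
ftOpts = trainingOptions('sgdm', 'InitialLearnRate', 1e-4, 'MaxEpochs', 10);
detector = trainRCNNObjectDetector(ijGroundTruth, pretrainedNet, ftOpts);

% 3) Locate the IJ on a new, never-before-seen scan.
img = imread('newNeckScan.png');
[bboxes, scores] = detect(detector, img);
```

The key design point is that the convolutional layers keep the general visual features learned from the large stock-image set, so only a small number of labeled ultrasound images are needed to adapt the model to the IJ detection task.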
Challenges I ran into
I'm a mechanical engineer, and Matlab is the only coding language I know well. I started out trying to write the program in Python, but ran into all sorts of problems trying to get OpenCV to work. When I transitioned to my comfort zone, Matlab, I made incredible progress. I had no idea that Matlab had deep learning or computer vision functionality, but I was pleasantly surprised by its ease of use once I made the switch.
Accomplishments that I'm proud of
Before this hackathon I had never used deep learning or computer vision in a project. In fact, I hadn't written code in Matlab for 3 years, as all of my recent roles have been purely hardware engineering. The last 22 hours were spent teaching myself these two complex technologies from the ground up. Shout out to Matlab for having incredible documentation and demos that allowed me to go from zero knowledge of ML/CV to a fully functional and surprisingly accurate model in less than 24 hours.
What I learned
All of the basic (and even some of the advanced) skills in developing computer vision software backed by deep learning.
What's next for Voyage Vision
I will continue developing this technology at Voyage Biomedical, so that one day Voyage's life-saving medical device can save the brains of patients who go into cardiac arrest or have a stroke outside the hospital.