Inspiration
Burns are bad. Lots of people suffer terribly from them, and we, as a medical community, don't know that much about them. Our project seeks to address two major issues in the field of burn injuries:
1) Diagnosis through Quantitative Metrics
Status quo diagnosis of burn wounds is left to the eye of the burn surgeon or dermatologist, based on four characteristics: depth, cause, appearance, and level of pain. Of these, depth is often unclear from the outside, appearance and pain are arguably impossible to measure objectively, and pain in particular often has no standard of comparison, which complicates the entire diagnostic system. This uncertainty manifests as relatively high variance in burn diagnoses for identical images in double-blind studies, even among leading dermatologists in the field.
By measuring burn depth by proxy through a newly substantiated biometric (heat, as a function of collagen denaturation), one can objectively determine the damage and the full topography of the wound. A 2D thermal image can be processed, and because heat conduction through tissue is approximately linear, the temperature at each point can be used to indirectly derive the wound's 3D topography.
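To make that proxy concrete, here is a minimal Python sketch assuming a simple linear relation between surface temperature elevation and denaturation depth; both constants are illustrative placeholders, not calibrated values from our project.

```python
# Hypothetical linear heat-conduction proxy: depth of damage scales with the
# temperature elevation above healthy skin. Both constants are assumptions.
HEALTHY_SKIN_TEMP_C = 33.0   # assumed baseline surface temperature
DEPTH_PER_DEGREE_MM = 0.5    # assumed calibration constant (mm per deg C)

def estimated_depth_mm(surface_temp_c: float) -> float:
    """Estimate tissue-damage depth from a single thermal pixel reading."""
    elevation = max(0.0, surface_temp_c - HEALTHY_SKIN_TEMP_C)
    return elevation * DEPTH_PER_DEGREE_MM
```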
Much like an iceberg, a burn shows only a small portion of the wound externally: there is extensive tissue death beneath the surface, both immediately and, uniquely, through continued apoptosis after the heat stimulus is removed.
2) Noninvasive Injury Progression Tracking
The current widespread method of depth and wound analysis is the invasive biopsy. Not only do biopsies inherently disrupt the wound healing process, but each one yields only a single data point. Because burn wounds have organic, nebulous shapes, these 'guessing points' come nowhere near painting an accurate picture of the entire burn. In fact, depending on where they are taken, they can improperly generalize the state of the whole wound and lead to incorrect, or at best quasi-arbitrary, dosage information.
What it does
Takes a 2D IR image as input, creates a dynamic 3D model, computes pertinent wound properties, diagnoses burn degree, and calculates the post-trauma fluid requirement.
How I built it
Python serves as a middleman: it takes a 2D orthogonal thermal image from our iOS/Android frontend and passes it to our Mathematica script on the backend, called via the Wolfram Cloud Platform RESTful API. The image is decomposed pixel by pixel, assigning a Z (depth) value to each coordinate in the XY plane based on the RGB color value of the original image. The points are then restructured and plotted using Wolfram's 3D mapping and plotting functions, smoothed into a surface of isotherms, exported, and handed back to our frontend via Python for dynamic 3D visualization. Further analytics are processed in Wolfram to obtain the burn degree, various other burn characteristics (surface area, volume), and an estimated recovery time (note: recovery varies significantly per person, so this estimate is fairly inaccurate, but we wanted the capability in place for when the body of research produces a more accurate rate function).
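For a sense of what the middleman step looks like, here is a minimal sketch of the decomposition and hand-off, assuming a Pillow-readable thermal image and a hypothetical Wolfram Cloud endpoint URL; our actual deployment details differ.

```python
# Minimal sketch: decompose a thermal image into (x, y, z) points, then POST
# them to a Wolfram Cloud API. Endpoint URL and z-scaling are placeholders.
import json
import requests
from PIL import Image

WOLFRAM_API_URL = "https://www.wolframcloud.com/obj/example/burn-analysis"  # hypothetical

def image_to_points(path: str, z_scale: float = 0.1) -> list:
    """Assign a depth (Z) to each XY pixel from its red-channel intensity."""
    img = Image.open(path).convert("RGB")
    width, height = img.size
    points = []
    for y in range(height):
        for x in range(width):
            r, _, _ = img.getpixel((x, y))
            # Hotter regions render redder in the thermal palette, so the red
            # channel stands in for temperature, and hence for depth.
            points.append([x, y, r * z_scale])
    return points

def send_to_wolfram(points: list) -> dict:
    """Hand the point cloud to the Mathematica backend for plotting/analytics."""
    resp = requests.post(WOLFRAM_API_URL, data={"points": json.dumps(points)})
    resp.raise_for_status()
    return resp.json()
```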
The minimum fluid requirement in the 24 hours post-trauma (per the Parkland formula) is also calculated, an essential treatment parameter for burn victims. "Burned body area %" is a fairly difficult value for most people to obtain, so we offer a much more intuitive system: either estimate the wound size relative to the affected limb(s) (see the [Wallace Rule of Nines](http://www.remm.nlm.gov/burns.htm)), or, even easier, use a 'Draw' option on a vectorized body, which automatically computes the % wound coverage by comparing altered pixels to the original pixels.
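As a concrete sketch of both calculations: the Parkland formula is 4 mL x body weight (kg) x %TBSA burned, with half delivered in the first 8 hours; the snippet below pairs it with a naive altered-pixel count for the 'Draw' option. The function names and the Pillow-based diff are illustrative, not our exact app code.

```python
# Parkland formula plus a naive pixel-diff coverage estimate. Illustrative only.
from PIL import Image, ImageChops

def parkland_fluid_ml(weight_kg: float, tbsa_percent: float) -> dict:
    """Total resuscitation fluid volume for the first 24 hours post-burn."""
    total = 4.0 * weight_kg * tbsa_percent
    return {"total_24h_ml": total,
            "first_8h_ml": total / 2,   # half in the first 8 hours
            "next_16h_ml": total / 2}   # remainder over the next 16 hours

def drawn_coverage_percent(original_path: str, drawn_path: str) -> float:
    """% of pixels the user altered with the 'Draw' tool (images same size).

    A real implementation would normalize by the body silhouette's pixel
    count rather than the whole canvas.
    """
    original = Image.open(original_path).convert("RGB")
    drawn = Image.open(drawn_path).convert("RGB")
    diff = ImageChops.difference(original, drawn).convert("L")
    changed = sum(1 for px in diff.getdata() if px > 0)
    return 100.0 * changed / (diff.width * diff.height)

# Example: a 70 kg patient with 20% TBSA burned -> 5600 mL over 24 hours,
# 2800 mL of it in the first 8 hours.
print(parkland_fluid_ml(70, 20))
```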
Challenges I ran into
We don't have any thermal camera hardware here, but a multitude exist in industry (Nick worked with a couple in the past while conducting dermatology research at Stony Brook Medical Center). If possible, we would want to work with the new FLIR One: the cameras are small enough for medics or emergency personnel to transport easily, and they integrate seamlessly with either iPhone or Android phones, fitting into any medical professional's repertoire, whether at a large hospital or a small private practice.
Accomplishments that I'm proud of
We were pretty stumped and demoralized, not coming up with a project idea until Saturday morning.
None of us had any mobile dev experience, and we were still able to ship functioning (and, arguably, good-looking) iOS AND Android apps. We literally wrote our first lines in Android Studio and Swift 24 hours before this was submitted.
What I learned
What's next for 3Degrees
- Improving relative size accuracy by implementing more precise depth calibration.
- Integrating some sort of infrared camera to directly streamline the entire diagnostic process from image capture to analytics and injury progression tracking.
- Making things prettier. As this was our first time building mobile apps and using the Wolfram Cloud/Mathematica platform, we heavily prioritized getting all core functionality working before all else.
- Integrating Wolfram Anatomy Libraries
- Building out a proprietary dataset from user-uploaded images (heat/depth over time) to begin training a machine learning model for burn injury progression.
Built With
- android
- flir
- mathematica
- python
- swift
- wolfram-technologies