We wanted to make first aid more accessible to people with limited resources.
What it does
Users simply text a picture to our dedicated number, and our service uses image processing to determine the severity of a burn. It then replies with a diagnosis and quick treatment options.
How we built it
Our project flow goes a little something like this:
- A user sends an MMS (multimedia message) to BurnBuddy
- Our webhook grabs the image and sends it to our trained model through the Clarifai API
- The result is a percent confidence for the diagnosis (first-, second-, third-, or fourth-degree burn)
- A response is sent back with the diagnosis and treatment recommendations
We used Twilio to handle SMS/MMS sending and receiving, and ngrok to expose our local server so our webhook could communicate with Twilio. Using the Clarifai API, we trained a model to recognize burns by severity from a dataset of over 100 pictures. The webhook receives images, sends them through the Clarifai API, and replies with the appropriate treatment option.
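The core of the pipeline above can be sketched as follows. This is a minimal illustration, not our exact code: the model ID, API key, and treatment one-liners are hypothetical placeholders, and it assumes Clarifai's v2 REST predict endpoint, which returns a list of concepts with confidence values.

```python
import json
import urllib.request

# Hypothetical placeholders -- substitute a real Clarifai model ID and API key.
CLARIFAI_MODEL_ID = "burn-severity"
CLARIFAI_API_KEY = "YOUR_API_KEY"

# Illustrative one-liners only, not the project's actual recommendations.
TREATMENTS = {
    "first degree burn": "Cool the burn under running water and cover loosely.",
    "second degree burn": "Do not pop blisters; cover with a sterile bandage.",
    "third degree burn": "Call emergency services immediately.",
    "fourth degree burn": "Call emergency services immediately.",
}

def classify_burn(image_url: str) -> dict:
    """POST the image URL to Clarifai's v2 predict endpoint for our model."""
    body = json.dumps({"inputs": [{"data": {"image": {"url": image_url}}}]})
    req = urllib.request.Request(
        f"https://api.clarifai.com/v2/models/{CLARIFAI_MODEL_ID}/outputs",
        data=body.encode(),
        headers={"Authorization": f"Key {CLARIFAI_API_KEY}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def top_concept(response: dict) -> tuple:
    """Pick the highest-confidence concept from a Clarifai-style response."""
    concepts = response["outputs"][0]["data"]["concepts"]
    best = max(concepts, key=lambda c: c["value"])
    return best["name"], best["value"]

def build_reply(diagnosis: str, confidence: float) -> str:
    """Render the SMS reply as TwiML, the XML Twilio expects from a webhook."""
    advice = TREATMENTS.get(diagnosis, "Seek medical attention.")
    return (f"<Response><Message>Likely {diagnosis} "
            f"({confidence:.0%} confidence). {advice}</Message></Response>")
```

The webhook glues these together: it passes the incoming media URL to `classify_burn`, feeds the result to `top_concept`, and returns the TwiML from `build_reply` as its HTTP response.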
Challenges we ran into
Figuring out how to extract an image from incoming messages through Twilio proved difficult because the documentation was sparse.
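For anyone facing the same problem, here is a minimal sketch of the part we eventually worked out. Twilio's inbound-message webhook posts form-encoded parameters, with the attachment count in `NumMedia` and each attachment's URL in `MediaUrl0` through `MediaUrl{N-1}`:

```python
from urllib.parse import parse_qs

def extract_media_urls(form_body: str) -> list:
    """Pull attached-media URLs out of Twilio's form-encoded webhook body.

    Twilio reports the attachment count in NumMedia and each URL in
    MediaUrl0 .. MediaUrl{NumMedia-1}.
    """
    params = {k: v[0] for k, v in parse_qs(form_body).items()}
    count = int(params.get("NumMedia", "0"))
    return [params[f"MediaUrl{i}"] for i in range(count)]

# Example body (abbreviated, with a hypothetical media URL):
# "From=%2B15551234567&NumMedia=1&MediaUrl0=https%3A%2F%2Fapi.twilio.com%2Fmedia%2F123"
```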
Accomplishments that we're proud of
Overcoming our challenges, and building a hack for good that is easily accessible.
What we learned
Webhooks, the Clarifai API, and Twilio.
What's next for BurnBuddy
BurnBuddy can grow beyond burns to diagnose other common skin conditions.