Nobody really wants to go to Google Image Search to look up what ulcers look like and whether you have one. That's why, with a simple app that only needs access to your camera, you can find out whether you'll need your arm amputated because of a cancerous mole (or whether you just touched poison ivy).
What it does
The app uses your phone's camera to identify what skin ailment you have (hopefully none) and whether any moles are cancerous. It then advises you on what actions to take and opens links with information about the condition.
How I built it
The project was built using the Microsoft Custom Vision API. An example Android application with TensorFlow models served as the initial prototype and was then modified to meet the project's requirements. The skin ailment recognition itself is handled by a classifier trained through the Custom Vision service.
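For context, classifying an image with a published Custom Vision model boils down to one HTTP POST and picking the highest-probability tag from the response. Here is a minimal sketch; the endpoint, project ID, iteration name, and key are placeholders, not the project's real values:

```python
# Sketch: classify an image with the Microsoft Custom Vision prediction API.
# ENDPOINT, PROJECT_ID, ITERATION, and PREDICTION_KEY are placeholders.
import json
from urllib import request

ENDPOINT = "https://YOUR_REGION.api.cognitive.microsoft.com"  # placeholder
PROJECT_ID = "YOUR_PROJECT_ID"                                # placeholder
ITERATION = "Iteration1"                                      # placeholder published iteration name
PREDICTION_KEY = "YOUR_PREDICTION_KEY"                        # placeholder

def prediction_url(endpoint: str, project_id: str, iteration: str) -> str:
    """Build the Custom Vision v3.0 image-classification URL."""
    return (f"{endpoint}/customvision/v3.0/Prediction/{project_id}"
            f"/classify/iterations/{iteration}/image")

def top_prediction(response_json: str):
    """Return (tagName, probability) of the most probable tag in a response."""
    predictions = json.loads(response_json)["predictions"]
    best = max(predictions, key=lambda p: p["probability"])
    return best["tagName"], best["probability"]

def classify(image_bytes: bytes):
    """POST raw image bytes to the prediction endpoint (requires network/key)."""
    req = request.Request(
        prediction_url(ENDPOINT, PROJECT_ID, ITERATION),
        data=image_bytes,
        headers={"Prediction-Key": PREDICTION_KEY,
                 "Content-Type": "application/octet-stream"},
    )
    with request.urlopen(req) as resp:
        return top_prediction(resp.read().decode())
```

In the app itself this call would sit behind the camera capture; the same JSON-parsing logic applies whether the request is made from Python or from Android.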
Challenges I ran into
Sifting through hundreds of skin ailment pictures by hand, working with a brand-new API, and general Android development issues.
Accomplishments that I'm proud of
Pulling off a demo-ready product in 24 hours as a team of two, learning the Microsoft Custom Vision API, and brushing up on Android development.
What I learned
Working with the Microsoft Custom Vision API.
What's next for ChckUp
Increase the number of recognisable skin conditions to 20+, while maintaining accuracy and recall.
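Tracking that goal just needs two simple metrics: overall accuracy, and per-condition recall (the fraction of true cases of each condition the model actually catches). A minimal sketch, with made-up labels:

```python
# Sketch: accuracy and per-class recall for a multi-class skin classifier.
# Labels here are illustrative, not real evaluation data.
from collections import Counter

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true label."""
    correct = sum(t == p for t, p in zip(y_true, y_pred))
    return correct / len(y_true)

def recall_per_class(y_true, y_pred):
    """For each condition, the fraction of its true cases correctly predicted."""
    hits = Counter(t for t, p in zip(y_true, y_pred) if t == p)
    totals = Counter(y_true)
    return {label: hits[label] / totals[label] for label in totals}
```

Per-class recall matters more than raw accuracy here: as conditions are added, a model can stay "accurate" overall while missing most cases of a rare but serious condition.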