Approximately 50% of dermatologists report not being trained to accurately diagnose skin conditions in people of color. A delayed diagnosis can allow a condition to worsen and makes it harder to treat effectively.
What it does
We use a custom-trained ML model that analyzes hundreds of pictures of skin conditions across a range of skin tones. This matters because, as noted above, people of color often don't receive a proper diagnosis since their skin is underrepresented in dermatology research. We use augmented reality to label and classify the affected area on the user's skin, enhancing the user experience, and we also provide a symptom tracker and additional information to help refine the diagnosis further.
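A symptom tracker like the one described above can be modeled as a simple log of dated entries. The sketch below is illustrative only; the type and field names (`SymptomEntry`, the 1-5 severity scale) are assumptions, not the app's actual API.

```swift
import Foundation

// Hypothetical sketch of a symptom-tracker data model. The names and the
// 1-5 severity scale are illustrative assumptions, not the app's real types.
struct SymptomEntry: Codable {
    let date: Date
    let bodyArea: String      // e.g. "left forearm"
    let severity: Int         // 1 (mild) ... 5 (severe)
    let notes: String
}

// Entries could be persisted as JSON, e.g. in the app's documents directory.
func encode(_ entries: [SymptomEntry]) throws -> Data {
    let encoder = JSONEncoder()
    encoder.dateEncodingStrategy = .iso8601
    return try encoder.encode(entries)
}
```

Keeping entries `Codable` means the same model can back both on-device storage and any future export feature.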
How we built it
We built our app in Xcode using Swift and the SwiftUI framework. We also used ARKit and CoreML for the augmented reality and image-detection features.
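A typical way to wire CoreML into an ARKit session is to run the classifier on the camera frame through the Vision framework. This is a minimal sketch of that pattern, not the app's actual code; `SkinConditionClassifier` is a placeholder for the compiled model class Xcode generates from a `.mlmodel` file.

```swift
import ARKit
import CoreML
import Vision

// Sketch only: "SkinConditionClassifier" stands in for our compiled CoreML model.
func classify(frame: ARFrame, completion: @escaping (String, Float) -> Void) {
    guard let coreMLModel = try? SkinConditionClassifier(configuration: MLModelConfiguration()).model,
          let model = try? VNCoreMLModel(for: coreMLModel) else { return }

    let request = VNCoreMLRequest(model: model) { request, _ in
        // Report the top classification's label and confidence.
        guard let top = (request.results as? [VNClassificationObservation])?.first else { return }
        completion(top.identifier, top.confidence)
    }

    // ARFrame exposes the current camera image as a CVPixelBuffer.
    let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage, orientation: .right)
    try? handler.perform([request])
}
```

Running the request on each `ARFrame` (or a throttled subset of frames) lets the AR overlay update as the camera moves.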
Challenges we ran into
The most challenging part of building our app was correctly identifying a skin surface and displaying an AR virtual button over it.
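One common approach to this kind of placement problem is raycasting from a screen point against ARKit's estimated planes, since ARKit has no built-in "skin" surface type. The sketch below assumes an `ARSCNView`-based setup and is a simplification, not the app's actual solution.

```swift
import ARKit

// Sketch: anchor an AR label (e.g. a virtual button) where the user taps,
// by raycasting against ARKit's estimated planes. Detecting that the hit
// surface is actually skin would be handled separately by the classifier.
func placeLabelAnchor(at point: CGPoint, in sceneView: ARSCNView) {
    guard let query = sceneView.raycastQuery(from: point,
                                             allowing: .estimatedPlane,
                                             alignment: .any),
          let result = sceneView.session.raycast(query).first else { return }

    // Attach a named anchor at the hit location; the renderer can then
    // place the virtual button node on this anchor.
    let anchor = ARAnchor(name: "skinLabel", transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
}
```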
Accomplishments that we're proud of
One thing we are proud of is building our own custom CoreML model with 94% accuracy. It can currently identify only three conditions, but we individually annotated hundreds of pictures to train it, which was a major accomplishment.
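Custom CoreML image classifiers are often trained with Apple's Create ML framework on macOS; whether that is the exact pipeline used here is an assumption. The folder layout (one subfolder per condition label) and file paths below are illustrative.

```swift
import CreateML
import Foundation

// Sketch of training an image classifier with Create ML. The directory is
// assumed to contain one subfolder per skin condition, each holding the
// annotated training images for that label.
let trainingDir = URL(fileURLWithPath: "/path/to/annotated-images")
let data = MLImageClassifier.DataSource.labeledDirectories(at: trainingDir)

let classifier = try MLImageClassifier(trainingData: data)

// Inspect training error, then export the model for bundling into the app.
print(classifier.trainingMetrics.classificationError)
try classifier.write(to: URL(fileURLWithPath: "/path/to/SkinConditionClassifier.mlmodel"))
```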
What we learned
Through building this app, we learned how to integrate CoreML and augmented reality. We also learned how to create and train a custom ML model.