Inspiration
We are high school students who were experimenting with ways to improve people's everyday lives. After some thought, we came up with the idea of a machine-learning healthcare app that lets users take a picture of their skin and check whether it shows signs of a type of skin cancer.
What it does
SkinSense is a mobile app that uses machine learning to help users screen their skin for possible signs of skin cancer.
You take a picture of your skin in the native mobile app. The picture is uploaded to our secure cloud server on Azure, where the Azure Machine Learning service classifies the image by comparing its color and the frequency of distinctive features of diseased skin against the training data. The results are then sent back to the app, where you receive the probability of each type of skin cancer. There is also a history tab, which displays your past uploads, and a medical profile tab, which lets you check off your symptoms from a drop-down menu.
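The per-class probabilities shown in the app could be produced by applying a softmax to the model's raw scores. Here is a minimal Java sketch of that step; the class names, scores, and the `softmax` helper are made-up examples for illustration, not the actual Azure Machine Learning output format.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class SkinScoreSketch {
    // Softmax: exponentiate each raw score (shifted by the max score for
    // numerical stability) and normalize so the probabilities sum to 1.
    static Map<String, Double> softmax(Map<String, Double> scores) {
        double max = scores.values().stream()
                .mapToDouble(Double::doubleValue).max().orElse(0.0);
        double sum = 0.0;
        Map<String, Double> probs = new LinkedHashMap<>();
        for (Map.Entry<String, Double> e : scores.entrySet()) {
            double v = Math.exp(e.getValue() - max);
            probs.put(e.getKey(), v);
            sum += v;
        }
        for (Map.Entry<String, Double> e : probs.entrySet()) {
            e.setValue(e.getValue() / sum);
        }
        return probs;
    }

    public static void main(String[] args) {
        // Hypothetical raw scores for three example classes.
        Map<String, Double> scores = new LinkedHashMap<>();
        scores.put("melanoma", 2.0);
        scores.put("basal cell carcinoma", 0.5);
        scores.put("benign", 1.0);

        Map<String, Double> probs = softmax(scores);
        double total = probs.values().stream()
                .mapToDouble(Double::doubleValue).sum();
        if (Math.abs(total - 1.0) > 1e-9)
            throw new AssertionError("probabilities must sum to 1");
        if (!(probs.get("melanoma") > probs.get("benign")))
            throw new AssertionError("higher score should give higher probability");
        System.out.println(probs);
    }
}
```

The app would then display each entry of the resulting map as a percentage next to the corresponding cancer type.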
How we built it
We used Android Studio for the application, Azure for the machine learning and cloud servers, and Kaggle for the database of images to train the machine learning algorithm.
Challenges we ran into
Machine learning accuracy: the model sometimes predicted incorrectly due to factors like background lighting and other objects in the frame.
Accomplishments that we're proud of
Synchronizing the app with the cloud and getting the full pipeline working end to end.
What we learned
How to develop an Android app, how to train and run predictions with machine learning, and how to use an Azure server in the cloud.
What's next for SkinSense
A larger database of images to add support for more types of skin cancers, improvement of the machine learning accuracy, and bringing the product to the market to make a real-world, lasting impact.
Built With
- ai
- android-studio
- azure
- custom-kaggle
- java
- vision