I was interested in getting into machine learning, though it sounded intimidating. When I learned that IBM Watson offered visual recognition tools, I thought skin cancer detection would be a useful application.
What it does
A user opens the app's camera, takes a photo of skin they are concerned about, and receives a score from 0 to 1 indicating how likely the skin condition is to be malignant rather than benign.
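On the client side, a score like this can be turned into a user-facing label with a simple threshold. This is a minimal sketch; the 0.5 cutoff and the label wording are illustrative assumptions, not values from the deployed app:

```swift
// Map a confidence score in [0, 1] to a user-facing label.
// The 0.5 threshold is an illustrative assumption; a real medical app
// would tune it against labelled data and err on the side of caution.
func label(forScore score: Double) -> String {
    precondition((0.0...1.0).contains(score), "score must be in [0, 1]")
    return score >= 0.5
        ? "possibly malignant — please see a dermatologist"
        : "likely benign"
}
```

Whatever the threshold, the app should present the result as a prompt to seek medical advice, not a diagnosis.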
How I built it
I built it using IBM Watson Visual Recognition on IBM Cloud, which made it straightforward to connect the trained model to a Swift iOS app.
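The Swift side mostly amounts to sending the photo to the Watson Visual Recognition REST API. Here is a hedged sketch of building that request with plain Foundation; the host, version date, multipart field name (`images_file`), and IAM basic-auth scheme are assumptions based on the public v3 API, not code from my app:

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking  // URLRequest lives here on Linux
#endif

// Build a POST request for Watson Visual Recognition's v3 `classify`
// endpoint, sending the photo as a multipart `images_file` part.
// Endpoint, version date, and auth details are assumptions from the
// public v3 REST API documentation.
func classifyRequest(apiKey: String, imageData: Data) -> URLRequest {
    var components = URLComponents(
        string: "https://gateway.watsonplatform.net/visual-recognition/api/v3/classify")!
    components.queryItems = [URLQueryItem(name: "version", value: "2018-03-19")]

    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"

    // IAM API keys use basic auth with the literal username "apikey".
    let credentials = Data("apikey:\(apiKey)".utf8).base64EncodedString()
    request.setValue("Basic \(credentials)", forHTTPHeaderField: "Authorization")

    // Assemble a minimal multipart/form-data body around the JPEG bytes.
    let boundary = UUID().uuidString
    request.setValue("multipart/form-data; boundary=\(boundary)",
                     forHTTPHeaderField: "Content-Type")
    var body = Data()
    body.append(Data("--\(boundary)\r\n".utf8))
    body.append(Data("Content-Disposition: form-data; name=\"images_file\"; filename=\"skin.jpg\"\r\n".utf8))
    body.append(Data("Content-Type: image/jpeg\r\n\r\n".utf8))
    body.append(imageData)
    body.append(Data("\r\n--\(boundary)--\r\n".utf8))
    request.httpBody = body
    return request
}
```

The request would then be sent with `URLSession` and the JSON response parsed for the classifier's score.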
Challenges I ran into
I wanted to integrate IBM Watson Assistant to create a chatbot that could receive the photos and return the score. However, this was purely to make my project more complex, and I spent a lot of time struggling to integrate multiple IBM tools. Letting go of that idea was a challenge, but I realized the user experience was simpler without it.
Accomplishments that I'm proud of
I had never worked with Swift, iOS development, machine learning, or IBM Watson before. I really went out of my comfort zone to learn new technologies!
What's next for Skin Cancer Detection App
I want to keep uploading more training photos to make the classifier as accurate as possible. And if I can get the chatbot integration working simply and quickly, I would consider adding that feature back.