Inspiration

We wanted to build a technical app that is actually useful. Scott Forstall's talk at the opening ceremony really spoke to each of us, and we decided then to create something that would not only show off our technical skill but also be genuinely useful. Going to the doctor is inconvenient and rarely immediate, and many visits turn out to be false alarms. We wanted to remove that inefficiency to make everyone's lives easier and make healthy living more convenient. We did a lot of research on health-related data sets and found a wealth of data on different skin diseases. That made it an easy choice to build a model on this data that lets users self-diagnose skin problems.

What it does

Our ML model has been trained on hundreds of samples of diseased skin to distinguish among a wide variety of malignant and benign skin diseases. Our mobile app lets you take a picture of a patch of skin that concerns you, runs it through the model, and tells you what the model classified your picture as. Finally, the picture and the model's result are sent to a doctor, who can override the model's decision. The corrected classification is then fed back through the model to reinforce correct outputs and penalize wrong ones, i.e., adding a reinforcement learning component to our model as well.
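The loop looks roughly like this in TypeScript. This is a minimal sketch of the flow described above, and every function and type name in it is an illustrative stand-in rather than our actual code:

```typescript
// Minimal sketch of AEye's classify-then-review loop. All names below are
// illustrative stand-ins, not the app's actual code.
type Classification = { label: string; confidence: number };

// Stand-in for the call to the trained model (see "How we built it").
async function classifyImage(photoUri: string): Promise<Classification> {
  return { label: "melanoma", confidence: 0.62 }; // dummy result
}

// Stand-in for persisting a case so a doctor can review it later.
async function createReviewCase(
  photoUri: string,
  guess: Classification
): Promise<string> {
  return "case-123"; // dummy case id
}

// Stand-in for re-queuing a (photo, corrected label) pair for training.
async function queueForRetraining(photoUri: string, label: string): Promise<void> {}

// 1-2. Classify the user's photo, surface the result, and open a doctor review.
async function handlePhoto(photoUri: string): Promise<Classification> {
  const guess = await classifyImage(photoUri);
  const caseId = await createReviewCase(photoUri, guess);
  console.log(`Model says ${guess.label} (${guess.confidence}); case ${caseId} sent for review`);
  return guess;
}

// 3. When the doctor confirms or overrides the label, the corrected pair goes
//    back into training, reinforcing right outputs and penalizing wrong ones.
async function onDoctorDecision(photoUri: string, correctedLabel: string): Promise<void> {
  await queueForRetraining(photoUri, correctedLabel);
}
```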

How we built it

We built the ML model in IBM Watson using public skin disease data from the ISIC (International Skin Imaging Collaboration) archive. Our platform-independent mobile app, built in React Native with Expo, interacts with the model through IBM Watson's API. Additionally, we store all of our data in Google Firebase's cloud, where doctors can access it to correct the model's output when needed.
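As a rough illustration of how the two integrations fit together, here is a TypeScript sketch. The Watson endpoint, version date, and form fields follow our recollection of the Visual Recognition v3 REST API and should be checked against IBM's docs; WATSON_API_KEY and CLASSIFIER_ID are placeholders, and the Firebase calls use the v8-style web SDK:

```typescript
// Sketch of the two integrations, not the app's literal code. Assumes
// firebase.initializeApp(config) has already run with a real config.
import firebase from "firebase/app";
import "firebase/storage";
import "firebase/firestore";

const WATSON_URL =
  "https://gateway.watsonplatform.net/visual-recognition/api/v3/classify";

// Classify a photo by POSTing it to Watson's REST API.
async function classifyWithWatson(photo: Blob): Promise<any> {
  const form = new FormData();
  form.append("images_file", photo, "skin.jpg");
  form.append("classifier_ids", "CLASSIFIER_ID"); // our custom classifier

  const res = await fetch(`${WATSON_URL}?version=2018-03-19`, {
    method: "POST",
    headers: { Authorization: "Basic " + btoa("apikey:WATSON_API_KEY") },
    body: form,
  });
  return res.json();
}

// Store the photo and the model's output in Firebase so a doctor can
// review the case and override the label if needed.
async function storeCase(photo: Blob, modelOutput: any): Promise<void> {
  const path = `scans/${Date.now()}.jpg`;
  await firebase.storage().ref(path).put(photo);
  await firebase.firestore().collection("cases").add({
    imagePath: path,
    modelOutput,
    doctorLabel: null, // filled in when a doctor reviews the case
  });
}
```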

Challenges we ran into

Watson had significant limitations on data loading and training, so everything had to be done in extremely small batches, which prevented us from using all the data we had available. Additionally, all of us were new to React Native, so implementing our mobile app came with a steep learning curve.
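The workaround for the data-loading limits amounted to chunking our uploads. A generic sketch of that kind of batching, with an arbitrary batch size rather than Watson's actual limits:

```typescript
// Split the training files into chunks and upload one batch at a time.
// The batch size here is illustrative, not Watson's actual limit.
function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

async function uploadInBatches(
  files: string[],
  upload: (batch: string[]) => Promise<void>
): Promise<void> {
  for (const batch of chunk(files, 50)) {
    await upload(batch); // wait for each batch so we stay under the limits
  }
}
```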

Accomplishments that we're proud of

Each of us learned a new skill at this hackathon, which is the most important thing for us to take away from any event like this. Additionally, we came in wanting to implement an ML model, and by using Watson we implemented one that is far more complex than we initially expected.

What we learned

Web frameworks are extremely complex, and even very similar frameworks often can't talk to each other. Additionally, while REST APIs are extremely convenient and platform-independent, they can be much harder to use than platform-specific SDKs.

What's next for AEye

Our product is really a proof of concept right now. If possible, we would like to polish both the mobile and web interfaces into a complete product for the general user. Additionally, as more users adopt our platform, our model should get more and more accurate through our reinforcement learning loop.

See a follow-up interview about the project/hackathon here! https://blog.codingitforward.com/aeye-an-ai-model-to-detect-skin-diseases-252747c09679
