Inspiration

A dog belonging to one of our professors recently contracted an incurable form of cancer. It took the community by surprise and we felt the need to create a tool that could help people detect skin cancer in its early stages.

What it does

A user texts a picture of a skin lesion to our phone number. A web server processes the image, runs it through a deep-learning model, and texts the resulting prediction back to the user.

How we built it

We used the Twilio SMS/MMS API to send and receive messages and images, and FastAI, a deep-learning library built on top of PyTorch, to classify images with a pre-trained ResNet50 neural network. The network was fine-tuned on a Udacity dataset of benign and malignant skin lesions. When an image arrives through Twilio, we feed it to the network and text the prediction back to the user.
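The receive-and-reply loop can be sketched as follows. This is a minimal, framework-free sketch: in deployment the server would use a web framework such as Flask together with the Twilio helper library. The form fields `From` and `MediaUrl0` come from Twilio's incoming-MMS webhook format; `download` and `classify` are hypothetical injected stand-ins for the HTTP fetch and the trained model.

```python
# Sketch of the webhook flow. Twilio POSTs form fields such as
# "From" and "MediaUrl0" to the server when an MMS arrives. The
# downloader and classifier are injected so the deep-learning part
# can be swapped in; these function names are hypothetical.
from xml.etree import ElementTree as ET

def twiml_reply(body: str) -> str:
    """Build the TwiML XML that tells Twilio what to text back."""
    response = ET.Element("Response")
    message = ET.SubElement(response, "Message")
    message.text = body
    return ET.tostring(response, encoding="unicode")

def handle_mms(form: dict, download, classify) -> str:
    """Download the picture Twilio received and reply with a prediction.

    `download` fetches image bytes from a URL; `classify` returns a
    label such as "benign" or "malignant" plus a confidence in [0, 1].
    """
    image_bytes = download(form["MediaUrl0"])
    label, confidence = classify(image_bytes)
    return twiml_reply(f"Prediction: {label} ({confidence:.0%} confidence)")
```

In deployment, a route handler would return this XML string with content type `text/xml` so Twilio sends it as the reply SMS.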

Challenges we ran into

Pre-processing the dataset for the neural network took longer than expected, since most of us were new to implementing neural networks. Integrating the Twilio API was also tricky: to receive MMS messages, we had to expose our remote virtual machine to Twilio through an Ngrok tunnel.
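Concretely, the tunnel setup looked roughly like this (`ngrok http` is a real ngrok command; the port is an assumption based on a typical local web server):

```shell
# Expose the local web server (assumed to listen on port 5000) to the
# public internet; ngrok prints a public https forwarding URL.
ngrok http 5000

# That https forwarding URL is then pasted into the Twilio console as the
# "A message comes in" webhook for the phone number, so Twilio can reach
# the server running on the remote VM.
```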

Accomplishments that we're proud of

For most of the team, this was our first deep learning project.

What we learned

We learned the steps needed to apply deep learning to any given task.
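Those steps might be sketched like this against fastai v2's high-level API (`ImageDataLoaders`, `vision_learner`, `fine_tune`); the exact names are an assumption about the library version, and imports are kept inside the function because running it requires fastai and a labeled image dataset.

```python
# Hedged sketch of the general transfer-learning recipe, not the
# project's exact code. Assumes a folder of images organized by class
# (e.g. benign/ and malignant/ subdirectories).
def train_skin_lesion_model(data_dir: str, model_path: str = "export.pkl"):
    from fastai.vision.all import (
        ImageDataLoaders, Resize, error_rate, resnet50, vision_learner,
    )
    # 1. Load images from class-named folders, holding out 20% for
    #    validation and resizing everything to the network's input size.
    dls = ImageDataLoaders.from_folder(
        data_dir, valid_pct=0.2, item_tfms=Resize(224)
    )
    # 2. Start from an ImageNet-pretrained ResNet50 (transfer learning).
    learn = vision_learner(dls, resnet50, metrics=error_rate)
    # 3. Fine-tune the pretrained weights on the lesion dataset.
    learn.fine_tune(4)
    # 4. Export the trained model for inference on the web server.
    learn.export(model_path)
    return learn
```

At inference time, the server would load the exported model once with `load_learner` and call `predict` on each incoming image.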

What's next for fastscan.ai

We plan to expand our dataset to increase accuracy and to develop an app interface that streamlines the user experience. We also plan to build on the pre-trained architecture to support the more complex predictions required to identify early-stage melanoma.
