Inspiration

Our inspiration came from Carleton's annual Butterfly exhibit and the idea that visitors could learn more about the butterflies they see there.

What it does

Butterfly Explorer uses a pair of neural networks. The first, trained on a dataset of pictures of various butterflies, determines whether a butterfly is present in an image; the data it gathers is then used to teach a second neural network how to identify the butterfly. Our web app accesses the user's camera, relays the captured image to the networks, and returns the corresponding webpage containing information on the butterfly that was identified.
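
As a rough illustration of that flow (the actual model architecture isn't described in this write-up), the two-stage inference step might look like the sketch below, assuming Keras models saved to hypothetical file names and placeholder species labels:

    # Minimal sketch of the two-stage inference flow. The model files,
    # input size, and species labels here are assumptions for illustration,
    # not the project's actual artifacts.
    import numpy as np
    from tensorflow.keras.models import load_model
    from tensorflow.keras.preprocessing import image

    detector = load_model("detector.h5")      # stage 1: butterfly present or not
    classifier = load_model("classifier.h5")  # stage 2: which butterfly

    SPECIES = ["Monarch", "Painted Lady", "Swallowtail"]  # placeholder labels

    def identify(path):
        """Return a Wikipedia URL for the butterfly in the image, or None."""
        img = image.load_img(path, target_size=(224, 224))
        batch = np.expand_dims(image.img_to_array(img) / 255.0, axis=0)

        if detector.predict(batch)[0][0] < 0.5:   # no butterfly detected
            return None

        species = SPECIES[int(np.argmax(classifier.predict(batch)[0]))]
        return "https://en.wikipedia.org/wiki/" + species.replace(" ", "_")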

How I built it

We made a web scraper to extract large quantities of butterfly images from Google and build an appropriate dataset for the main neural network. We also created a web application that communicates with a server: the app transmits the user's image to the neural network, which sends back the corresponding Wikipedia link of the butterfly it identified.
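
The server code itself isn't shown here, but the transmit-and-respond step could be sketched as below, assuming a Flask server, an illustrative /classify endpoint and "photo" field name, and the hypothetical identify() helper from the previous snippet:

    # Hedged sketch of the server side of the pipeline; endpoint and field
    # names are illustrative assumptions, not the project's actual API.
    from flask import Flask, jsonify, request

    app = Flask(__name__)

    @app.route("/classify", methods=["POST"])
    def classify():
        # The web app uploads the camera frame as a multipart file named "photo".
        photo = request.files["photo"]
        photo.save("upload.jpg")

        link = identify("upload.jpg")          # run both neural networks
        if link is None:
            return jsonify({"error": "no butterfly detected"}), 404
        return jsonify({"wikipedia": link})    # the client opens this page

    if __name__ == "__main__":
        app.run(host="0.0.0.0", port=5000)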

Challenges I ran into

Dealing with images in a web app proved to be a challenge. So did creating the web scraper, since systems nowadays are more robust and guard against unwanted data extraction. The neural networks were also challenging: they need a great amount of time to train in order to reach the desired result, and because the web scraper was meant to supply the training data, its problems cost us a lot of time as well.

Accomplishments that I'm proud of

I'm proud of how far we came throughout this project. I'm also proud of how well the web app turned out.

What I learned

I now know more about web interfaces and how to handle more complex data in web development.

What's next for Butterfly Explorer

I'm hoping that we can spend more time refining the neural networks and potentially turn this into a strong source of information that could be beneficial for others.
