What it does

It scans a given scene and reports whether it has identified a fire. It achieves 80% accuracy on my test data and performs well on simple scenes.

How I built it

I built a data set with the labels "fire" and "nofire". I split the images into smaller tiles to fit the limited context required by the hackathon, then trained a convolutional neural network on the data set. I tuned the number of layers and the amount of data processed to reach the current result. I used Python and Microsoft Azure Machine Learning Services.
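The tiling step can be sketched like this. This is a minimal illustration of my own, not the project's actual code: it assumes square, non-overlapping tiles and NumPy image arrays, and drops ragged edges rather than padding them.

```python
import numpy as np

def split_into_tiles(image, tile_size):
    """Split an H x W x C image array into non-overlapping
    tile_size x tile_size tiles, dropping any ragged edges."""
    h, w = image.shape[:2]
    tiles = []
    for top in range(0, h - tile_size + 1, tile_size):
        for left in range(0, w - tile_size + 1, tile_size):
            tiles.append(image[top:top + tile_size, left:left + tile_size])
    return tiles

# A 128x96 RGB image yields (128 // 64) * (96 // 64) = 2 tiles of 64x64.
image = np.zeros((128, 96, 3), dtype=np.uint8)
tiles = split_into_tiles(image, 64)
print(len(tiles))       # 2
print(tiles[0].shape)   # (64, 64, 3)
```

Each tile would then be labeled "fire" or "nofire" and fed to the network, which keeps the input size small regardless of the source image resolution.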

Challenges I ran into

Many of the fire images I found were very low resolution. I don't have a lot of experience with neural networks, so this was a learning experience for me, but I greatly enjoyed it. I had also never made a data set before.

Accomplishments that I'm proud of

I'm proud that I made my own data set to train on; I had never done this before and was pleasantly surprised by the results.

What I learned

I realized that much of my data focused too heavily on close-up fire, and I should have tried to gather more images in which fire appears farther away. I think I could improve the model by adding a label for smoke and working toward better differentiating bright smoke from fire.
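A toy color heuristic, entirely my own illustration and not part of the project, shows why flames and bright smoke are separable in principle: flames are strongly red/orange-dominant, while bright smoke is roughly grey (R ≈ G ≈ B), a cue a network could learn once a smoke label exists.

```python
import numpy as np

def fire_pixel_fraction(image):
    """Fraction of pixels whose red channel clearly dominates green and blue.
    Flame-colored pixels score high; grey smoke pixels score near zero.
    Thresholds here are arbitrary, chosen only for illustration."""
    r = image[..., 0].astype(int)
    g = image[..., 1].astype(int)
    b = image[..., 2].astype(int)
    fire_like = (r > 180) & (r > g + 40) & (r > b + 80)
    return fire_like.mean()

flame = np.zeros((8, 8, 3), dtype=np.uint8)
flame[...] = (230, 120, 30)                       # orange, flame-like
smoke = np.full((8, 8, 3), 200, dtype=np.uint8)   # bright grey, smoke-like

print(fire_pixel_fraction(flame))  # 1.0
print(fire_pixel_fraction(smoke))  # 0.0
```

A heuristic this crude would fail on sunsets or orange objects, which is exactly why a learned three-class model ("fire", "smoke", "nofire") seems like the better path.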

What's next for Do Drones Dream of Electric Fire?

I may try to improve my data set and experiment with the model further. I will definitely keep exploring image classification, and I would like to improve on this project after learning more.

Built With

Python, Microsoft Azure Machine Learning Services
