Inspiration

When we first formed our team for AthenaHacks, we planned to come together and brainstorm an idea. One of our members brought up how difficult it is to identify different types of flowers while walking around outside or going on hikes. Inspired by that problem, we decided to combine machine learning with mobile accessibility to create an application that recognizes different kinds of flowers for the user.

What it Does

Our app pairs a Node.js back end, integrated with Microsoft Azure Custom Vision, with a React Native front end to deliver a flower identifier: the user takes a picture of a flower and the app tells them what type of flower it is.
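
At a high level, the front end captures a photo, uploads it to the Node.js back end, and displays the flower name returned from the trained Custom Vision model. A minimal sketch of that round trip from the React Native side might look like the following (the server URL and the /classify route are illustrative placeholders, not our exact endpoint names):

```javascript
// Front-end sketch: upload a captured photo to the Node.js back end
// and read back the predicted flower name. The server URL and the
// /classify route are placeholders for illustration.
async function identifyFlower(photoUri) {
  const formData = new FormData();
  formData.append('photo', {
    uri: photoUri,        // file URI returned by the camera
    name: 'flower.jpg',
    type: 'image/jpeg',
  });

  const response = await fetch('http://localhost:3000/classify', {
    method: 'POST',
    body: formData,       // fetch adds the multipart boundary for us
  });

  const result = await response.json();
  return result.flower;   // e.g. "sunflower"
}
```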

How We Built It

We started with React Native in order to build a mobile app that would work on both iOS and Android. One team member was delegated to design and create mock-ups for multiple screens by sketching the flow of the app. We designed the application's home screen first, and then began adding our buttons and camera functionality.

We began implementing the camera function by following a React Native tutorial, then made it our own by adding buttons suited to our specific application. We then expanded the fonts available to us by adding Google Fonts for use within the app.
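
The capture flow itself boils down to asking for camera permission, rendering a live preview, and snapping a still image when a button is pressed. A rough sketch using Expo's Camera component (one common choice in React Native tutorials; our actual screen, buttons, and styling differed) could look like this:

```javascript
// Camera screen sketch using expo-camera: request permission, show a
// live preview, and take a still photo when the button is pressed.
import React, { useEffect, useRef, useState } from 'react';
import { View, Button, Text } from 'react-native';
import { Camera } from 'expo-camera';

export default function CameraScreen({ onPhoto }) {
  const cameraRef = useRef(null);
  const [hasPermission, setHasPermission] = useState(null);

  useEffect(() => {
    Camera.requestCameraPermissionsAsync().then(({ status }) =>
      setHasPermission(status === 'granted')
    );
  }, []);

  if (hasPermission === null) return <View />;            // still asking
  if (hasPermission === false) return <Text>No access to camera</Text>;

  return (
    <View style={{ flex: 1 }}>
      <Camera ref={cameraRef} style={{ flex: 1 }} />
      <Button
        title="Take picture"
        onPress={async () => {
          // takePictureAsync captures a still image (not video)
          const photo = await cameraRef.current.takePictureAsync();
          onPhoto(photo.uri);
        }}
      />
    </View>
  );
}
```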

Once the primary functions of the app were in place, we worked with Microsoft Azure Custom Vision to train a model to recognize nine different species of flowers. The last step was connecting our camera function to the trained model.
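
Connecting the camera to the trained model meant having the Node.js back end relay the raw image bytes to the published Custom Vision iteration's prediction endpoint and return the highest-probability tag. A hedged sketch of that route follows; the prediction URL and key come from the Custom Vision portal, while Express, multer, node-fetch, and the /classify route name are illustrative choices rather than exactly what we shipped:

```javascript
// Back-end sketch: an Express route that forwards the uploaded image to
// the Azure Custom Vision prediction endpoint and returns the best tag.
// PREDICTION_URL and PREDICTION_KEY come from the Custom Vision portal.
const express = require('express');
const multer = require('multer');
const fetch = require('node-fetch'); // v2-style require

const app = express();
const upload = multer(); // no options: uploaded file is kept in memory

const PREDICTION_URL = process.env.PREDICTION_URL; // .../classify/iterations/<name>/image
const PREDICTION_KEY = process.env.PREDICTION_KEY;

app.post('/classify', upload.single('photo'), async (req, res) => {
  const azureRes = await fetch(PREDICTION_URL, {
    method: 'POST',
    headers: {
      'Prediction-Key': PREDICTION_KEY,
      'Content-Type': 'application/octet-stream',
    },
    body: req.file.buffer, // raw image bytes from the phone
  });

  const { predictions } = await azureRes.json();
  // Pick the flower tag with the highest probability
  const best = predictions.reduce((a, b) => (a.probability > b.probability ? a : b));
  res.json({ flower: best.tagName, probability: best.probability });
});

app.listen(3000);
```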

Challenges We Ran Into

We did not expect how steep the learning curve for React Native would be. When we first began programming with it, it was unlike any framework we had used before. One of our team members also had issues installing and setting up React Native throughout the entire event.

During our first attempt at the camera function we ran into several complications, such as the dimensions of the camera screen, capturing a still picture instead of recording video, and transitioning from the home screen to the camera. Since Microsoft Azure Custom Vision is trained through an online web browser, we had many issues bridging between Custom Vision and React Native. Azure Custom Vision was very difficult to integrate into our Node.js back end for use in the React Native app.

Accomplishments that We're Proud Of

We back-end coded the machine learning through Microsoft Azure Custom Vision and bridged it to the React Native front end.

What We Learned

We were too ambitious in attempting to use React Native rather than Android Studio or Xcode. Lisa and Ashlyn, beginner hackers attending their first hackathon, learned more about HTML and JavaScript. Cheyenne and Grace gained insight into React Native and Node.js.

What's Next for Sunflower

Next steps include adding features such as a list of the flowers scanned by the app, additional information about each flower, and an option to create a bouquet of flowers that sends a message for an occasion like a birthday, party, or wedding.

Link to Repository

https://github.com/suncheyn/Sunflower

Built With

React Native, Node.js, Microsoft Azure Custom Vision