When we came into the hackathon, we didn't have a clear idea of what to build, but after researching environmental issues we found a major problem that needed fixing. In the farming world, annual crop losses of 30 to 50 percent are not uncommon. Why is this number so high? The main driver behind this shocking statistic is disease: plant viruses can decimate whole fields if given enough time to spread. So we built PlantAId.

Farms around the world provide us with 70% of all food, so this issue is clearly of utmost concern. This app has the potential to be revolutionary for farmers, who will be able to detect viruses in their plants before they spread, cutting annual crop losses due to disease by large margins. Whether the user is a farmer who depends on these crops for a living or a consumer going hungry from the resulting food shortage, and whether the disease strikes the leaves, stems, or buds of the plant, PlantAId is paving the path to a healthier future, one plant at a time.

What it does

PlantAId has three main functions. First, our machine learning model predicts what disease, if any, a given crop has. If a disease is detected, the user is taken to a screen detailing its symptoms and treatments; if the plant is healthy, they are taken to a screen explaining how to maintain its prosperity.
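The prediction-to-screen flow above can be sketched in a few lines of Python (a minimal illustration only; the function name, label strings, and screen identifiers are hypothetical, not taken from our actual Swift code):

```python
# Hypothetical sketch of how a model prediction routes the user to a screen.
# Label and screen names are illustrative, not our real identifiers.

def route_for_prediction(label: str) -> str:
    """Map a classifier label to the screen the user should see."""
    if label == "healthy":
        return "maintenance_screen"   # tips for keeping the plant healthy
    return f"disease_screen:{label}"  # symptoms and treatments for that disease

# Example routing decisions:
route_for_prediction("healthy")     # maintenance tips
route_for_prediction("apple_scab")  # symptoms and treatments for apple scab
```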

Our third feature is a disaster bot: farmers can type in concerns about their crops and receive answers from it. We believe this will save farmers precious time and give them ways to solve problems such as their crops being flooded or experiencing a drought.

How I built it

The main component of our application is the CoreML-based machine learning model. CoreML and CreateML are iOS machine learning frameworks that allowed us to build the machine learning functionality. We looked for datasets online to help train and test our model in Apple's CreateML interface, which makes it easy to label data and train a model. After training and testing, we used Apple's Vision framework to process user images with our model and output a result. We used AVCam, Apple's sample camera app, to build the camera interface within our app.
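One convenient part of CreateML's image classifier is that it infers each image's label from the folder it sits in, so organizing the dataset is most of the labelling work. That convention can be sketched in Python (the dataset paths and class names below are made up for illustration):

```python
from pathlib import PurePath

def label_from_path(image_path: str) -> str:
    """CreateML-style labelling: the parent folder name is the class label."""
    return PurePath(image_path).parent.name

# A made-up dataset layout: one folder per class.
paths = [
    "dataset/tomato_blight/img001.jpg",
    "dataset/tomato_blight/img002.jpg",
    "dataset/healthy/img003.jpg",
]
labelled = [(p, label_from_path(p)) for p in paths]
```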

Next, we used Keras and Python to train a natural language processing chatbot. With a few lines of code, our chatbot was able to analyze user questions, infer their intent, and provide appropriate responses. We researched common agricultural disasters and added solutions to help farmers in whatever situation they face.
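Our actual bot uses a small Keras model, but the core idea, mapping a user question to an intent and then replying from a canned response table, can be sketched with plain keyword matching (a simplified stand-in; the intents and responses below are illustrative, not our trained model):

```python
# Simplified stand-in for the Keras intent classifier: score each intent
# by keyword overlap with the user's question, then answer from a table.
INTENTS = {
    "flood":   {"keywords": {"flood", "flooding", "water", "rain"},
                "response": "Dig drainage channels and move equipment to high ground."},
    "drought": {"keywords": {"drought", "dry", "heat", "wilting"},
                "response": "Switch to drip irrigation and mulch to retain moisture."},
}

def reply(question: str) -> str:
    words = set(question.lower().split())
    # Pick the intent whose keyword set overlaps the question the most.
    best = max(INTENTS, key=lambda name: len(INTENTS[name]["keywords"] & words))
    if not INTENTS[best]["keywords"] & words:
        return "Sorry, I don't understand. Could you rephrase?"
    return INTENTS[best]["response"]
```

A real model generalizes beyond exact keywords, but this captures why narrow, well-separated intents matter: the fewer intents compete for the same vocabulary, the cleaner the match.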

Finally, we built a database and used SweeterSift and SnapKit to create a user-friendly UI that is easy to navigate. The database takes in data from user input and can store images and text for easy display. After a user enters their data, our crop tracker simply holds and displays whatever profile they created for their plant or crop.
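The crop tracker's store boils down to profiles keyed by name, each holding the text and image reference the user entered. A minimal sketch in Python (class and field names are hypothetical, not our real Swift schema):

```python
# Hypothetical sketch of the crop tracker's profile store.
# Field names ("notes", "image") are illustrative, not our real schema.
class CropTracker:
    def __init__(self):
        self.profiles = {}

    def add_profile(self, name: str, notes: str, image_ref: str) -> None:
        """Store or overwrite a crop profile under its name."""
        self.profiles[name] = {"notes": notes, "image": image_ref}

    def get_profile(self, name: str) -> dict:
        """Fetch a stored profile for display."""
        return self.profiles[name]

tracker = CropTracker()
tracker.add_profile("north-field tomatoes", "spots on lower leaves", "img_042.jpg")
```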

Challenges I ran into

A large obstacle we ran into was getting the chatbot system to work. We had originally planned for the chatbot to handle multiple scenarios, but we learnt that when we asked it to perform too many functions, it would often get confused and misunderstand what the user was asking. Narrowing its scope to the DisasterBot function worked far better for user input.

Another challenge was implementing the machine learning model, as analyzing plants required a large amount of labelled image data, which was hard to find and slow to train on. Originally, our model did not work as needed, and debugging and pinpointing the issues was tedious. Eventually, after re-training, the model started working as intended.

Accomplishments that I'm proud of

We are proud of our chatbot, as we had never dealt with natural language processing in Swift, and building a chatbot was a new and unique experience that we enjoyed. Working with Python and Keras was also a learning experience that could be useful in the future, and was good exposure to what NLP can do.

We are also proud of our machine learning model, as re-training and gathering data to make it accurate was both rewarding and troublesome. This was only our third or fourth time working with CoreML, so we were learning something new while building something that could have a real impact on food production and farming.

What I learned

We learnt a ton about NLP and how it works. From understanding intents and entities to training our model in Python, building a chatbot was far harder than we anticipated. We also learnt about the limits of NLP: when a bot is flooded with commands and disparate ideas, it has a hard time processing and training on them, which is why we kept the questions narrow and straightforward so our bot could perform adequately.

We also learnt a lot about UI frameworks such as SnapKit, with which we had little prior experience. We had previously relied solely on UIKit, but using different frameworks was educational and will be helpful for future apps. Building a clean UI seems like a daunting task, but simplicity and a few design concepts (proper white spacing, color blocking) allowed us to build a beautiful yet functional user interface.

What's next for plantAId

In the future, we will look at implementing a better centralized database system so farmers can keep track of thousands of plants with quicker scans. On top of this, adding more features to the chatbot would put an incredible amount of information at farmers' fingertips and would be a good extension to this app.

A huge addition to PlantAId would of course be more content, such as more crops that can be analyzed along with more diseases those crops could be carrying. With the clean format we have created, adding new plants and diseases will be a much simpler task.

Disclaimer: Our app file size was larger than 35 MB (the DevPost limit). If you would like to access the app, please use our GitHub repo instead.
