Inspiration
We were inspired by the talk from John Deere and their thoughts on the history and future of how technology evolves. UIUC is, after all, a school built on this evolution, especially in the field of agriculture. The enhancement of agriculture through technology feeds billions and allows our current population to thrive, but the mass spraying of herbicides can be detrimental to our ecosystem and to human health, so proper weed management is at the forefront of what needs to be done.
What it does
Our autonomous vehicle navigates a region and detects obstacles in its path. It identifies each object, destroying it if it is a weed and avoiding it otherwise. The region to navigate, as well as the types of plants to avoid, are specified by the user to specialize the vehicle for each task.
How we built it
We used TensorFlow, a base training dataset, and the images the user submits to train a MobileNetV2 model to categorize weeds and other plants. We first train a new top layer of the model on the base training set, then fine-tune the first 100 layers at a smaller learning rate on the dataset augmented with user images. The performance of the model is evaluated against a separate held-out dataset.
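The two-phase training described above can be sketched as follows. This is an illustrative outline, not the team's actual code: function names are ours, and we pass `weights=None` to keep the sketch offline, whereas the real model would typically start from the ImageNet weights.

```python
import tensorflow as tf

def build_weed_classifier(num_classes: int = 2):
    # Phase 1: frozen MobileNetV2 base with a new trainable top layer.
    # (weights=None keeps this sketch offline; use weights="imagenet" in practice.)
    base = tf.keras.applications.MobileNetV2(
        input_shape=(224, 224, 3), include_top=False, weights=None)
    base.trainable = False

    model = tf.keras.Sequential([
        base,
        tf.keras.layers.GlobalAveragePooling2D(),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model, base

def unfreeze_for_fine_tuning(model, base, fine_tune_at: int = 100):
    # Phase 2: unfreeze the first `fine_tune_at` base layers and recompile
    # with a smaller learning rate for fine-tuning on the augmented dataset.
    base.trainable = True
    for layer in base.layers[fine_tune_at:]:
        layer.trainable = False
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-5),
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
```

After phase 1 converges, calling `unfreeze_for_fine_tuning` and running a second `model.fit` pass completes the pipeline.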
The device itself was built from a Raspberry Pi 4, a webcam, an ultrasonic sensor, two motors, a battery pack, a custom Pi HAT, and 3D-printed parts designed in Autodesk Inventor. The on-vehicle code is written in Python and uses the OpenCV library alongside starter vehicle-control code provided by John Deere.
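As an illustration of the ultrasonic sensor's role, an HC-SR04-style sensor reports an echo pulse whose duration is proportional to the round-trip distance to an obstacle. The conversion and threshold below are a hedged sketch (the threshold value and function names are our assumptions, not the project's actual parameters):

```python
OBSTACLE_THRESHOLD_CM = 20.0  # hypothetical stopping distance

def echo_to_distance_cm(pulse_duration_s: float) -> float:
    # Speed of sound is ~34300 cm/s; the pulse travels out and back,
    # so halve the round-trip distance.
    return pulse_duration_s * 34300.0 / 2.0

def is_obstacle(pulse_duration_s: float) -> bool:
    # Flag anything closer than the stopping distance so the vehicle
    # can pause and run the classifier on the webcam frame.
    return echo_to_distance_cm(pulse_duration_s) < OBSTACLE_THRESHOLD_CM
```

For example, a 1 ms echo corresponds to about 17 cm, close enough to trigger the classification step.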
Challenges we ran into
When integrating the categorization model into the vehicle, we encountered an issue where the accuracy of the model dropped significantly due to noise in the webcam frames. We took several steps to mitigate this, including preprocessing the webcam data and adjusting the webcam's position relative to the vehicle.
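One common way to preprocess noisy webcam data is temporal averaging: capture a short burst of frames of the (static) scene and average them, so zero-mean sensor noise cancels out. This sketch is one plausible approach, not necessarily the exact preprocessing the project used:

```python
import numpy as np

def denoise_frames(frames):
    # Average a burst of consecutive frames pixel-wise. The scene is
    # static while the vehicle is stopped, so averaging suppresses
    # random sensor noise without blurring the subject.
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0).clip(0, 255).astype(np.uint8)
```

The denoised frame can then be resized and normalized before being fed to the classifier.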
Another big challenge that we faced was motor balance. Since the motors had no encoders, the only adjustment we could make was to differentiate the motor speeds. This was often inconsistent, because small changes to the wheels and to the weight distribution on the device changed the balance, causing the vehicle to veer until the motor speeds were recalibrated.
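The open-loop speed-differentiation workaround can be expressed as a simple trim applied to a base speed. This is a hypothetical sketch of the idea (the trim convention and names are ours): a signed trim slows one motor to counteract the observed veer.

```python
def trim_motor_speeds(base_speed: float, trim: float):
    # trim in [-1, 1]: positive slows the left motor (vehicle steers
    # slightly left less / corrects a rightward veer... inverted for
    # negative trim, which slows the right motor). Without encoders,
    # this constant must be recalibrated whenever the balance changes.
    left = base_speed * (1.0 - max(trim, 0.0))
    right = base_speed * (1.0 + min(trim, 0.0))
    return left, right
```

Because there is no feedback, the trim constant is found empirically by test runs, which is exactly why small mechanical changes forced recalibration.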
Accomplishments that we're proud of
We trained a categorization model that differentiates weeds from other plants with an average accuracy of 90%.
We integrated the AI categorization with multiple sensors to make the device fully autonomous.
What we learned
We learned the importance of sleep and taking breaks in the design process.
We also learned the challenges and intricacies involved in integrating multiple technologies to solve a real-world problem.
What's next for Weed Slasher
Completing robust integration of all aspects of the project and building a more advanced weed-removal mechanism would take the project to the next level. More sophisticated image processing and more diverse datasets would also improve the accuracy and reliability of the vehicle's autonomous behavior.
Thank you so much to John Deere for providing the hardware that made this project possible.