Inspiration
Growing up in rural areas gave us a deeper understanding of the agricultural challenges faced by thousands of farmers. Pesticides and other harmful chemicals help fend off invasive species, but at what cost? We began to wonder how we could tackle this problem and create a solution that not only simplifies invasive species detection but also streamlines agricultural processes.
What it does
Iris is an autonomous quadrupedal robot that patrols farmland and agricultural areas and, using computer vision, detects and interacts with weeds and invasive plant species that could harm crops and agricultural output. With strong detection performance, it could be deployed on farms across the country, saving time and energy for busy farmers and improving agricultural production in a safe and sustainable manner, in particular by reducing the need for widespread pesticide use on crops.
How we built it
We built our autonomous robot system on the ROS2 framework using Python and C++. Our computer vision model is a MobileNet architecture built from scratch with TensorFlow Keras and trained by us on the DeepWeeds dataset from Nature.com. For simulation, we used the champ library, which provides quadruped robot models and integration with the Gazebo simulation environment. By integrating our ROS2 code with champ and feeding the video stream from the robot's camera into our computer vision model, we were able to simulate the performance of our Iris robot in a realistic agricultural environment; with access to a physical robot, the same code could be connected to hardware and run in real life. Finally, development and simulation required a Linux environment, so we built our own Docker container from an Ubuntu 22.04 image.
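As one illustration of the glue between the camera stream and the classifier, the post-processing step can be sketched in pure Python. The function and label names below are hypothetical (our actual node code differs); the nine classes follow the DeepWeeds dataset, which covers eight weed species plus a "negative" (no weed) class.

```python
# Hypothetical post-processing for the classifier output: map a softmax
# vector over the DeepWeeds classes to a detection, or None for "no weed".
# The class ordering here is an assumption; the training pipeline defines it.
DEEPWEEDS_CLASSES = [
    "chinee_apple", "lantana", "parkinsonia", "parthenium",
    "prickly_acacia", "rubber_vine", "siam_weed", "snake_weed",
    "negative",  # background / no weed present
]

def classify_frame(softmax, threshold=0.6):
    """Return (label, confidence) for a detected weed, or None if the
    frame is background or the model is not confident enough."""
    if len(softmax) != len(DEEPWEEDS_CLASSES):
        raise ValueError("expected one score per DeepWeeds class")
    best = max(range(len(softmax)), key=lambda i: softmax[i])
    label, conf = DEEPWEEDS_CLASSES[best], softmax[best]
    if label == "negative" or conf < threshold:
        return None
    return (label, conf)
```

In the simulation, a ROS2 node along these lines would call `classify_frame` on each model output from the camera stream and publish a detection message whenever it returns a label.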
Challenges we ran into
In order to access a Linux workspace for ROS2, we had to develop a custom Docker container. However, we faced challenges adapting the container for compatibility with both Windows and macOS. Additionally, the Boston Dynamics robot, Spot, struggled with position localization in our simulation. We also encountered some performance issues due to the complexity of the simulation without access to better compute.
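A minimal sketch of the kind of container we mean, assuming ROS2 Humble (the distribution that targets Ubuntu 22.04); the exact packages and setup steps in our container differ.

```dockerfile
# Sketch: a ROS2 development container on Ubuntu 22.04 (assumed: ROS2 Humble).
FROM ubuntu:22.04

# Avoid interactive prompts during package installation.
ENV DEBIAN_FRONTEND=noninteractive

# Add the official ROS2 apt repository, then install ROS2 Humble.
RUN apt-get update && apt-get install -y curl gnupg lsb-release && \
    curl -sSL https://raw.githubusercontent.com/ros/rosdistro/master/ros.key \
        -o /usr/share/keyrings/ros-archive-keyring.gpg && \
    echo "deb [signed-by=/usr/share/keyrings/ros-archive-keyring.gpg] http://packages.ros.org/ros2/ubuntu $(lsb_release -cs) main" \
        > /etc/apt/sources.list.d/ros2.list && \
    apt-get update && apt-get install -y ros-humble-desktop

# Source the ROS2 environment in every interactive shell.
RUN echo "source /opt/ros/humble/setup.bash" >> /root/.bashrc
CMD ["bash"]
```

Running the same image on Windows and macOS hosts is what exposed the cross-platform issues described above, since GUI tools like Gazebo need host-specific display forwarding.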
Accomplishments that we're proud of
We’re very proud of Iris’s performance in the simulated environment, and of building an autonomous robot that can move, see, and detect invasive plant species in its surroundings. The setup and configuration required was substantial, especially given that we each ran different operating systems while developing a robotic system, but we worked together to solve each issue that came up and delegated tasks based on the capabilities of our personal computers. It was a difficult challenge that we set for ourselves, and we are very proud of what we were able to achieve in the limited time we were given.
What we learned
Some of the coolest things we learned from this project include how to set up a digital simulation environment for robots, how to integrate a computer vision model into a ROS2 system and connect it to a live video stream, and how to set up our own ROS2 environment on a custom Linux Docker image.
What's next for Iris
Future improvements to Iris include extending its capabilities beyond invasive species detection to autonomously cutting and collecting invasive plants, further reducing the need for manual labor, time, and chemical interventions such as pesticides. We hope one day to gain access to a real robot, connect our code to it, and demonstrate its capabilities in a real-life scenario.