Image 1 Robot
Image 2 Robot inside
Image 3 Image recognition
Image 4 Arduino101
Image 5 Arduino101 setup 1
Image 6 Arduino101 setup 2
Image 7 Application in cloud
Image 8 Text to speech
Our solution is about water conservation and is based on a volunteer project called Poseidon. With our concept, built around an Arduino101, you optimize the conditions to grow plants effectively and, most importantly, save enormous amounts of water. Using sensors such as a soil moisture sensor, you always give the plant the right amount of water: not too much, not too little. We added some extra sensors to measure more parameters and to see whether the environment around the plant influences the moisture level of the soil.
You can extend this concept to whole fields of crops, measuring all kinds of parameters to help your crops grow better with the exact amount of water they need, and conserving even more water.
What it does
Our solution consists of several connected components that work together. This section gives a step-by-step overview; the next section, "How we built it", explains each step in more detail.
1. A picture of the plant is taken and analyzed to determine the type of plant and the recommended soil moisture level for that plant.
2. Sensor data, including soil moisture, is read and compared with the recommended soil moisture level.
3. Real-time data from the sensors is sent to a dashboard and a mobile device.
4. A welcome message is spoken when the owner of the plant approaches it, and the status of the plant is spoken out and tweeted.
5. Data is stored in a database for further analytics on the influence of different parameters around the plant.
How we built it
Step 1. Because different plants need different conditions to grow efficiently, we start by taking a picture of the plant. For that we created a simple robot ( link ) based on a Raspberry Pi. The robot is connected via Wi-Fi and has a speaker and a camera attached; see images 1 and 2. On the robot we run a Node-RED application (Node-RED is an open-source programming tool based on Node.js). With this application we control the camera to take a picture, and in the same application we analyse the picture with the Watson Visual Recognition API to determine the type of plant. The API comes with a database of many objects, including plants. When we send the picture to the API, the result is a list of recognized objects, each with a confidence level. The object with the highest confidence level is spoken out via the speaker; in our case, the plant was always recognized with the highest confidence. See image 3 for the application. Once we know the type of plant, we know the right soil moisture level.
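The selection of the most confident label can be sketched as a small Node-RED function node. The response shape below follows the Visual Recognition v3 API; the class names and scores are invented example values, not real API output from our plant:

```javascript
// Pick the most confident label from a Watson Visual Recognition result.
// Runs as-is in Node.js; in Node-RED this would be the body of a function node.
function topClass(result) {
  // Flatten all classifiers' results into one list of {class, score} entries.
  const classes = result.images[0].classifiers.flatMap(c => c.classes);
  // Keep the entry with the highest confidence score.
  return classes.reduce((best, c) => (c.score > best.score ? c : best));
}

// Hypothetical response fragment for illustration:
const response = {
  images: [{
    classifiers: [{
      classes: [
        { class: "basil", score: 0.92 },
        { class: "herb",  score: 0.71 },
        { class: "plant", score: 0.65 }
      ]
    }]
  }]
};

console.log(topClass(response).class); // → "basil"
```

The winning label is what the robot speaks out and what we use to look up the recommended soil moisture level.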
Step 2. The heart of our solution is the Arduino101 (see image 4) with connected sensors that measure data around the plant: a soil moisture sensor, a temperature sensor, a light sensor and an air pressure sensor. They are all connected to the Arduino101 via a Grove board; see images 5 and 6 for the setup. For these sensors we used the corresponding Arduino libraries to read the sensor data. We send part of the sensor data (soil moisture) via the BLE functionality of the Arduino101 to a mobile device. We also combine all the data and send it via MQTT (over an Arduino Wi-Fi shield) to our Node-RED application running on Bluemix, IBM's cloud developer platform, which offers a range of APIs we can use for free. See image 7 for this application.
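The combining step can be sketched as follows. On the Arduino101 this is plain C++; the JavaScript below is only an illustration of the logic. The calibration constants for the soil moisture sensor and the payload field names are assumptions, not values from our actual setup:

```javascript
// Sketch of the Arduino-side logic: convert the raw analog soil moisture
// reading to a percentage and pack all sensor values into one MQTT payload.
const RAW_DRY = 0;    // assumed analog reading in dry air
const RAW_WET = 950;  // assumed analog reading in water

function moisturePercent(raw) {
  const pct = ((raw - RAW_DRY) / (RAW_WET - RAW_DRY)) * 100;
  return Math.max(0, Math.min(100, Math.round(pct))); // clamp to 0..100
}

function buildPayload(readings) {
  // One JSON message with all sensor values, published on a single topic.
  return JSON.stringify({
    moisture: moisturePercent(readings.moistureRaw),
    temperature: readings.temperature, // degrees Celsius
    light: readings.light,             // raw light-sensor value
    pressure: readings.pressure        // hPa
  });
}

console.log(buildPayload({ moistureRaw: 475, temperature: 21.5, light: 310, pressure: 1013 }));
```

Publishing everything as one JSON message keeps the MQTT side simple: one topic, one subscriber in the cloud.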
This Node-RED application compares the recommended soil moisture level for the plant with the measured level coming from the Arduino101. The result, whether the level is in the right range or too low, is sent via MQTT to the robot to be spoken out.
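A minimal sketch of that comparison, as a Node-RED function node would do it. The recommended ranges per plant type are placeholder values, not agronomic data:

```javascript
// Compare a measured soil moisture percentage with the recommended range
// for the recognized plant type; the result ("right" or "low") goes to MQTT.
const RECOMMENDED = {
  basil:  { min: 40, max: 60 },  // assumed range
  cactus: { min: 10, max: 30 }   // assumed range
};

function moistureStatus(plant, measured) {
  const range = RECOMMENDED[plant];
  if (!range) return "unknown";            // plant type not in our table
  if (measured < range.min) return "low";  // robot reports the plant is too dry
  return "right";                          // in (or above) the right range
}

console.log(moistureStatus("basil", 30)); // → "low"
console.log(moistureStatus("basil", 50)); // → "right"
```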
Step 3. We send the data retrieved from the connected sensors in two directions:
A. Via BLE to a mobile device, to show the real-time status of the plant. When the owner approaches the plant with a mobile device, the Arduino101 automatically connects to it. In the nRF Toolbox app, a graph then shows the real-time soil moisture data from the plant. When a connection is made, a signal is also sent from the Arduino101 via Node-RED in the cloud to Node-RED on the Raspberry Pi (all via MQTT) to speak out a welcome message and the status of the plant. See image 8 for this part of the application.
B. Via MQTT, over the connected Wi-Fi shield, to our Node-RED application in the cloud. On the Arduino101 we set up the MQTT connection to a broker, based on PubSub. In Bluemix we deployed the Node-RED boilerplate (https://console.bluemix.net/catalog/starters/node-red-starter?env_id=ibm:yp:eu-gb) as a starting point for our application. This Node-RED application adapts the data to make it visible on a dashboard: it connects to the same broker and topic as the Arduino101, so the data automatically flows into the application. We parse the JSON string from the data flow and route each value to the right graph on the dashboard, using the Node-RED dashboard extension.
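The routing step can be sketched as a function node that fans the combined JSON out into one message per value, with the topic selecting the chart. The topic names here are illustrative:

```javascript
// Split the combined sensor JSON into separate messages for the dashboard:
// each value becomes its own message, the key becomes the topic.
function toDashboard(msg) {
  const data = JSON.parse(msg.payload);
  return Object.entries(data).map(([key, value]) => ({
    topic: key,      // e.g. "moisture" routes to the moisture gauge
    payload: value
  }));
}

const msgs = toDashboard({ payload: '{"moisture":50,"temperature":21.5}' });
console.log(msgs);
```

In Node-RED, returning an array like this from a function node emits each element as a separate message, which a switch node can then route by topic to the individual dashboard widgets.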
This JSON string also contains the soil moisture value, and the application determines whether the value is in the right range or too low. The result (right or low) is sent to the MQTT broker on a different topic. The robot is also subscribed to this topic and receives the value.
Step 4. When someone with a mobile device comes within reach of the Arduino101's BLE signal, the Arduino101 automatically connects to the device and sends a welcome text to an MQTT broker on a certain topic. The Node-RED application in the cloud receives this text via that broker and topic, and forwards it via MQTT to the robot, where it is spoken out via the Watson Text-to-Speech API. The welcome text is also sent to Twitter.
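The fan-out in the cloud can be sketched as a function node with two outputs: one toward the robot (which hands the text to Text-to-Speech) and one toward a Twitter-out node. The topic name is an assumption:

```javascript
// Forward the incoming welcome text in two directions: to the robot over
// MQTT (where Watson Text-to-Speech speaks it) and to a Twitter-out node.
function routeWelcome(msg) {
  const text = msg.payload;
  const toRobot   = { topic: "plant/speak", payload: text }; // MQTT-out → robot
  const toTwitter = { payload: text };                       // Twitter-out node
  return [toRobot, toTwitter]; // one message per function-node output
}

console.log(routeWelcome({ payload: "Welcome back! Your plant is doing fine." }));
```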
Step 5. We also store the data in a NoSQL database (Cloudant). We do this in the Node-RED application in the cloud, after adding a timestamp. This allows further analysis of whether the measured sensor data influences the growth of the plant.
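The timestamping step before the Cloudant out-node can be sketched as follows; the document field names are illustrative:

```javascript
// Stamp each sensor reading before the Cloudant out-node stores it as a
// document, so later analytics can correlate values over time.
function stamp(msg) {
  const doc = JSON.parse(msg.payload);
  doc.timestamp = new Date().toISOString(); // ISO 8601, sorts chronologically
  msg.payload = doc; // Cloudant out-node stores msg.payload as the document
  return msg;
}

console.log(stamp({ payload: '{"moisture":50,"temperature":21.5}' }).payload);
```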
Challenges we ran into
We had a lot of interference between the different sensors, MQTT, BLE and the Wi-Fi shield, and sometimes the MQTT connection was unstable. We changed the order of operations and moved some connections to different pins to get the best results. We also wanted to use the air quality sensor, but we could not get it to return data in the right format. In some cases we had to move code into loop() to get things running smoothly.
Accomplishments that we’re proud of
We are proud that we got this complex setup working the way we wanted. It is probably not the most efficient way, but it works! When we started, we did not even know the Arduino101 board existed, and now we know all its ins and outs and how to combine it with Grove boards, sensors, APIs, etc. We are already thinking of other projects where we could use the Arduino101.
What we learned
We learned that the BLE feature can be very helpful for extending an app or adding extra features to it, and it is very easy to use with the provided examples. Almost all the libraries we needed are available for the board and sensors, and we found that the Arduino101 is compatible with the Grove board and most sensors.
What's next for Water conservation with a cognitive plant
The next step is to optimize the code and extend the solution with more sensors, for example to measure minerals in the soil, to gain more data and do more extensive analytics. That way, you can determine the optimal environment for the plant to grow. We would also like to add a valve to the solution to water the plant automatically. Finally, we would like to extend our solution to a whole field of crops, with more sensors in the ground, to see if we can optimize the growth of a field as well. This way we could really make a difference in saving water!