The commonly available Arduino Uno is not powerful enough to run or train a machine learning model. Hence, we decided to connect a model running in Python on our computers to an Uno board running Arduino code, to see whether we could sense gestures and act on them in the real world: controlling the brightness of an LED, driving a motor, or eventually even a robot.

What it does

The machine learning model we use can recognise four gestures: peace, okay, palm, and 'L'. We take a video stream from the computer's webcam and run each frame through the model to classify it as one of the four gestures. Depending on which gesture the model detects, a command is sent to the connected Arduino.
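Concretely, the per-frame decision reduces to picking the highest-scoring class from the model's output. A minimal sketch of that step (the label order and the confidence threshold here are assumptions for illustration, not details of the actual model):

```python
# Labels assumed to match the order of the model's output layer.
GESTURES = ["peace", "okay", "palm", "L"]

def gesture_from_scores(scores, threshold=0.6):
    """Return the gesture whose score is highest, or None if the model is unsure."""
    best = max(range(len(scores)), key=lambda i: scores[i])
    return GESTURES[best] if scores[best] >= threshold else None
```

In the real loop, `scores` would be the softmax output of the model on a preprocessed webcam frame; returning `None` for low-confidence frames keeps spurious commands off the serial port.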

In this case, the commands change the brightness of an LED using PWM. This is just a simple demonstration of what the setup can do: we can connect multiple devices to the Arduino and issue different types of commands to control almost anything. We can also send commands back to the computer from the Arduino, turning the board into a makeshift control surface for a PC application.
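As an illustration, the gesture-to-command mapping can be as simple as a lookup table plus a tiny text framing. The brightness values and the `B<value>\n` framing below are hypothetical choices, not the exact protocol we shipped:

```python
# Hypothetical duty-cycle assignments (0-255, matching Arduino's analogWrite range).
BRIGHTNESS = {"peace": 64, "okay": 128, "palm": 255, "L": 0}

def encode_command(gesture):
    """Frame a brightness command as ASCII bytes, e.g. b'B128\n'."""
    return f"B{BRIGHTNESS[gesture]}\n".encode("ascii")
```

Adding a new device is then a matter of adding another command letter and teaching the Arduino side to dispatch on it.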

How we built it

We first looked for machine learning models on the web pertaining to hand gestures and picked a ready-made model to speed up our design process. The main challenge was capturing frames from the laptop's webcam with cv2 and processing them through the model. We built a simple Arduino circuit with an LED connected to a PWM pin through a resistor. In the Arduino code, we defined a simple protocol that changes the brightness of the LED based on commands received over the COM port.
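On the Arduino side, the sketch just has to parse each incoming command and clamp the value into PWM range before calling `analogWrite`. Here is that parsing logic mocked up in Python for illustration, assuming a hypothetical `B<value>` framing rather than our exact protocol:

```python
def parse_command(line):
    """Return a PWM duty cycle (0-255) from a command like 'B128', or None if malformed."""
    line = line.strip()
    # Reject anything that is not a 'B' followed by a plain decimal number.
    if not line.startswith("B") or not line[1:].isdigit():
        return None
    # Clamp into the 8-bit range analogWrite expects.
    return min(255, max(0, int(line[1:])))
```

Rejecting malformed input matters here: serial lines can deliver partial reads, so the parser should never trust that a full, well-formed command arrived.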

On the laptop side, we used the pyserial library to send commands whenever we got a successful classification from the machine learning model.
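A sketch of that sender, with a guard so we only write when the detected gesture actually changes (the brightness values and the `B<value>\n` framing are assumptions, as is the port object's interface):

```python
class GestureSender:
    """Write a brightness command to a serial-like port when the gesture changes."""

    BRIGHTNESS = {"peace": 64, "okay": 128, "palm": 255, "L": 0}  # hypothetical values

    def __init__(self, port):
        self.port = port   # anything with .write(bytes), e.g. a pyserial serial.Serial
        self.last = None   # last gesture we sent a command for

    def update(self, gesture):
        """Send a command for a newly detected gesture; return True if one was sent."""
        if gesture is None or gesture == self.last:
            return False
        self.port.write(f"B{self.BRIGHTNESS[gesture]}\n".encode("ascii"))
        self.last = gesture
        return True
```

Each classified frame feeds `update()`; unrecognised frames (`None`) and repeats are ignored, which keeps the COM port from being flooded at webcam frame rates.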

Challenges we ran into

  • Integrating hardware and software was a challenge, since we had never approached it in this way.
  • Creating a video stream from the laptop's webcam and feeding it through the model was tricky, as we were doing it in real time.

Accomplishments that we're proud of

  • Establishing a protocol to convey commands to the Arduino.
  • Establishing a real-time video feed from the webcam and feeding it through the model.

What we learned

We learned how to integrate hardware and software using serial communication, and how to do real-time classification with Keras.

What's next for Aiduino

Expanding the communication protocol to include more devices.
