Inspiration

As some airplanes adopt self-driving systems, others will remain manually controlled. Since all aircraft cannot switch to synchronized autonomy at once, we need software to smooth the transition and prevent accidents during taxiing. Aircraft marshals already direct aircraft with hand motions, so we used computer vision to translate those signals into maneuvering instructions.

What it does

AeroVision lets you control a VIAM rover with simple hand gestures: move forward, move backward, turn right, turn left, and stop. Just like an aircraft marshal, it reads standard marshalling signals and converts them into robot movement.
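The five supported commands boil down to a lookup from recognized signals to rover actions. A minimal sketch, assuming illustrative signal labels and command strings (these names are our own, not the exact identifiers from the project):

```python
# Hypothetical mapping from recognized marshalling signals to rover commands.
# Signal names are illustrative; the real detector emits whatever labels the
# hand-tracking model produces.
SIGNAL_TO_COMMAND = {
    "arms_forward": "move_forward",
    "arms_back": "move_backward",
    "right_arm_out": "turn_right",
    "left_arm_out": "turn_left",
    "arms_crossed": "stop",
}

def command_for(signal: str) -> str:
    """Return the rover command for a recognized signal; default to stop."""
    # Defaulting to "stop" is the safe choice when a signal is unrecognized.
    return SIGNAL_TO_COMMAND.get(signal, "stop")
```

Defaulting to stop on an unrecognized signal mirrors how a real marshal interaction should fail safe.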

How we built it

We used VIAM's app, API, and Python SDK to make a VIAM rover respond to hand signals. We then used an OpenCV-based model to track our hands frame by frame on the webcam feed. From the hand-tracking output, we make automated decisions that drive the rover's wheels.
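The per-frame decision step can be sketched as a pure function over hand-landmark positions. This is a minimal sketch: we assume normalized (x, y) coordinates in [0, 1] like those common hand trackers emit, and the thresholds and gesture rules are illustrative, not our exact logic:

```python
# Decide a rover command from one frame's hand landmarks.
# Landmarks are assumed to be normalized (x, y) tuples in [0, 1], with the
# image origin at the top-left, as typical hand trackers report them.

def decide_command(wrist, index_tip):
    """Map the index fingertip's position relative to the wrist to a command.

    wrist, index_tip: (x, y) tuples. The thresholds below are illustrative,
    not the exact values from our project.
    """
    dx = index_tip[0] - wrist[0]
    dy = index_tip[1] - wrist[1]  # y grows downward in image coordinates
    if abs(dx) < 0.1 and dy < -0.2:
        return "move_forward"   # finger pointing up
    if abs(dx) < 0.1 and dy > 0.2:
        return "move_backward"  # finger pointing down
    if dx > 0.2:
        return "turn_right"
    if dx < -0.2:
        return "turn_left"
    return "stop"               # ambiguous pose: fail safe
```

In the real loop, a function like this would run on each webcam frame, and the resulting command would be sent to the rover's base through the VIAM Python SDK.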

Challenges we ran into

We ran into many challenges during this hackathon, one of them being the VIAM rover itself. Since we were new to its system and the network setup for running the machine, we had to adapt to VIAM's API and hardware. However, the VIAM team helped us every step of the way.

Accomplishments that we're proud of

Being able to integrate OpenCV with the VIAM rover

What we learned

How to use the VIAM rover, and how to implement its API using the Python SDK.

What's next for AeroVision

Better, more accurate tracking and more features built on the OpenCV model. We are also thinking about an AeroVision Pro Max Deluxe.
