INSPIRATION

Too many divers die every year from something as simple as task overload. Diving is one of the most dangerous professions in America, but it is also one of the most important. When I enter the water, even basic tasks become complex. I have to manage life support systems, navigation, communication, and precise mission duties all at once.

This cognitive overload can make even simple actions fatal, especially for new divers. That is why I built DiveBuddy.

WHAT IT DOES

DiveBuddy is a semi-autonomous underwater submersible robot that I control through hand gestures, since divers cannot speak underwater.

It allows me to offload simple tasks, gather information, and connect with the surface when needed. Although divers operate in buddy pairs, in many professional and commercial diving settings each diver has their own responsibilities. I may not have time to answer questions like "What species is this?" or analyze environmental data while also managing my own safety.

DiveBuddy reduces task loading so I can focus on mission critical operations and survival.

HOW I BUILT IT

I built the frame using cut PVC pipe.

I used 12 volt DC motors and waterproofed them with marine grease.

I wired the system using Ethernet cabling to maintain consistent signal transmission.

I integrated a camera for visual tracking and gesture recognition.

I programmed real-time hand classification so the system can interpret commands such as UP, GO, STOP, LEFT, and RIGHT.
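The classification step can be sketched as a simple rule over hand-landmark coordinates. This is a minimal illustration rather than my exact logic: the landmark indices follow MediaPipe's 21-point hand model, and the thresholds and the `classify_gesture` helper name are assumptions for demonstration.

```python
# Minimal sketch of rule-based gesture classification over 2D hand landmarks.
# Landmarks are (x, y) pairs in normalized image coordinates (y grows downward),
# indexed like MediaPipe's 21-point hand model: 0 = wrist, 8 = index fingertip.
# The rules and thresholds below are illustrative assumptions, not the real ones.

def classify_gesture(landmarks):
    """Map a list of 21 (x, y) landmarks to a command string."""
    wrist_x, wrist_y = landmarks[0]
    tip_x, tip_y = landmarks[8]          # index fingertip
    dx, dy = tip_x - wrist_x, tip_y - wrist_y

    if abs(dx) < 0.05 and abs(dy) < 0.05:
        return "STOP"                    # fingertip near wrist: closed fist
    if abs(dy) > abs(dx):
        return "UP" if dy < 0 else "GO"  # vertical pointing dominates
    return "LEFT" if dx < 0 else "RIGHT"
```

In the real system these landmarks would come from MediaPipe frame by frame, so the rules have to tolerate jitter; a rule-based classifier like this is cheap enough to run on every frame.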

TECHNOLOGY

DiveBuddy runs on Python and uses computer vision and AI tools for real-time interaction.

I used OpenCV for video capture and processing. I used MediaPipe for hand tracking and gesture classification. I used PySerial to communicate with the microcontroller that drives the motors. The system processes gestures, classifies movement commands, and transmits them over serial at 115200 baud.
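The serial link can be sketched as follows. The newline framing, the `encode_command` helper, and the port name `/dev/ttyUSB0` are assumptions for illustration; only the 115200 baud rate comes from the project itself.

```python
# Sketch of the command link: each recognized gesture is framed as a short
# ASCII line and written to the motor microcontroller over a 115200-baud
# serial port using PySerial. Framing and port name are assumed for this demo.

VALID_COMMANDS = {"UP", "GO", "STOP", "LEFT", "RIGHT"}

def encode_command(command):
    """Frame a command as a newline-terminated ASCII packet."""
    if command not in VALID_COMMANDS:
        raise ValueError(f"unknown command: {command}")
    return (command + "\n").encode("ascii")

def send_command(command, port="/dev/ttyUSB0"):
    """Open the serial link and transmit one framed command."""
    import serial  # PySerial; imported here so the sketch runs without hardware
    with serial.Serial(port, baudrate=115200, timeout=1) as link:
        link.write(encode_command(command))
```

Keeping the encoding separate from the port I/O makes the framing easy to test without a microcontroller attached.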

I also integrated the ChatGPT API to provide higher-level reasoning. While the robot handles movement and gesture recognition locally, the ChatGPT API acts as a cognitive assistant layer. When I trigger specific gestures, the system can simulate dive analysis, interpret telemetry data such as depth and temperature, and generate contextual insights, such as identifying a species encounter. This transforms DiveBuddy from a remote-controlled vehicle into an AI-assisted diving companion that reduces both physical and cognitive task loading.
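The cognitive-assistant layer can be sketched like this: telemetry is packed into a chat prompt and sent to the ChatGPT API. The prompt wording, the model name, and the helper names are assumptions for illustration, not DiveBuddy's exact implementation.

```python
# Sketch of the cognitive-assistant layer. Telemetry readings are formatted
# into chat messages and sent to OpenAI's chat completions API. The system
# prompt and model choice below are assumptions made for this example.

def build_dive_prompt(telemetry):
    """Turn a telemetry dict into chat messages for the ChatGPT API."""
    readings = ", ".join(f"{k}={v}" for k, v in sorted(telemetry.items()))
    return [
        {"role": "system",
         "content": "You are a dive assistant. Analyze telemetry and reply "
                    "with concise, safety-focused insights."},
        {"role": "user", "content": f"Current readings: {readings}"},
    ]

def ask_dive_assistant(telemetry):
    """Send the prompt to the ChatGPT API (requires an OpenAI API key)."""
    from openai import OpenAI  # imported here so the sketch runs offline
    client = OpenAI()
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model choice for this sketch
        messages=build_dive_prompt(telemetry),
    )
    return response.choices[0].message.content
```

Separating prompt construction from the API call keeps the reasoning layer testable and makes it easy to swap models later.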

The system also simulates dive telemetry including depth, water temperature, and light levels to replicate a realistic underwater environment.
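A telemetry simulator of this kind can be sketched as a bounded random walk. The value ranges and drift sizes below are assumptions chosen for demonstration, not calibrated to real dive data.

```python
import random

# Minimal telemetry simulator: a random walk around plausible values for
# depth, water temperature, and ambient light. Each reading drifts a small,
# bounded amount from the previous one. Ranges here are illustrative only.

def simulate_telemetry(prev=None, rng=random):
    """Return the next simulated reading, drifting from the previous one."""
    if prev is None:
        prev = {"depth_m": 10.0, "temp_c": 18.0, "light_pct": 60.0}

    def step(value, lo, hi, jitter):
        # Drift by up to +/- jitter, clamped to the physical range [lo, hi].
        return min(hi, max(lo, value + rng.uniform(-jitter, jitter)))

    return {
        "depth_m": step(prev["depth_m"], 0.0, 40.0, 0.5),
        "temp_c": step(prev["temp_c"], 4.0, 30.0, 0.2),
        "light_pct": step(prev["light_pct"], 0.0, 100.0, 2.0),
    }
```

Feeding each reading back in as `prev` produces a smooth, plausible dive profile instead of uncorrelated noise.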

CHALLENGES I RAN INTO

Scope creep was one of my biggest challenges. I initially wanted everything to be fully autonomous, including advanced underwater AI and visual tracking.

I quickly learned the limits of time, money, and physical engineering constraints. Waterproofing electronics, stabilizing motors, and maintaining signal integrity underwater are all extremely difficult. Underwater robotics is far more complex than building systems on land.

ACCOMPLISHMENTS I AM PROUD OF

I built a functioning underwater submersible under tight time constraints.

I implemented gesture-based control in real time.

I successfully waterproofed motors and electrical systems.

I integrated AI reasoning through the ChatGPT API.

I created a working prototype that demonstrates real world application.

WHAT I LEARNED

Underwater robotics is extremely challenging. I now understand why military and research institutions invest heavily in this field. The underwater environment is unforgiving. Every design decision matters. Safety, redundancy, and simplicity are critical.

I also learned that reducing cognitive load can be just as powerful as increasing capability.

WHAT IS NEXT FOR DIVEBUDDY

I plan to integrate all systems into a fully sealed submersible housing.

I want to improve stability and control precision.

I will increase battery reliability and safety.

I plan to test the system in real diving conditions at Catalina Island.

I will continue refining the AI integration so DiveBuddy can better assist divers in high stress environments.
