Inspiration

We noticed that the best presentations are the ones where the speaker is actually moving around and talking to the audience, not stuck behind a laptop clicking through slides. We wanted to push toward a future where you can control your computer just by moving. That idea turned into Sea Swipe, a game that helps us collect the gesture data we need to make that possible.

What it does

Sea Swipe is a gesture‑detection system that lets you control your computer using only a camera. Right now, we’re using it for hands‑free presentation control so presenters can walk around, stay engaged, and never worry about a clicker. We also built a small swipe‑based game — not as the main product, but as a fun way to collect the gesture data we need. That data will eventually help us move away from hardcoded rules and train a full ML model, giving Sea Swipe the potential to grow into a much more powerful gesture‑interaction tool in the future.

How we built it

We built Sea Swipe using Python for all of the motion tracking. MediaPipe handles the body‑landmark detection, and we use PyAutoGUI to actually interact with the computer so we can control presentations hands‑free. For now, we hard‑coded the rules for left and right swipe gestures using either arm, which works well enough to demo the idea.
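A hard‑coded swipe rule like the one described above can be sketched as a small state machine over recent wrist positions. This is a minimal illustration, not Sea Swipe's actual code: the thresholds, window size, and class names here are hypothetical, and the x‑values are assumed to be MediaPipe‑style normalized coordinates in [0, 1].

```python
from collections import deque

SWIPE_DISTANCE = 0.30   # hypothetical: fraction of frame width the wrist must travel
WINDOW = 10             # hypothetical: number of recent frames to consider

class SwipeDetector:
    """Tracks one wrist's x-position and fires on a fast horizontal sweep."""

    def __init__(self):
        self.history = deque(maxlen=WINDOW)

    def update(self, wrist_x):
        """Feed one frame's wrist x-position; return 'left', 'right', or None."""
        self.history.append(wrist_x)
        if len(self.history) < self.history.maxlen:
            return None
        delta = self.history[-1] - self.history[0]
        if delta > SWIPE_DISTANCE:
            self.history.clear()   # reset so one sweep fires once
            return "right"
        if delta < -SWIPE_DISTANCE:
            self.history.clear()
            return "left"
        return None

# In the real pipeline, the x-values would come from MediaPipe's pose
# landmarks each frame, and a detected swipe would trigger
# pyautogui.press("right") / pyautogui.press("left") to advance slides.
detector = SwipeDetector()
events = []
for x in [0.2] * 9 + [0.6]:   # wrist sweeps sharply right on the last frame
    event = detector.update(x)
    if event:
        events.append(event)
print(events)  # → ['right']
```

Clearing the history after a detection is one simple way to keep a single arm sweep from registering as several swipes in a row.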

We also built a small side game to help us collect gesture data. The UI/UX for the game is done in R because it let us build the interface quickly, and the backend runs in Python so all the motion tracking and data logging stays consistent. The game isn’t the main product; it’s just our way of gathering the data we need to eventually train a real ML model.
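One lightweight way to bridge a Python backend and an R front end is to have the Python side append gesture events to a JSON‑lines log that the R UI polls (for example with `jsonlite::stream_in()`). This is a hypothetical sketch — Sea Swipe's actual bridge mechanism isn't described here — and the file name and field names are made up for illustration:

```python
import json
import os
import tempfile
import time

def log_gesture(path, direction, confidence):
    """Append one gesture event as a single JSON line the R UI can poll."""
    event = {"t": time.time(), "gesture": direction, "confidence": confidence}
    with open(path, "a") as f:
        f.write(json.dumps(event) + "\n")

# Hypothetical log path; the R side would watch this same file.
log_path = os.path.join(tempfile.gettempdir(), "sea_swipe_events.jsonl")
open(log_path, "w").close()          # start each session with a fresh log

log_gesture(log_path, "left", 0.92)
log_gesture(log_path, "right", 0.88)

with open(log_path) as f:
    events = [json.loads(line) for line in f]
print([e["gesture"] for e in events])  # → ['left', 'right']
```

Appending one complete JSON object per line keeps each write atomic enough for a polling reader, which sidesteps most of the cross‑language coordination headaches.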

Challenges we ran into

We ran into a lot of technical challenges while building Sea Swipe. Getting all the code to actually work together took far more time than expected. Integrating API keys for Gemini and Human Delta wasn’t straightforward, and getting consistent computer‑vision gesture detection was its own challenge. We also had to figure out how to connect Python to R so the game UI could talk to our backend. Each of these pieces communicates in a different way, so getting them to talk to each other smoothly was definitely one of the hardest parts.

Accomplishments that we're proud of

We successfully built a swipe‑gesture detection system that works with basically any presentation software, using just a camera and no extra hardware. We also integrated a fun side game that helps us gather real ML training data without forcing people to do boring, repetitive motions. On top of that, we’re really proud of the UI/UX we built for the game and the fact that we hooked it up to both the Gemini API and the Human Delta API to generate unique questions every time you play. That means we don’t have to hard‑code our own questions or answers, and the game stays fresh no matter how many times someone tests it.

What we learned

This was the first hackathon for most of our team, and we only had one experienced member going in. For two of us, it was our first time being part of an actual team coding project, so we had to learn how to split tasks, communicate, and build different parts of the system at the same time. We also got real experience using GitHub as a team instead of individually: we each developed our own components and merged everything together into one working project. It was a huge learning experience for all of us.

What's next for Sea Swipe

Our next step is to release the Sea Slide game to the public so we can start collecting real swipe‑gesture data from a wide range of players. The more real‑world data we get, the better our ML model will become once we move past hard‑coded rules. We also want to work with Google Slides and explore integrating our hands‑free presentation‑control system directly into their platform so anyone can present without a clicker.

On top of that, we realized Sea Swipe has potential way beyond presentations. With accurate camera‑based gesture detection, a regular computer could basically act like an interactive console, similar to the Wii or Nintendo Switch, but without any controllers at all. This could open the door to a new era of motion‑based video game technology powered entirely by a webcam.
