Inspiration

We’re taught to stay aware when walking alone: keep an earbud out, watch your surroundings. But human attention is limited. You can’t remember what someone looked like two minutes ago, and constantly checking behind you is exhausting. Vigil exists to extend situational awareness, not replace it.

What Vigil Does

Vigil is an AI-powered "peripheral vision" system for commuters: it analyzes rear-facing video to identify potential followers based on appearance patterns (height, build, clothing).

How I built it

The architecture uses a sequential processing queue to handle async detection and a consolidation system for duplicate tracking. For vision, I used the Overshoot SDK for real-time video analysis with LLM vision models. For matching, I wrote a custom semantic matching algorithm with weighted scoring (color families, compound tokens, bigram matching).
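The weighted scoring might look something like the sketch below. The weights, the color-family table, and the exact sub-scores are hypothetical stand-ins for my tuned values, but the idea is the same: combine exact token overlap, color-family matching (so "navy" and "blue" agree), and character-bigram overlap (which tolerates small spelling variations in LLM output) into one similarity score.

```python
# Hypothetical sketch of weighted appearance matching. Weights, color
# families, and thresholds are illustrative, not the tuned production values.

COLOR_FAMILIES = {
    "red": {"red", "maroon", "crimson"},
    "blue": {"blue", "navy", "teal"},
    "dark": {"black", "charcoal", "gray"},
}

def color_family(token):
    """Map a color word to its family, or None if it isn't a known color."""
    for family, members in COLOR_FAMILIES.items():
        if token in members:
            return family
    return None

def bigrams(text):
    """Set of overlapping two-character slices, e.g. 'navy' -> {'na','av','vy'}."""
    return {text[i:i + 2] for i in range(len(text) - 1)}

def similarity(desc_a, desc_b):
    a, b = desc_a.lower().split(), desc_b.lower().split()

    # Exact token overlap (Jaccard); compound tokens like "red-jacket"
    # only match here if they match exactly.
    token_score = len(set(a) & set(b)) / max(len(set(a) | set(b)), 1)

    # Color-family overlap: "navy jacket" and "blue jacket" agree on "blue".
    fams_a = {f for t in a if (f := color_family(t))}
    fams_b = {f for t in b if (f := color_family(t))}
    family_score = (
        len(fams_a & fams_b) / len(fams_a | fams_b) if fams_a | fams_b else 0.0
    )

    # Character-bigram overlap tolerates minor spelling drift between frames.
    bg_a, bg_b = bigrams(" ".join(a)), bigrams(" ".join(b))
    bigram_score = len(bg_a & bg_b) / max(len(bg_a | bg_b), 1)

    # Hypothetical weights; the real values need tuning against sightings.
    return 0.4 * token_score + 0.35 * family_score + 0.25 * bigram_score
```

With this weighting, "tall man navy jacket" vs. "tall man blue jacket" scores high because the color-family term absorbs the navy/blue mismatch, while two unrelated descriptions score near zero.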

Challenges I ran into

It was definitely a struggle to distinguish new people from people who had already appeared in the camera feed. And because my audience is commuters, I challenged myself to handle variable lighting, which posed many problems for a vision-based product.
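The consolidation step can be sketched roughly as follows. Each incoming detection is scored against known profiles; above a threshold it counts as a repeat sighting and is merged rather than duplicated, otherwise a new person is registered. The threshold and the simple token-overlap scorer here are illustrative placeholders for the full weighted matcher.

```python
# Simplified sketch of new-vs-repeat consolidation. The threshold and the
# token-overlap scorer are assumptions standing in for the full system.

MATCH_THRESHOLD = 0.5  # hypothetical cutoff; needs tuning on real footage

def token_similarity(a, b):
    """Jaccard overlap of description tokens (stand-in for the full scorer)."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / max(len(ta | tb), 1)

class Tracker:
    def __init__(self):
        self.profiles = []  # each: {"description": str, "sightings": int}

    def observe(self, description):
        # Find the closest known profile for this detection.
        best, best_score = None, 0.0
        for profile in self.profiles:
            score = token_similarity(description, profile["description"])
            if score > best_score:
                best, best_score = profile, score
        if best and best_score >= MATCH_THRESHOLD:
            best["sightings"] += 1  # repeat: consolidate, don't duplicate
            return best
        profile = {"description": description, "sightings": 1}
        self.profiles.append(profile)  # genuinely new person
        return profile
```

A follower then shows up as one profile with a climbing sighting count rather than a stream of one-off detections.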

Accomplishments that I'm proud of

I'm pretty proud of the similarity score system. It's not perfect, but I find its current state quite satisfying.

What I learned

Get a team! I don't regret going solo for my first hackathon because my goal was to learn to develop, and I'm pretty sure I accomplished that. But I was busy, anxious, and didn't get to meet many new people, which is a key part of the whole experience. Next time, for sure.

What's next for Vigil

Further refinement, research into mobile apps and hardware devices, and potentially exploring additional methods to accomplish my goal of a "passive, constant safety device."
