Inspiration

As a Biotechnology student, I have always been fascinated by sensory substitution: the idea that the human brain can learn to process information from one sense through another. Traditional tools for the blind often feel like external attachments rather than extensions of the body. I wanted to build a "digital nerve" that could act as a standalone sensory system, turning a smartphone into a high-speed peripheral for the brain. This project was born from the desire to give users back their proprioception, a natural sense of their body's position in 3D space.

What it does

Glial is a silent-first, precision navigation engine. Unlike typical assistants that talk constantly, Glial stays silent until a hazard is detected or a turn is required, preventing "audio fatigue." It uses the phone's 0.5x ultra-wide lens to provide a 120-degree field of vision, scanning from floor to ceiling simultaneously. It can identify traffic lights, find specific door handles, read text, and even perform a "Stuff Check" to make sure the user hasn't forgotten their keys or wallet before leaving.

How we built it

The core of Glial is the Gemini 3 API, chosen for its near-instant spatial reasoning and multimodal processing. We bypassed standard camera settings to force the use of the ultra-wide lens, giving the AI "peripheral vision." For macro navigation, we integrated Google Maps to provide seamless street-level guidance. The entire experience is voice-activated, allowing for a completely hands-free, "inch by inch" navigation loop.
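Bundling everything the model needs into one request is what keeps the loop to a single round trip. A minimal sketch of such a payload, assuming a JSON envelope with a base64-encoded frame (the key names here are hypothetical, not Glial's or Gemini's actual schema):

```python
import base64
import json

def build_nav_request(jpeg_bytes: bytes, depth_m: float, lat: float, lon: float) -> str:
    """Combine one camera frame, depth metadata, and GPS into a single request body."""
    payload = {
        "prompt": "Identify hazards and the next safe step. Reply in one short command.",
        "image_b64": base64.b64encode(jpeg_bytes).decode("ascii"),  # ultra-wide frame
        "depth_estimate_m": depth_m,          # nearest-obstacle distance metadata
        "gps": {"lat": lat, "lon": lon},      # street-level position for Maps context
    }
    return json.dumps(payload)
```

One multimodal call per frame, rather than separate vision, depth, and location queries, is what makes a millisecond-scale safety loop plausible on mobile.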

Challenges we ran into

The biggest challenge was latency. In navigation, even a one-second delay can be dangerous. We had to optimize our data packets, combining images, depth metadata, and GPS coordinates into a single request, to ensure Gemini 3 could return a safety command in milliseconds. Getting the response time down to something usable took repeated iteration.

Accomplishments that we're proud of

We successfully built a system that provides "inch by inch" precision using only standard mobile hardware. Achieving a functional 120-degree safety perimeter with the 0.5x lens was a major milestone. We are also proud of the silent-first architecture, which respects the user's focus and only intervenes when safety is at risk, making it a truly practical daily tool rather than just a technical demo.

What we learned

This project taught us that for accessibility tech, less is more. We learned that a blind user doesn't need a descriptive AI; they need a protective one. Designing for someone with zero vision forced us to rethink AI interaction, prioritizing tactile feedback and urgent, short commands over long, friendly sentences. It reinforced the importance of sensory substitution as a design philosophy.

What's next for Glial

The next step is to expand the "Tactile Echo" into a full 360-degree auditory map using spatial audio, allowing users to "hear" the distance to objects in 3D. We are also exploring "Contextual Memory," where Glial can learn and remember the specific layouts of a user’s home or office to provide even more effortless navigation.
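A toy model of the 360-degree auditory map: map an object's bearing (0 degrees is straight ahead, positive is to the right) and distance to a stereo pan and a loudness gain. This is a hypothetical sketch under simple assumptions; real spatial audio would use HRTF rendering rather than plain panning.

```python
import math

def spatial_cue(bearing_deg: float, distance_m: float, max_range_m: float = 10.0):
    """Return (pan, gain): pan in [-1, 1] from bearing, gain falling off with distance."""
    pan = math.sin(math.radians(bearing_deg))        # left (-1) .. right (+1)
    gain = max(0.0, 1.0 - distance_m / max_range_m)  # closer objects are louder
    return round(pan, 3), round(gain, 3)
```

Feeding each detected object through a mapping like this would let the user "hear" both where something is and roughly how far away it sits.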
