Inspiration

While working at the UCF ISUE Lab, I realized that traditional assistive devices for the visually impaired often lack spatial context. A cane can tell you there is a wall, but it can't tell you the texture of the ground or provide analytical data for behavioral research. Aegis was born to bridge the gap between reactive hardware and cloud-based spatial intelligence.

What it does

Aegis is a multimodal haptic shield that mounts to a standard baseball cap. It uses:

- Ultrasonic sensors: distance and relative-velocity detection.
- MPU-6050 gyroscope: tri-zone pitch detection (ground scan vs. overhead).
- Spectral sensors: surface identification (void detection/staircases).
- Haptic tapper: a servo-driven feedback system that speaks a "Pattern Language", using different knocks for different threats.
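To make the "Pattern Language" idea concrete, here is a minimal sketch of how threat classes could map to distinct knock patterns. The threat names, timings, and `pattern_duration` helper are illustrative assumptions, not the actual Aegis firmware values.

```python
# Hypothetical "Pattern Language" table: each threat class maps to a list
# of (tap_count, gap_seconds) pairs played by the servo tapper.
# All names and timings here are assumed for illustration.
PATTERNS = {
    "wall_ahead": [(2, 0.15)],           # two quick taps
    "overhead":   [(1, 0.0), (1, 0.4)],  # tap, pause, tap
    "drop_off":   [(3, 0.1)],            # three rapid taps (stairs/void)
}

def pattern_duration(pattern, tap_time=0.05):
    """Total time a pattern occupies, so consecutive alerts never overlap.

    tap_time is the assumed duration of one servo knock.
    """
    return sum(count * tap_time + count * gap for count, gap in pattern)
```

Keeping patterns as data rather than hard-coded servo calls makes it easy to tune or add threat classes without touching the playback loop.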

How we built it

We architected a distributed IoT system:

- Edge: an ESP32-S3 running an Exponential Moving Average (EMA) filter to stabilize raw sensor data.
- Middleware: a FastAPI Python bridge that ingests telemetry and calculates threat levels in real time.
- Cloud: a MongoDB Atlas cluster for long-term telemetry storage and spatial analytics.
- Hardware: custom 3D-printed brim mounts and a "Breadboard Backpack" for modular iteration.
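The edge-side EMA filter is the standard recurrence y[n] = α·x[n] + (1−α)·y[n−1]. The real filter runs in ESP32 firmware; this Python sketch shows the same logic, with an assumed α of 0.3:

```python
class EMAFilter:
    """Exponential moving average: y[n] = alpha*x[n] + (1-alpha)*y[n-1].

    Needs only one stored value, making it a good fit for the ESP32's
    limited RAM compared with a windowed moving average. The default
    alpha is an assumed tuning value, not the deployed one.
    """

    def __init__(self, alpha=0.3):
        self.alpha = alpha
        self.value = None  # no history yet

    def update(self, sample):
        if self.value is None:
            self.value = sample  # seed with the first reading
        else:
            self.value = self.alpha * sample + (1 - self.alpha) * self.value
        return self.value
```

A smaller α smooths harder but reacts more slowly, so it trades noise rejection against responsiveness to real obstacles.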

Challenges we ran into

Integrating multiple I²C and spectral sensors on a mobile platform introduced significant signal noise. We solved this by implementing digital signal processing (DSP) directly on the ESP32. Managing real-time data ingestion over unstable venue Wi-Fi also required a robust microservice bridge to guarantee data integrity between the hat and MongoDB.
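One way to keep data intact over flaky Wi-Fi is a store-and-forward buffer: readings queue locally and drain in order once the uplink returns. This is a simplified sketch of that idea; `send` stands in for the real HTTP POST to the FastAPI bridge, and the buffer size is an assumption.

```python
from collections import deque

class TelemetryBuffer:
    """Store-and-forward queue for telemetry over an unreliable uplink.

    Readings accumulate locally while the link is down and drain in
    FIFO order when it recovers. maxlen bounds memory: if offline too
    long, the oldest readings are dropped first.
    """

    def __init__(self, maxlen=256):
        self.queue = deque(maxlen=maxlen)

    def push(self, reading):
        self.queue.append(reading)

    def flush(self, send):
        """Drain the queue with `send(reading) -> bool` until it fails.

        A reading is removed only after a successful send, so a failed
        attempt leaves it queued for the next flush.
        """
        sent = 0
        while self.queue:
            if not send(self.queue[0]):  # uplink still down: retry later
                break
            self.queue.popleft()
            sent += 1
        return sent
```

Removing a reading only after `send` succeeds is what preserves integrity: a mid-flush disconnect loses nothing that hasn't already reached the bridge.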

Accomplishments that we're proud of

We successfully implemented relative-velocity logic: the system doesn't just buzz when a wall is present; it alerts only when you are actively walking toward it, reducing alert fatigue for the user.
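The core of that logic can be sketched from two successive ultrasonic readings: the obstacle's closing speed is the drop in distance over time, and an alert fires only when the obstacle is both in range and approaching. The thresholds below are illustrative, not the tuned Aegis values.

```python
def closing_speed(prev_dist_cm, curr_dist_cm, dt_s):
    """Relative velocity toward the obstacle, in cm/s.

    Positive when the gap is shrinking (user approaching the obstacle).
    """
    return (prev_dist_cm - curr_dist_cm) / dt_s

def should_alert(prev_dist_cm, curr_dist_cm, dt_s,
                 max_dist_cm=150.0, min_closing_cm_s=10.0):
    """Fire only for obstacles that are in range AND being approached.

    A stationary user next to a wall produces zero closing speed and
    stays silent, which is what cuts alert fatigue. Thresholds are
    assumed example values.
    """
    return (curr_dist_cm <= max_dist_cm and
            closing_speed(prev_dist_cm, curr_dist_cm, dt_s) >= min_closing_cm_s)
```

Standing still beside a wall (no change in distance) never alerts, while walking toward the same wall does.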

What we learned

We gained deep experience in edge-to-cloud pipelining and the importance of signal integrity in wearable robotics. We also learned how to manage asynchronous data flows in Python to keep the user experience lag-free.
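The asynchronous pattern we relied on can be sketched with a bounded `asyncio.Queue` decoupling ingestion from processing, so slow work (threat scoring, database writes) never blocks incoming readings. This is a minimal stand-alone illustration, not the bridge's actual code.

```python
import asyncio

async def producer(queue):
    # Stand-in for readings arriving from the hat over Wi-Fi.
    for reading in [42.0, 41.5, 40.9]:
        await queue.put(reading)
    await queue.put(None)  # sentinel: stream finished

async def consumer(queue, out):
    # Slow work (threat scoring, DB writes) happens here without
    # ever blocking the producer's ingestion loop.
    while (reading := await queue.get()) is not None:
        out.append(reading)

async def main():
    # Bounded queue: a stalled consumer applies back-pressure
    # instead of letting latency and memory grow unbounded.
    queue = asyncio.Queue(maxsize=64)
    out = []
    await asyncio.gather(producer(queue), consumer(queue, out))
    return out
```

Bounding the queue is the key design choice: under load the producer waits briefly rather than letting a backlog turn into perceived lag for the user.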

What's next for Aegis

We plan to implement anomaly-detection ML models to identify "near-miss falls" and to integrate GPS coordinates. This would allow Aegis to create a "Hazard Heatmap" of the UCF campus, helping university planners identify areas that need better accessibility infrastructure.
