Inspiration

Last year our friend got into a car accident and lost his vision. As we helped him integrate back into daily life, we saw first-hand how existing visual-aid devices fell short. We also heard the same frustrations from others with visual impairment: tools like canes and service dogs demand significant conscious effort and still miss key capabilities like spatial and context awareness. We dug into the literature and found compelling neuroscience showing that tactile perception can be trained to compensate for visual deficits. That insight inspired our hackathon project: build a tactile auxiliary device aimed at closing the spatial/context awareness gap.

What it does

Argus Touch acts like a tactile compass for nearby people:

  • Bearing (servo columns → angle): For each detected object we compute polar coordinates (bearing, distance). The field of view is split into angular “slices,” and the servo column for that slice presses, so detected objects are simply felt from the corresponding direction: “left,” “front-left,” “front-right,” “right,” and so on.
  • Class (servo rows → object type): The detector labels each object in real time, and we map that label to a servo row. Distinct row activations and patterns represent different classes, and with training the wearer can become fluent at identifying object types by touch.
  • Distance (servo pressure → proximity): Pressure scales with nearness: the servo presses more firmly as an object approaches, giving an immediate, intuitive near/far signal.

This creates a low-load “sixth sense” for orientation while keeping hands and ears free.
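The bearing-to-column mapping above can be sketched in a few lines. This is an illustrative sketch, not our exact shipped code: the field-of-view and slice boundaries here are placeholder assumptions.

```python
# Hypothetical sketch: map a detection's bearing to one of four servo
# columns ("left", "front-left", "front-right", "right").
# FOV_DEG and NUM_COLUMNS are assumed values, not the tuned ones.

FOV_DEG = 90.0
NUM_COLUMNS = 4

def bearing_to_column(bearing_deg: float) -> int:
    """Map a bearing in [-FOV/2, +FOV/2] degrees to a column index 0..3."""
    half = FOV_DEG / 2
    clamped = max(-half, min(half, bearing_deg))
    # Shift into [0, FOV) and divide into equal angular slices.
    slice_width = FOV_DEG / NUM_COLUMNS
    col = int((clamped + half) // slice_width)
    return min(col, NUM_COLUMNS - 1)
```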

How it works

Perception. A binocular depth camera provides RGB, IR, and full XYZ point-cloud data. A YOLO-11 detector identifies people in real time and estimates bearing and distance via depth/angle binning.
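The bearing/distance estimation can be sketched with a pinhole-camera model: bearing from the detection's pixel offset and the camera's horizontal FOV, distance as a robust statistic over the depth values inside the bounding box. The FOV and image width below are placeholder assumptions, not the camera's actual specs.

```python
import math

# Sketch of bearing/distance estimation from a depth camera, assuming a
# pinhole model; HFOV_DEG and IMG_W are illustrative placeholders.

HFOV_DEG = 87.0   # assumed horizontal field of view
IMG_W = 640       # assumed image width in pixels

def bearing_deg(cx_px: float) -> float:
    """Horizontal bearing (degrees) of a detection centered at pixel cx_px."""
    # Focal length in pixels from the FOV, then atan of the pixel offset.
    fx = (IMG_W / 2) / math.tan(math.radians(HFOV_DEG / 2))
    return math.degrees(math.atan((cx_px - IMG_W / 2) / fx))

def distance_m(depth_patch):
    """Robust distance: median of valid depth values inside the bbox (m)."""
    vals = sorted(d for d in depth_patch if d > 0)  # drop invalid zeros
    return vals[len(vals) // 2] if vals else float("inf")
```

Using the median rather than the center pixel keeps the estimate stable when the depth map has holes or the box overlaps background.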

Mapping. Angular bins map to 16 actuators around the waist (physically a 4×4 arrangement; columns correspond to view sectors). Distance converts to pressure using a smooth curve with per-channel caps and rate limits for comfort.
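The distance-to-pressure shaping might look like the following: a smoothstep curve for the near/far mapping, a per-channel cap, and a slew-rate limit so pressure never changes abruptly. All constants here are illustrative, not our tuned values.

```python
# Hedged sketch of distance -> pressure shaping with a per-channel cap
# and a slew-rate limit for comfort; constants are assumptions.

MAX_PRESSURE = 1.0      # per-channel cap (normalized)
MAX_STEP = 0.1          # max change per update tick
NEAR_M, FAR_M = 0.5, 4.0

def target_pressure(distance_m: float) -> float:
    """Smoothly map distance to [0, MAX_PRESSURE]: closer -> firmer."""
    # Normalize distance into [0, 1], invert, then smoothstep.
    t = min(max((distance_m - NEAR_M) / (FAR_M - NEAR_M), 0.0), 1.0)
    x = 1.0 - t
    return MAX_PRESSURE * (x * x * (3 - 2 * x))

def rate_limited(prev: float, target: float) -> float:
    """Move toward target by at most MAX_STEP per tick."""
    delta = max(-MAX_STEP, min(MAX_STEP, target - prev))
    return prev + delta
```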

Actuation. A Raspberry Pi 5 drives actuators via a PCA9685 PWM controller. For the hackathon, inference runs on a laptop to maintain ~30 FPS end-to-end responsiveness.
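The servo-drive math reduces to converting an angle into the PCA9685's 12-bit on-count at 50 Hz. The 0.5–2.5 ms pulse range below is a common SG90 assumption; the real endpoints should be calibrated per servo.

```python
# Illustrative pulse-width math for SG90 servos on a PCA9685 at 50 Hz.
# Pulse range 0.5-2.5 ms is an assumed SG90 default, not a calibration.

PWM_FREQ_HZ = 50
MIN_PULSE_MS, MAX_PULSE_MS = 0.5, 2.5

def angle_to_counts(angle_deg: float) -> int:
    """12-bit PCA9685 on-count (0..4095) for a servo angle in [0, 180]."""
    angle = max(0.0, min(180.0, angle_deg))
    pulse_ms = MIN_PULSE_MS + (MAX_PULSE_MS - MIN_PULSE_MS) * angle / 180.0
    period_ms = 1000.0 / PWM_FREQ_HZ   # 20 ms at 50 Hz
    return round(pulse_ms / period_ms * 4096)
```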

Live HUD. The demo overlay shows detections, depth, per-servo activation, and timing (median/p95) to make video capture→inference→actuation latency visible.
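The HUD's timing readout boils down to a rolling window of per-frame latencies with median/p95 summaries. A minimal sketch (window size is an assumption):

```python
from collections import deque
import statistics

# Minimal sketch of the HUD timing readout: rolling window of per-frame
# latencies, reported as median / p95. Window size is an assumption.

class LatencyStats:
    def __init__(self, window: int = 300):
        self.samples = deque(maxlen=window)

    def record(self, latency_ms: float) -> None:
        self.samples.append(latency_ms)

    def summary(self):
        """Return (median, p95) over the current window, in ms."""
        ordered = sorted(self.samples)
        median = statistics.median(ordered)
        p95 = ordered[min(len(ordered) - 1, int(0.95 * len(ordered)))]
        return median, p95
```

Reporting p95 alongside the median is what makes occasional capture→inference→actuation stalls visible instead of averaged away.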

Challenges

  • PCA9685 holding-current bug: After “stopping” commands, servos still drew high current (holding torque) because the PCA9685 outputs weren’t being released. Fix: add a release() that writes neutral pulses, disables all channels, clears/flushes registers, and sets SLEEP/OE—dropping current and preventing gear strain.
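The core of that fix can be sketched against a generic smbus2-style I2C bus object. Register addresses follow the PCA9685 datasheet (ALL_LED_OFF_H full-off bit, MODE1 SLEEP bit); the sequencing mirrors the fix described above but is not our exact production code.

```python
# Hedged sketch of release(): drop all PWM outputs and sleep the chip so
# servos stop drawing holding current. Works against any bus object with
# write_byte_data(addr, reg, value) (smbus2-style). Registers per the
# PCA9685 datasheet; this is an illustration, not our production code.

PCA9685_ADDR = 0x40     # default I2C address
MODE1 = 0x00            # mode register 1
ALL_LED_OFF_H = 0xFD    # high byte of the all-channel OFF register
FULL_OFF = 0x10         # bit 4: force every output fully off
SLEEP = 0x10            # bit 4 of MODE1: oscillator off (low power)

def release(bus, addr: int = PCA9685_ADDR) -> None:
    """Disable every PWM channel, then put the PCA9685 to sleep."""
    bus.write_byte_data(addr, ALL_LED_OFF_H, FULL_OFF)  # all outputs off
    bus.write_byte_data(addr, MODE1, SLEEP)             # stop oscillator
```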
  • Time constraints: We spent much of our time debugging the device, so we couldn't integrate all of the sponsor technologies we had hoped to. In the end we focused on one sponsor tool and made sure the device functioned as intended.

What’s next

Our end goal is haptic spatial and context awareness that extends beyond a single class.

Short-term (engineering):

  • Deploy smaller/quantized detectors (e.g., YOLO-N/-S, INT8) for true on-body inference.
  • Explore more actuation points and advanced vibrotactile hardware to expand the tactile vocabulary while keeping load low.
  • Refine industrial design: slimmer, lighter belt with robust mounting and cabling.

Medium-term (capabilities):

  • Expand to additional object classes (doorways, obstacles, furniture) and basic contextual cues (e.g., moving vs. stationary).
  • Add personalization (per-user sensitivity curves and comfort calibration).
  • Optional on-demand audio summaries that don’t compete with everyday listening.

Long-term (safety & context, research):

  • Investigate fall/abnormal-event detection and assistive alerts as separate, opt-in modules.
  • Migrate to on-body compute (edge SoC) for fully mobile use.

Impact

Millions live with visual impairment in the U.S. alone. Argus Touch aims to complement canes and guide dogs with a quiet, always-available sense of “where” and “how near,” pushing toward tactile context that’s trainable, learnable, and low effort. With more time and resources, we’ll broaden the object vocabulary and refine the hardware until this tactile channel becomes a dependable everyday companion.

For the curious: Why “Argus Touch”?

In Greek myth, Argus Panoptes—“the all-seeing”—was a guardian with a hundred eyes who kept constant watch for Hera (famously over Io). After Hermes lulled and slew Argus, Hera placed his eyes on the peacock’s tail. Our belt isn’t literal sight, but the metaphor fits: many tiny “eyes” become many gentle touches, turning distributed awareness into a quiet sixth sense.

Built With

  • adafruit
  • livekit
  • onnx-runtime
  • orbbec
  • pca9685-pwm
  • point-cloud
  • python
  • raspberry-pi-5
  • redbull
  • sg90
  • yolo