Inspiration
The inspiration for Steggy stems from the realization that traditional smart home interfaces—screens, voice assistants, and complex apps—are often inaccessible or overwhelming for children with cognitive disabilities or children who are non-verbal. We wanted to create a companion that bridges the gap between the digital home and the physical world through comfort and intuitive interaction. By turning a familiar plushie into a "magic wand" and a guardian, we aim to give children agency over their environment without the need for spoken words.
What it does
Steggy is an assistive smart home ecosystem centered around a tactile plushie.
- Gesture Control: Using motion sensing, a child can point or gesture with the plushie toward smart devices (like lamps or fans) to toggle them, making home automation a physical, play-based experience.
- Smart Sleep Integration: Steggy monitors sleep patterns through embedded sensors. It can gently transition a child into their morning routine by slowly opening curtains or adjusting lights based on their natural waking state.
- Non-Verbal Communication: It provides a low-friction way for children to communicate needs to the home environment, reducing frustration and promoting independence.
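To give a flavor of how pointing gestures can become device commands, here is a minimal host-testable sketch. The axis thresholds, command names, and `classifyGesture` function are illustrative assumptions, not our exact firmware:

```cpp
#include <cmath>

// Illustrative gesture classifier: maps raw accelerometer readings
// (in units of g) to a simple home command. All thresholds below are
// placeholder values, not the tuned ones from the real plushie.
enum class Command { None, ToggleLamp, ToggleFan };

Command classifyGesture(float ax, float ay, float az) {
    // A deliberate point forward tips the plushie nose-down:
    // gravity shifts from the z-axis onto the x-axis.
    if (ax > 0.7f && std::fabs(az) < 0.5f) return Command::ToggleLamp;
    // A sideways wave shifts gravity onto the y-axis instead.
    if (std::fabs(ay) > 0.7f && std::fabs(az) < 0.5f) return Command::ToggleFan;
    // Resting upright reads roughly 1 g on z: no command.
    return Command::None;
}
```

On the real device this sits behind debouncing and a "hold the pose" timer so incidental play doesn't flicker the lights.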
How we built it
- ESP32s (thank you espressif)
- sensors (thank you analog devices)
- male-to-male and female-to-female jumper cables
- a lot of servos
- a trip to Micro Center because the hardware booth didn't have our stuff
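Most of what those servos do boils down to one piece of arithmetic: mapping an angle to a PWM pulse width. A minimal sketch, assuming the common 500–2500 µs hobby-servo range (real servos vary, so these endpoints need per-device calibration):

```cpp
// Map a servo angle (0-180 degrees) to a pulse width in microseconds.
// The 500-2500 us endpoints are a typical hobby-servo assumption,
// not measured values for our specific servos.
int angleToPulseUs(int angleDeg) {
    // Clamp to the servo's mechanical range.
    if (angleDeg < 0) angleDeg = 0;
    if (angleDeg > 180) angleDeg = 180;
    // Linear interpolation across the 2000 us span.
    return 500 + (angleDeg * 2000) / 180;
}
```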
Challenges we ran into
We went in with no idea what we were doing, and figured it out as we built.
Accomplishments that we're proud of
We are incredibly proud of creating a functional bridge between high-tech IoT and high-touch comfort. Successfully mapping complex gesture data to simple home commands felt like "magic" during testing. Moreover, designing a system that prioritizes the specific sensory needs of non-verbal children—avoiding loud noises or jarring lights—was a major milestone in our inclusive design process.
What we learned
This project taught us the importance of edge computing; processing motion data locally on the ESP32 is vital for the real-time responsiveness these children require. We also learned that "smart" doesn't have to mean "complex." By stripping away the UI and focusing on tactile feedback and movement, we discovered that accessibility often leads to more intuitive design for everyone.
What's next for Steggy
Next up is training Steggy to recognize more specific, personalized movements—basically a customized "language" of gestures for each kid. We’re also looking at adding some haptic feedback (gentle purring or heartbeats) to help with grounding during meltdowns. Steggy is just getting started.
Built With
- analogsensors
- esp32

