Inspiration

We were inspired by a universal struggle faced by new caregivers: the anxiety and helplessness that come when a newborn cries and you don’t know why. A baby’s cry is their only form of communication, yet to an exhausted caregiver it often feels like a single, stressful alarm.

We wanted to bridge this communication gap by translating cries into something easier to understand. Our goal was to transform this panic-inducing sound into an intuitive, multi-sensory language that caregivers can quickly interpret, turning anxiety into confidence.

What it does

Little Voice is a conceptual system that interprets a baby’s cry and translates it into clear, intuitive signals.

Baby cries contain complex acoustic patterns that can indicate needs such as hunger, discomfort, or fatigue. The system analyzes these patterns and converts them into simple outputs—colors that the eye can read at a glance and gentle vibrations that the wrist can feel directly through a wearable device.

By combining cry analysis with contextual information such as feeding or sleeping records, the system provides caregivers with clearer clues about what the baby might need.
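The mapping described above can be sketched in code. This is a purely illustrative assumption of how detected needs might translate into color and vibration signals, plus one contextual rule; the need labels, hex colors, and vibration rhythms are hypothetical, not the project's actual specification.

```python
# Hypothetical mapping from a classified need to a color (app UI) and
# a vibration rhythm (on/off durations in ms, for the wearable).
SIGNALS = {
    "hunger":     {"color": "#F6A14B", "vibration": [200, 100, 200]},
    "discomfort": {"color": "#7FB3D5", "vibration": [400, 200]},
    "fatigue":    {"color": "#B39DDB", "vibration": [600]},
}

def interpret(need: str, context: dict) -> dict:
    """Combine a classified need with simple contextual records."""
    signal = dict(SIGNALS.get(need, {"color": "#CCCCCC", "vibration": [300]}))
    # Example contextual rule: a recent feeding makes hunger less likely.
    if need == "hunger" and context.get("minutes_since_feeding", 999) < 60:
        signal["note"] = "fed recently; consider other causes"
    return signal
```

Keeping the mapping in one small table like this would make it easy to tune the palette and rhythms during user testing without touching the analysis code.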

How we built it

This project was developed as a conceptual product prototype designed in Figma.

We created a visual and interaction system to simulate how the product might work across different interfaces, including wearable feedback and mobile UI. The design emphasizes a “calm technology” approach, using soft gradients, minimal interfaces, and clear visual signals to communicate information without overwhelming the user.

Photography, diagrams, and UI components were carefully arranged to create a consistent visual language. The typography pairs Inria Serif for headings with Inria Sans for interface information, balancing warmth with clarity.

Challenges

One challenge was representing a physical, multi-sensory experience—especially tactile vibration feedback—within a visual prototype. Since the concept relies on haptic signals from wearable devices, we needed to communicate these sensations through visual cues such as animated rings, pulse patterns, and rhythm diagrams.

Another challenge was finding the right emotional tone. Early explorations using black-and-white photos of crying babies felt too distressing. We refined the visual direction with softer lighting, warmer colors, and calmer imagery to ensure the product felt supportive rather than alarming.

Accomplishments

One outcome we are proud of is the cohesive visual system that unifies photography, diagrams, and interface elements into a calm and consistent experience.

We also developed minimal data visualizations that simplify complex cry-analysis concepts into clear visual signals and simple icon systems, making the idea easy for caregivers to understand at a glance.

What we learned

This project reinforced how important empathy is when designing for highly stressed users. Instead of adding more alerts or complex dashboards, technology should communicate information in subtle and intuitive ways—through gentle visual signals or ambient feedback.

We also learned how small visual decisions—such as lighting, color warmth, or typography—can strongly influence how trustworthy and comforting a product feels.

What’s next

The next step would be developing the concept into a functional prototype and exploring real acoustic datasets of infant cries to test the idea.
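As a first step toward that functional prototype, a minimal sketch of frame-level acoustic analysis might look like the following. Short-time energy and zero-crossing rate are standard starting features; real cry analysis would use richer features (e.g. MFCCs) and labeled infant-cry datasets, and the synthetic tone below merely stands in for a recording.

```python
import math

def frame_features(samples, frame_size=256):
    """Return (energy, zero_crossing_rate) for each frame of a waveform."""
    features = []
    for start in range(0, len(samples) - frame_size + 1, frame_size):
        frame = samples[start:start + frame_size]
        # Mean squared amplitude: loud cry segments have high energy.
        energy = sum(s * s for s in frame) / frame_size
        # Fraction of sign changes: a rough proxy for pitch/noisiness.
        crossings = sum(1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0))
        features.append((energy, crossings / (frame_size - 1)))
    return features

# Synthetic 440 Hz tone sampled at 8 kHz, standing in for a cry recording.
tone = [math.sin(2 * math.pi * 440 * n / 8000) for n in range(2048)]
feats = frame_features(tone)
```

Features like these, computed on-device, could feed a small classifier whose output drives the color and vibration signals.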

Future iterations could include wearable applications and more refined vibration patterns, allowing caregivers to recognize different baby needs simply through the rhythm of the vibration on their wrist.

Built With

  • figma