Inspiration: Many people experience sensory overload in environments that are loud, crowded, or visually intense. For individuals with sensory sensitivities, stress, or neurodivergent sensory processing, these situations can escalate quickly without much warning. The idea for Innercast came from thinking about how our bodies constantly give subtle signals before we become overwhelmed, but we rarely notice them in time.

Weather forecasts provide a helpful metaphor for understanding complex systems. Instead of showing raw atmospheric data, weather apps translate that information into simple forecasts like rain or storms. This project explores the idea that our sensory experiences could be understood in a similar way. Innercast imagines a future tool that turns invisible sensory signals into a clear “inner weather forecast,” helping people anticipate and prevent sensory overload before it happens.

What it does: Innercast is a speculative design tool that tracks and interprets sensory signals from both the user’s body and the surrounding environment. The system collects signals such as noise levels, light intensity, crowd density, heart rate variability, breathing patterns, and stress responses. These signals are combined with community data to detect patterns in sensory stimulation.

Instead of presenting raw data, the system translates these inputs into a sensory forecast that predicts how a user’s sensory load may change over time. Users can see current environmental factors, forecast timelines for potential overload, nearby calm environments, and recovery tools to help regulate their sensory state. The goal of Innercast is to help users understand their sensory environment and make proactive decisions that support their well-being.

How we built it: We approached this project using a product design framework that started by identifying the core problem: sensory overload often occurs before people realize it, even though the body and environment are already providing signals. Our ideation process focused on how these invisible signals could be translated into a simple, actionable interface.

We began by mapping the types of signals that current or near-future technologies could realistically detect. These included environmental inputs such as noise levels (microphone), light intensity (ambient light sensors), and crowd density (Bluetooth device detection), as well as physiological signals like heart rate variability, breathing patterns, and stress responses captured through wearable devices such as smartwatches.
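The signal mapping above can be sketched as a small catalogue pairing each input with the device it would plausibly come from. The names, units, and ranges here are illustrative assumptions for the speculative concept, not measured specifications.

```python
# Illustrative catalogue of the signals mapped during ideation.
# Each entry pairs an assumed signal name with the sensing device
# Innercast would plausibly read it from, plus a rough typical range.
SIGNAL_SOURCES = {
    "noise_db":        {"device": "smartphone microphone",      "typical_range": (30, 100)},
    "light_lux":       {"device": "ambient light sensor",       "typical_range": (0, 20000)},
    "crowd_density":   {"device": "Bluetooth device detection", "typical_range": (0, 1)},
    "hrv_ms":          {"device": "smartwatch",                 "typical_range": (20, 120)},
    "breaths_per_min": {"device": "smartwatch",                 "typical_range": (8, 30)},
}
```

A catalogue like this also makes it easy to check, for any new wearable or phone sensor, whether it fits one of the existing streams or opens a new one.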

From there, we structured the product around a layered sensing framework consisting of three main data streams: body signals, environmental sensors, and community data. This architecture allowed us to think of the system not just as a tracker but as a predictive model that interprets patterns across multiple inputs.
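A minimal sketch of how the three layers could be fused into a single sensory load score follows. The field names, normalization ranges, and weights are all assumptions chosen for illustration; a real model would learn these from individual patterns rather than hard-code them.

```python
from dataclasses import dataclass

@dataclass
class BodySignals:
    hrv_ms: float           # heart rate variability (higher = calmer)
    breaths_per_min: float

@dataclass
class EnvironmentSignals:
    noise_db: float         # microphone level
    light_lux: float        # ambient light sensor
    crowd_density: float    # 0..1, e.g. derived from Bluetooth device counts

@dataclass
class CommunitySignals:
    reported_overload: float  # 0..1, share of nearby users reporting strain

def clamp01(x: float) -> float:
    return max(0.0, min(1.0, x))

def sensory_load(body: BodySignals, env: EnvironmentSignals,
                 community: CommunitySignals) -> float:
    """Combine the three sensing layers into one 0..1 load score."""
    body_strain = (clamp01((60 - body.hrv_ms) / 60) * 0.5
                   + clamp01((body.breaths_per_min - 12) / 12) * 0.5)
    env_strain = (clamp01((env.noise_db - 40) / 50) * 0.4
                  + clamp01(env.light_lux / 10000) * 0.2
                  + env.crowd_density * 0.4)
    # Weight body signals highest; community data only nudges the estimate.
    return clamp01(0.5 * body_strain + 0.35 * env_strain
                   + 0.15 * community.reported_overload)
```

A quiet environment with relaxed physiology scores near zero, while a loud, crowded space with elevated stress markers pushes the score toward one.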

During the interface design phase, we translated these complex data streams into a weather-based sensory forecasting system, which simplifies interpretation for users. Instead of presenting raw metrics, the system communicates sensory conditions through clear states such as clear, cloudy, rain, or storm. We then expanded the product ecosystem by designing features that support decision-making and behavioral change, including sensory radar for nearby environments, forecast timelines that predict overload risk, recovery tools that guide regulation, and personalized sensitivity profiles that adapt to individual users.
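The translation from a numeric load score to the four weather states, and the forecast timeline's overload prediction, could be sketched as follows. The thresholds and the linear trend extrapolation are illustrative assumptions, not the product's actual model.

```python
def forecast_state(load: float) -> str:
    """Map a 0..1 sensory load score to a weather-style state."""
    if load < 0.25:
        return "clear"
    if load < 0.5:
        return "cloudy"
    if load < 0.75:
        return "rain"
    return "storm"

def minutes_to_storm(samples, interval_min=5.0, storm_threshold=0.75):
    """Naively extrapolate the recent load trend to estimate minutes
    until the 'storm' threshold is crossed (None = no storm predicted)."""
    if samples and samples[-1] >= storm_threshold:
        return 0.0
    if len(samples) < 2:
        return None
    slope = (samples[-1] - samples[0]) / ((len(samples) - 1) * interval_min)
    if slope <= 0:
        return None  # load is steady or falling
    return (storm_threshold - samples[-1]) / slope
```

For example, load samples of 0.3, 0.4, 0.5 taken five minutes apart extrapolate to reaching the storm threshold in 12.5 minutes, which the timeline view could surface as an early warning.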

This framework allowed us to move from raw sensor inputs to a cohesive product experience that helps users anticipate and manage sensory overload in everyday environments.

Challenges we ran into: One of the main challenges was designing a system that communicates complex sensory information without overwhelming the user. Because the product addresses sensory overload, the interface itself needed to remain calm, simple, and easy to interpret. This required translating multiple streams of sensory data into clear visual cues and short insights rather than presenting raw metrics. A significant amount of time during the ideation phase was spent refining the visual language and deciding how to represent sensory conditions in a way that users could understand instantly.

Another challenge was designing a sensing framework that felt both realistic and forward-looking. Since Innercast is a speculative concept, we needed to carefully consider how the system would gather and interpret signals. This involved mapping potential inputs from wearable devices, smartphone sensors, and community-generated data, and thinking through how these sources could realistically work together. Integrating physiological signals, environmental measurements, and shared sensory information into one cohesive system required several iterations to ensure the concept felt believable and grounded in existing technologies.

Accomplishments that we're proud of: One accomplishment we are proud of is developing a clear metaphor that translates complex sensory data into something intuitive. The weather-based sensory forecast allows users to quickly understand their sensory conditions without needing to interpret raw sensor readings.

Another achievement was designing a system that combines multiple sensing layers, including wearable data, smartphone sensors, and anonymous community signals. This layered approach helps make the concept feel more realistic and shows how different types of information can work together to support user well-being.

What we learned: Through this project we learned how design can transform invisible biological signals into meaningful insights for users. We also learned how powerful metaphors can be in helping people understand complicated systems. The weather model made it possible to simplify sensory information while still preserving its meaning.

The project also reinforced the importance of human-centered design. Features such as recovery tools, support contacts, and calm environment suggestions were added to ensure that the system focuses not only on detecting problems but also on helping users respond to them.

What's next for Innercast: Future versions of Innercast could expand the sensing system with more advanced wearable technologies that track additional physiological signals related to stress and sensory processing. Machine learning could help the system improve its predictions by learning individual patterns over time.

Another potential direction is integrating Innercast with physical environments, such as smart lighting systems or noise-canceling spaces that automatically adjust when sensory overload risk increases. Community participation could also help map calm zones in cities, allowing users to find sensory-friendly environments nearby.

Ultimately, Innercast explores a future where technology helps people better understand their internal states and environments, empowering them to manage sensory experiences before overload occurs.

Built With

  • Figma: used to design the interaction flows and visual system for the Innercast mobile app
  • Figma Make: used to generate and prototype screens quickly during the design process
  • Figma prototyping: used to simulate user interactions such as navigation and sensory forecast changes