INSPIRATION
Creative professionals experience the world through a heightened sensitivity to visual signals. A color gradient in the sky, typography on a storefront, or the rhythm of ocean waves can all register as moments of inspiration.
In many ways, designers develop an intuitive sense for inspiration, a subtle recognition that something in their environment could become a future design reference.
Yet this sensory experience is fleeting and largely immeasurable.
Our project began with a simple observation: designers constantly perceive inspiring signals in the world around them, but capturing those moments is rarely intentional. We explored this idea through the persona of Chia, a designer whose creative instincts lead her to notice patterns, textures, colors, and motion in everyday environments.
Most designers today rely on photos, screenshots, or bookmarking tools to save inspiration. While helpful, these methods require deliberate action and often fail to preserve the context that made the moment meaningful in the first place.
Over time, inspiration becomes scattered across camera rolls, folders, and saved links, disconnected from the environments where it originally appeared. Because these moments occur spontaneously, they are rarely captured in a structured or organized way.
This led us to our central design question:
Can inspiration become a capturable sense?
Rather than treating inspiration as something designers must remember later, we began exploring whether it could be understood as a sensory signal, one that technology could detect, capture, and organize in real time.
WHAT IT DOES
Forage is a speculative wearable system that gives designers a new sensory capability: the ability to detect, capture, and analyze moments of creative inspiration directly from their physical surroundings.
The wearable device senses environmental design signals such as:
- color relationships
- typography in physical spaces
- visual composition
- motion patterns (such as fabric movement, lighting shifts, or natural rhythms)
Instead of relying on memory alone, designers can instantly capture these moments using subtle gestures on the wearable device.
Captured inspiration is then processed by an AI system that extracts meaningful design elements and converts them into structured creative reference files such as:
- color palettes and gradients
- typography and written copy detected in environments
- motion patterns and natural rhythms
- captured visual references and images
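As a sketch, one of these structured reference files could be represented as a simple record like the following. The field names, coordinates, and palette values are illustrative assumptions, not a finalized schema:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class InspirationCapture:
    """One structured creative reference produced from a captured moment."""
    signal_type: str               # "color" | "typography" | "motion" | "image"
    timestamp: datetime            # when the moment was captured
    location: tuple[float, float]  # (latitude, longitude) of the capture
    payload: dict = field(default_factory=dict)  # extracted design data

# Hypothetical example: a color-palette capture from a sunset gradient
capture = InspirationCapture(
    signal_type="color",
    timestamp=datetime(2024, 11, 2, 18, 40),
    location=(34.0195, -118.4912),
    payload={"palette": ["#FF6E40", "#FFAB91", "#5C6BC0"]},
)
print(capture.signal_type, capture.payload["palette"])
```

Storing captures as typed records like this is what would later make the archive searchable by signal type, time, and place.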
Over time, the system builds a personal record of when and where inspiration occurs, helping designers track the real-world moments that spark their ideas.
Instead of relying on memory or scattered photos, designers can return to these captured signals and directly pull them into their digital workflows as references for their projects.
In this way, Forage transforms everyday perception into a traceable creative process, allowing designers to move inspiration from the physical world into their design deliverables.
At its core, the system introduces a new measurable sense:
creative signal awareness, the ability to notice, track, and return to the moments that shape your ideas.
HOW WE BUILT IT
Our design process followed four main stages, combining research, speculative system design, hardware prototyping, and digital interface development.
Throughout the project we iterated rapidly between concept development, 3D product visualization, and interface prototyping to ensure that the wearable device, AI system, and digital inspiration library functioned as a cohesive experience.
PROBLEM EXPLORATION
We began by studying how designers currently perceive and collect inspiration, and where friction occurs between sensing a moment and actually capturing it.
Through observation and research, we identified a gap between noticing inspiring signals in the environment and successfully recording them.
Most existing tools, such as screenshots, photos, and bookmarking platforms, require deliberate effort that interrupts the moment of perception. They also capture only static images, often losing the environmental context (the light, motion, texture, or atmosphere) that originally triggered the designer’s sense of inspiration.
To better understand this experience, we explored different types of creative professionals who regularly rely on environmental inspiration, including:
- graphic designers
- fashion designers
- product designers
- visual artists
Looking across these disciplines revealed a common pattern. Creative professionals develop a heightened sensitivity to certain visual and environmental cues such as colors, textures, typography, motion, and composition that others might overlook.
Through this research, we reframed inspiration not as a memory or organization problem, but as a sensory phenomenon, a set of environmental signals that designers instinctively perceive and respond to in the world around them.
CONCEPT DEVELOPMENT
From our research we developed the concept of creative sensing, inspired by the quantified-self movement.
Just as fitness trackers measure physical activity and health signals, we began to imagine a system that could detect and record the environmental cues that trigger creative inspiration.
To guide the system design, we modeled inspiration capture conceptually as a combination of four signal types:
- visual signals such as composition, imagery, and scenes
- typographic signals including letterforms and written copy
- motion signals such as rhythm, flow, and movement
- color relationships and gradients
This framework helped us think about inspiration not as a single event, but as a combination of sensory inputs that designers perceive in their environments.
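The framework above can be sketched as a weighted combination of per-channel signal strengths. The channel names follow the four signal types listed earlier; the weights and scores here are hypothetical, per-designer assumptions rather than measured values:

```python
def inspiration_score(signals: dict[str, float],
                      weights: dict[str, float]) -> float:
    """Combine per-channel signal strengths (0..1) into a single score.

    Channels follow the conceptual framework: visual, typographic,
    motion, and color. Weights model a designer's personal sensitivity.
    """
    return sum(weights.get(channel, 0.0) * strength
               for channel, strength in signals.items())

# Hypothetical moment: strong color and motion cues, little typography
signals = {"visual": 0.6, "typographic": 0.1, "motion": 0.8, "color": 0.9}
weights = {"visual": 0.25, "typographic": 0.25, "motion": 0.25, "color": 0.25}
print(round(inspiration_score(signals, weights), 3))  # 0.6
```

A real system would learn the weights per designer; equal weights are used here only to keep the sketch concrete.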
From there, we began mapping how a system might capture and translate these signals into usable creative references.
This led to an architecture composed of three core components:
- a wearable sensing device
- an AI processing system
- a digital inspiration interface
Together, these components allow inspiration to move from real-world sensory perception to structured digital references designers can return to in their creative work.
INTERACTION DESIGN AND PRODUCT VISUALIZATION
To explore how designers might physically interact with a system that senses inspiration, we developed a wearable device concept designed to capture environmental signals instantly without requiring screens or complex interaction.
We designed the physical form of the device using Rhino 3D, allowing us to prototype the scale, ergonomics, and visual identity of the wearable.
Multiple iterations of the 3D model helped us refine:
- how the device sits naturally on the hand or wrist
- the placement of sensors and camera elements for environmental sensing
- gesture accessibility for quick, intuitive capture
- the overall form factor to remain subtle and comfortable during everyday use
Rather than designing the device to resemble traditional technology hardware, we intentionally shaped it to feel more like a fashionable piece of jewelry. Designers often express identity through accessories, so the wearable blends naturally into personal style.
Key interaction elements include:
- gesture-based input
- subtle haptic confirmation when inspiration is captured
- passive environmental sensing
These interactions allow designers to capture inspiration in under a second without interrupting their observation of the environment.
AI PROCESSING SYSTEM & DIGITAL PROTOTYPE
Captured environmental signals are interpreted through a conceptual AI pipeline designed to translate real-world perception into structured design references.
Rather than simply storing images, the system analyzes the visual cues that triggered the designer’s inspiration, such as color relationships, typography, motion, or composition.
The system:
- detects visual patterns in the selected environment
- identifies the type of design signal present
- extracts structured design data
- organizes these elements into a searchable archive of inspiration
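The four stages above can be sketched as a minimal pipeline. Every stage here is a placeholder stub standing in for the conceptual AI components, not an actual model:

```python
archive: list[dict] = []  # stage 4 target: the searchable inspiration archive

def detect_patterns(frame: bytes) -> list[str]:
    """Stage 1: detect visual patterns in the capture (placeholder rule)."""
    return ["gradient", "letterforms"] if frame else []

def classify_signal(patterns: list[str]) -> str:
    """Stage 2: identify which design signal type is present (placeholder)."""
    return "typography" if "letterforms" in patterns else "color"

def extract_data(frame: bytes, signal_type: str) -> dict:
    """Stage 3: extract structured design data for that signal (placeholder)."""
    return {"type": signal_type, "source_bytes": len(frame)}

def process_capture(frame: bytes) -> dict:
    """Run one capture through all four pipeline stages."""
    patterns = detect_patterns(frame)
    record = extract_data(frame, classify_signal(patterns))
    archive.append(record)  # stage 4: organize into the archive
    return record

record = process_capture(b"fake-frame-bytes")
print(record["type"], len(archive))  # typography 1
```

The point of the sketch is the data flow: each capture passes through detection, classification, and extraction before landing in a single queryable archive.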
To explore how designers interact with these signals, we built a fully interactive prototype using Figma.
The prototype can be explored here:
The interface includes three primary surfaces:
- Field, the personal capture library where inspiration is stored
- Boards, collections for organizing signals into project directions
- Explore, external inspiration sources
Gesture-based capture allows designers to translate inspiration into structured signals:
- single tap → image capture (PNG/JPG)
- two-finger tap → color palette extraction (HEX values)
- two-finger drag across text → typography capture with OCR
- tap + hold drag → motion capture stored as MP4 with GIF preview
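The gesture table above can be expressed as a simple dispatch map. The gesture names and output kinds follow the list; the handler bodies are placeholders standing in for the real capture logic:

```python
def capture_image(frame):      return {"kind": "image", "format": "png"}
def capture_palette(frame):    return {"kind": "palette", "hex": ["#1A2B3C"]}
def capture_typography(frame): return {"kind": "typography", "text": "(OCR output)"}
def capture_motion(frame):     return {"kind": "motion", "format": "mp4"}

# Map each wearable gesture to its capture mode
GESTURES = {
    "single_tap":      capture_image,
    "two_finger_tap":  capture_palette,
    "two_finger_drag": capture_typography,
    "tap_hold_drag":   capture_motion,
}

def on_gesture(name: str, frame) -> dict:
    """Route a recognized gesture to the matching capture handler."""
    return GESTURES[name](frame)

print(on_gesture("two_finger_tap", None)["kind"])  # palette
```

A table-driven dispatcher like this keeps new capture modes (vector shapes, textures) a one-line addition.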
As more inspiration is captured, the system begins surfacing patterns across the designer’s archive, revealing which environments or signals most consistently trigger creativity.
Over time, scattered moments of inspiration become a structured record of creative sensing.
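Surfacing those patterns could begin as simple frequency analysis over the archive. The records and field names below are hypothetical stand-ins for captured data:

```python
from collections import Counter

# Hypothetical archive of captured signals with their environments
archive = [
    {"signal": "color",  "place": "beach"},
    {"signal": "color",  "place": "market"},
    {"signal": "motion", "place": "beach"},
    {"signal": "color",  "place": "beach"},
]

# Which signal types and environments recur most across the archive?
top_signal = Counter(r["signal"] for r in archive).most_common(1)[0]
top_place = Counter(r["place"] for r in archive).most_common(1)[0]
print(top_signal, top_place)  # ('color', 3) ('beach', 3)
```

Even this naive counting hints at the kind of insight the system aims for: this designer is most often inspired by color, and most often at the beach.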
CHALLENGES WE RAN INTO
One major challenge was translating a subjective concept like inspiration into something measurable.
Inspiration is emotional and contextual, making it difficult to define purely as data. We addressed this by focusing on environmental signals commonly associated with inspiration.
Another challenge was designing an interaction model that remained nearly invisible to the user.
Because the system is wearable, interactions needed to feel effortless and non-disruptive.
We also had to carefully balance hardware sensing and AI processing, determining which tasks should occur on-device versus in the AI system.
Finally, we prioritized pattern detection and meaningful insights instead of overwhelming users with raw inspiration data.
ACCOMPLISHMENTS WE’RE PROUD OF
One accomplishment we are particularly proud of is defining creative inspiration as a sensory system rather than a documentation problem.
We are also proud of the wearable interaction model, which allows designers to capture inspiration instantly through subtle gestures and haptic feedback.
Another accomplishment is the conceptual AI pipeline that translates environmental signals into structured design references including:
- image captures
- typography references
- color palettes
- motion patterns
Using Figma, we built a fully interactive prototype demonstrating how captured signals move from environmental sensing into organized digital archives.
WHAT WE LEARNED
Through this project we learned that inspiration is deeply tied to sensory awareness of the environment.
Designers rarely search intentionally for inspiration. Instead, it appears through everyday experiences.
We also learned that creative tools are most powerful when they support observation rather than interrupt it.
Speculative design allowed us to rethink creative workflows entirely by imagining new sensory capabilities.
WHAT’S NEXT FOR FORAGE
Forage imagines a future where creative tools extend beyond screens and into the environments designers move through every day.
Future capture modes could include:
- vector shapes
- surface textures
- material patterns
- spatial compositions
Voice interaction could allow designers to trigger captures or annotate inspiration hands-free.
AI could also begin generating variations based on recurring patterns in a designer’s archive.
Over time the system could enable:
- real-time detection of inspiration patterns
- AI-assisted design variations
- collaborative inspiration sharing
- environmental inspiration mapping across cities
- deeper AI understanding of personal creative preferences
Ultimately, Forage explores a future where designers no longer search for inspiration later.
Instead, they sense, capture, interpret, and evolve inspiration as it happens.
In that future, the world itself becomes a living design library.
Built With
- adobe-illustrator
- aftereffects
- chatgpt
- figma
- figmamake
- gemini
- google-docs
- photoshop
- premierepro
- rhino3d