Inspiration

In a future where learning, decision-making, and daily activities are increasingly digital and independent of location, people often feel isolated even while interacting in shared knowledge spaces. We imagined a system that allows users to sense and share emotional experiences during these activities, helping them feel connected to others even when physically alone.

Our inspiration came from the idea that human experiences are more meaningful when shared. Whether reading an article or making a financial decision, people naturally seek reassurance and understanding. Emotional Telepresence envisions a future where technology enables these shared experiences by connecting people through the emotions they feel in the moment.


What it does

Emotional Telepresence enhances digital experiences by allowing users to see and feel how others reacted in the same moment. While reading, a neurochip tracks eye movement, heartbeat, and emotional signals. When a user pauses or shows signs of strong engagement, the system detects the emotion and surfaces shared experiences from other users who felt the same way.

Users can listen to voluntary voice notes from others, explore reflections, and experience subtle sound cues that recreate the emotional atmosphere. The emotional map visualizes reactions, with bubble size representing how many users experienced each emotion.
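As a hedged sketch of how the emotional map's bubble sizing could work (the function name and scaling constant are our own assumptions, not part of the concept spec), each emotion's bubble radius could scale with the square root of its user count, so that bubble *area* tracks how many users felt that emotion:

```python
import math
from collections import Counter

def bubble_sizes(reactions: list[str], max_radius: float = 60.0) -> dict[str, float]:
    """Map each emotion to a bubble radius proportional to sqrt(count).

    sqrt keeps bubble area proportional to user count, so an emotion felt
    by twice as many users reads as twice as large, not four times.
    """
    counts = Counter(reactions)
    peak = max(counts.values())
    return {
        emotion: max_radius * math.sqrt(n / peak)
        for emotion, n in counts.items()
    }
```

For example, `bubble_sizes(["curious", "curious", "confused", "moved", "curious", "confused"])` gives "curious" the largest bubble and "moved" the smallest.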

Beyond learning, this feature applies to other contexts, such as financial decision-making. For example, a user feeling confused about which company to invest in can see that others felt similarly and listen to their reflections, reducing isolation and providing social validation during challenging decisions.


How we built it

We designed the system as a speculative concept combining emotion-sensing technology, human-computer interaction, and immersive digital environments. A neurochip monitors physiological signals—eye movement, heartbeat, and other indicators—detecting moments of heightened engagement.

Conceptually, emotional state is represented as:

    E = f(H, P, S)

where H is heart-rate variation, P is pause duration on content, and S stands for other physiological signals.
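As a minimal prototype of the conceptual model above (the signal names, weights, and threshold are illustrative assumptions, not a real neurochip API or calibrated values), the combined engagement score could look like:

```python
from dataclasses import dataclass

@dataclass
class Signals:
    heart_rate_delta: float   # H: deviation from baseline, in bpm
    pause_seconds: float      # P: dwell time on the current passage
    skin_conductance: float   # S: stand-in for "other physiological signals"

def emotion_score(sig: Signals) -> float:
    """Toy instance of E = f(H, P, S): a weighted sum squashed into [0, 1].

    Weights are placeholders chosen for illustration, not calibrated values.
    """
    raw = (0.5 * abs(sig.heart_rate_delta) / 20
           + 0.3 * min(sig.pause_seconds / 10, 1.0)
           + 0.2 * sig.skin_conductance)
    return min(raw, 1.0)

def is_engaged(sig: Signals, threshold: float = 0.6) -> bool:
    """Heightened engagement when the combined score crosses a threshold."""
    return emotion_score(sig) >= threshold
```

A long pause plus an elevated heart rate would push the score over the threshold and mark the moment as one of heightened engagement.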

The system then retrieves anonymized reflections from other users and presents them via voice notes, emotional sound cues, and a visual emotional map. Users can apply filters for location, time range, and preferred modes to customize the experience.
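A minimal sketch of how those filters could be applied when retrieving anonymized reflections (the `Reflection` fields and filter names are our own illustrative schema, not an implemented one):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Reflection:
    emotion: str            # detected emotion this reflection is tied to
    mode: str               # e.g. "voice", "text", "sound_cue"
    region: str             # coarse location, for the location filter
    created_at: datetime    # for the time-range filter

def filter_reflections(reflections, *, emotion, region=None,
                       since=None, modes=None):
    """Return reflections matching the user's emotion, location,
    time-range, and preferred-mode filters (None means 'any')."""
    result = []
    for r in reflections:
        if r.emotion != emotion:
            continue
        if region is not None and r.region != region:
            continue
        if since is not None and r.created_at < since:
            continue
        if modes is not None and r.mode not in modes:
            continue
        result.append(r)
    return result
```

For instance, `filter_reflections(all_reflections, emotion="confused", modes={"voice"})` would return only voice notes from users who also felt confused.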


Challenges we ran into

Designing a system that enhances connection without being intrusive was a major challenge. We needed to determine when and how to surface shared emotional experiences so that they support rather than interrupt user focus.
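One way to keep surfacing non-intrusive, sketched under our own assumptions (the threshold and cooldown values are placeholders, not tested parameters), is to gate prompts behind both an engagement threshold and a cooldown period:

```python
import time

class SurfacingGate:
    """Rate-limits how often shared experiences are surfaced, so prompts
    support focus instead of interrupting it.

    The default 120 s cooldown is an illustrative assumption.
    """

    def __init__(self, cooldown_seconds: float = 120.0):
        self.cooldown = cooldown_seconds
        self._last_surfaced = float("-inf")

    def should_surface(self, engagement: float, threshold: float = 0.6,
                       now: float = None) -> bool:
        """Surface only on strong engagement, at most once per cooldown."""
        now = time.monotonic() if now is None else now
        if engagement < threshold:
            return False
        if now - self._last_surfaced < self.cooldown:
            return False
        self._last_surfaced = now
        return True
```

With this gate, a burst of strong reactions produces one prompt, and further prompts wait until the cooldown elapses.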

Privacy was another challenge. Tracking emotions can feel invasive, so we relied solely on voluntary voice notes instead of accessing thoughts directly. Visualizing emotional data in a way that is intuitive and meaningful also required multiple iterations to ensure clarity and usability.

We also wanted to implement eye-tracking to determine exactly what the user is reading and trigger responses based on that in real time. However, limitations with the current system and available tools prevented us from fully integrating this functionality, which remains a key opportunity for future development.


Accomplishments that we're proud of

We are proud of creating a concept that transforms solitary activities—like reading or financial decision-making—into shared emotional experiences. The emotional map and voice note integration allow users to feel connected to a community even when physically alone.

We also designed the system with privacy and consent at its core, ensuring that emotional sharing is voluntary and controlled by the user. Finally, we successfully applied the concept beyond reading, showing its potential in areas like investment decisions and other high-stakes scenarios.


What we learned

We learned that emotional awareness can profoundly influence how users engage with content and make decisions. Shared emotional experiences provide validation, reassurance, and a sense of social connection, which is especially important in remote or isolated contexts.

We also discovered the importance of ethical design in emotion-sensing systems, balancing utility with user control and privacy. Speculative design helped us imagine not only what technology could do, but also how it could meaningfully impact human experiences in the future.


What's next

The next steps involve exploring how we can manipulate ambient environmental cues to make the experience more immersive and engaging. This includes experimenting with:

  • Visual cues – subtle lighting changes, color shifts, or depth effects to reflect the emotional state.
  • Spatial and 3D audio – soundscapes that convey the intensity or type of emotion others are feeling.
  • Haptic feedback – vibrations or gentle pressure to mirror emotional intensity or heartbeat patterns.
  • Chronoception cues – manipulating the perception of time to match the pace of others’ emotional experiences.
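The first three cue ideas could be sketched as a single mapping from a detected emotion and its intensity to ambient device settings (the color palette, gain range, and brightness formula below are all illustrative placeholders of our own):

```python
def cue_parameters(emotion: str, intensity: float) -> dict:
    """Map an emotion plus an intensity in [0, 1] to ambient cue settings.

    All numeric values are illustrative assumptions, not calibrated output.
    """
    palettes = {
        "calm":     (70, 130, 180),   # steel blue
        "curious":  (255, 200, 80),   # warm yellow
        "confused": (150, 110, 200),  # muted violet
        "moved":    (220, 120, 140),  # soft rose
    }
    rgb = palettes.get(emotion, (128, 128, 128))  # neutral grey fallback
    return {
        "light_rgb": rgb,                          # visual cue: color shift
        "light_brightness": 0.3 + 0.7 * intensity, # visual cue: dimming
        "audio_gain_db": -30 + 24 * intensity,     # spatial audio bed level
        "haptic_amplitude": intensity,             # vibration strength
    }
```

A renderer could then feed these parameters to lighting, audio, and haptic layers so all cues stay coherent with one emotional reading.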

Additionally, we aim to make the experience feel more human and grounded by incorporating richer user-generated content. This could include collecting a broader library of voice reflections, micro-reactions, and short memoji-style expressions from real users to provide more authentic social context.

Ultimately, the goal is to create a layered sensory environment where users not only see and hear others’ emotions but also subtly feel them, bridging the gap between digital isolation and shared human experience. This would bring the concept closer to a future where learning, decision-making, and other activities are enriched by the collective emotional intelligence of a community.

Safety

All shared emotional experiences in the system prioritize user consent and privacy. Any voice notes or reflections that a user such as Judy (our example user) listens to in real-world interactions are provided voluntarily by other users. Emotional states are inferred by the neurochip sensors only in real time; for privacy and ethical reasons, the system does not store users' thoughts or experiences.

This ensures that while users can gain insight into collective emotional responses, their personal mental states remain private, and all shared content is intentional and safe.
