PainSight
Detecting pain when patients cannot communicate
Inspiration
Pain is one of the most important signals the human body produces. It tells us when something is wrong, when the body is under stress, and when urgent care may be needed.
However, millions of patients every year are unable to communicate pain clearly. In hospitals, this includes:
- unconscious trauma patients
- sedated ICU patients
- pediatric patients too young to describe their pain
- patients facing language barriers or neurological impairments
In these situations, doctors must rely on indirect signals such as heart rate, blood pressure, or delayed imaging. This can make it difficult to detect hidden injuries quickly or prioritize care when time is critical.
This led us to ask:
How might we help doctors detect and understand pain in patients who cannot communicate it?
That question inspired PainSight.
What it does
PainSight is a speculative neural pain detection system that allows doctors to detect, visualize, and prioritize pain signals in patients who cannot communicate.
Using a neural interface similar to an EEG device, PainSight interprets brain activity associated with pain perception and translates it into a visual body map that highlights probable pain locations and severity.
Key features
Patient Queue Dashboard
- Prioritize patients by urgency and scan need.
Patient Profile
- View vitals, medical history, and communication status.
Neural Pain Scan
- Analyze neural signals associated with pain perception.
Pain Visualization
- Map detected pain signals onto a body visualization.
AI Clinical Report
- Generate a report summarizing pain severity, risks, and possible actions.
PainSight gives clinicians a new sensory capability:
the ability to see pain when patients cannot express it.
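The triage ordering behind the Patient Queue Dashboard could be sketched as a simple sort on urgency and scan need. This is a hypothetical illustration only: the field names and the 1-5 urgency scale are assumptions, since the actual prototype is a Figma design rather than running code.

```python
def triage_order(patients):
    """Return patients most-urgent first; among equal urgency, scan-needed first."""
    # Negate urgency so higher urgency sorts first; `not needs_scan` makes
    # scan-needed patients (False) sort ahead of the rest (True).
    return sorted(patients, key=lambda p: (-p["urgency"], not p["needs_scan"]))

queue = triage_order([
    {"name": "A", "urgency": 2, "needs_scan": False},
    {"name": "B", "urgency": 5, "needs_scan": True},
    {"name": "C", "urgency": 5, "needs_scan": False},
])
print([p["name"] for p in queue])  # ['B', 'C', 'A']
```

A real dashboard would likely re-sort continuously as vitals and scan results update, but the same two-key ordering keeps the most critical, unscanned patients at the top.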
How we built it
We designed PainSight in Figma and used Figma Make to build an interactive prototype that simulates a real hospital workflow.
Our prototype includes:
- a patient triage dashboard
- patient profile views
- a neural scan interface
- a pain visualization screen
- an AI-generated clinical report
To make the concept more tangible, we also 3D printed a mock EEG-style neural interface device that represents how neural signals could be captured in a future clinical setting.
Challenges we ran into
One of the biggest challenges was translating a complex medical concept into a clear and intuitive interface.
Pain detection is inherently subjective, and neural signals are extremely complex. We needed to design a system that simplified this information for clinicians without overwhelming them.
Another challenge was visualizing pain in a meaningful way. We explored several approaches before designing a body visualization with color-coded severity levels that can be quickly interpreted in a triage environment.
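The color-coded severity mapping described above could work roughly like the sketch below. The thresholds, hex colors, and the idea of a normalized 0-1 pain score are illustrative assumptions, not values from the prototype.

```python
# Hypothetical severity-to-color mapping for the body visualization.
# Each entry is (upper bound of normalized pain score, triage color).
SEVERITY_COLORS = [
    (0.25, "#2ecc71"),  # mild: green
    (0.50, "#f1c40f"),  # moderate: yellow
    (0.75, "#e67e22"),  # severe: orange
    (1.00, "#e74c3c"),  # critical: red
]

def severity_color(score):
    """Map a normalized pain score in [0, 1] to a color a clinician can read at a glance."""
    score = max(0.0, min(1.0, score))  # clamp out-of-range scores
    for threshold, color in SEVERITY_COLORS:
        if score <= threshold:
            return color
    return SEVERITY_COLORS[-1][1]

print(severity_color(0.8))  # "#e74c3c"
```

Discrete bands rather than a continuous gradient match the triage use case: a clinician scanning the body map needs a quick categorical read, not a fine-grained color comparison.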
We also focused on designing safeguards to ensure that PainSight supports medical professionals rather than replacing clinical judgment.
Accomplishments that we're proud of
We are proud of designing a concept that transforms an invisible human experience into something clinicians can interpret and act upon.
Our team created a full workflow demonstrating how PainSight could fit into hospital triage systems — from patient intake to neural scanning and clinical reporting.
We also built a working interactive prototype and created a physical EEG prop to bring the concept to life during our demo.
What we learned
This project pushed us to explore how design can translate complex biological signals into meaningful insights.
We learned the importance of designing for clarity, prioritization, and decision-making, especially in high-pressure environments like hospitals.
We also explored how design can introduce new sensory capabilities, enabling people to perceive signals that were previously invisible.
PainSight shows how thoughtful interfaces can bridge the gap between advanced technology and real-world human needs.
What's next for PainSight
In the future, PainSight could evolve into a comprehensive pain monitoring platform integrated with hospital systems.
Possible next steps include:
- continuous neural pain monitoring for ICU patients
- integration with electronic health record systems
- improved neural signal interpretation models
- tracking pain changes over time
- remote monitoring for patients recovering at home
Ultimately, we imagine a future where pain is no longer something doctors must guess — but something they can clearly perceive and respond to.
Built With
- blender
- figma
- onshape