I worked on this project during my postdoctoral studies at the USC Institute for Creative Technologies (USC-ICT).
Ellie is a virtual human designed for healthcare support. She engages the user in a face-to-face interaction and reacts to the perceived user state and intent through her own speech and gestures.
Multisense automatically tracks and analyzes facial expressions, body posture, acoustic features, linguistic patterns, and higher-level behavior descriptors (e.g., attention, fidgeting) in real time. From these signals and behaviors, indicators of psychological distress are inferred to directly inform the healthcare provider or the virtual human.
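To picture the fusion step, here is a minimal Python sketch of combining windowed behavior descriptors into a single distress indicator. The descriptor names, weights, and late-fusion scheme are illustrative assumptions for this write-up, not the actual Multisense feature set or model.

```python
from dataclasses import dataclass

# Hypothetical per-frame behavior descriptors; the real Multisense features
# and fusion model are not shown here.
@dataclass
class BehaviorFrame:
    gaze_aversion: float    # fraction of the frame spent looking away, 0..1
    smile_intensity: float  # normalized facial-expression score, 0..1
    fidgeting: float        # movement energy of hands/torso, 0..1
    voice_energy: float     # normalized acoustic loudness, 0..1

def distress_indicator(frames: list[BehaviorFrame]) -> float:
    """Toy late fusion: average each descriptor over a window, then combine
    them with fixed illustrative weights into a 0..1 indicator."""
    if not frames:
        return 0.0
    n = len(frames)
    avg_gaze = sum(f.gaze_aversion for f in frames) / n
    avg_smile = sum(f.smile_intensity for f in frames) / n
    avg_fidget = sum(f.fidgeting for f in frames) / n
    avg_voice = sum(f.voice_energy for f in frames) / n
    # Illustrative weighting: more gaze aversion and fidgeting, less smiling
    # and vocal energy, push the indicator up.
    score = (0.3 * avg_gaze + 0.3 * avg_fidget
             + 0.2 * (1.0 - avg_smile) + 0.2 * (1.0 - avg_voice))
    return min(max(score, 0.0), 1.0)
```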
I was in charge of generating Ellie's nonverbal behavior (gestures, head nods, gaze patterns, ...) so that she conveys trust and empathy, as sketched below.
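The sketch below shows the general idea of rule-based nonverbal behavior planning: Ellie's utterance and the perceived user state are mapped to behavior tags in a simplified BML-like markup. The function name, the rules, and the tag attributes are assumptions made for illustration; the actual Cerebella rule set and scheduling are more elaborate.

```python
# Hypothetical mapping from an utterance and the perceived user state to
# nonverbal behaviors, expressed as simplified BML-like tags.
def plan_nonverbal(utterance: str, user_distress: float) -> list[str]:
    """Return a list of simplified behavior tags to accompany an utterance."""
    behaviors = []
    # Empathic listening stance when distress is high: a slow nod, steady gaze.
    if user_distress > 0.6:
        behaviors.append('<head type="nod" repeats="1" amount="0.3"/>')
        behaviors.append('<gaze target="user" offset="0"/>')
    else:
        behaviors.append('<gaze target="user" offset="5"/>')
    # Backchannel nod on short acknowledgements.
    if utterance.lower().startswith(("i see", "uh huh", "okay")):
        behaviors.append('<head type="nod" repeats="2" amount="0.2"/>')
    # Open-handed beat gesture on questions, inviting the user to elaborate.
    if utterance.strip().endswith("?"):
        behaviors.append('<gesture lexeme="open_palm" stroke="0.4"/>')
    return behaviors

if __name__ == "__main__":
    for tag in plan_nonverbal("How have you been feeling lately?", user_distress=0.7):
        print(tag)
```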
Built With
- cerebella
- natural-language-processing
- python