Inspiration

In a study of our current hospital space, 93% of nurses surveyed reported that they believed hospitals were understaffed. Without proper care and attention, patients who have no means of accessing the helpline themselves (e.g. those physically unable to move, unable to communicate conventionally, or from marginalized groups) may have critical, life-or-death signals go unnoticed. As one of our own teammates has observed, these patients' only outlet may be vocal outbursts or facial expressions to convey their pain and need for help, cues that are not necessarily picked up by physical monitors. Oftentimes, especially in the absence of family members, their moans and cries echo unnoticed by nurses in understaffed hospitals, until it is too late.

This is where ExpressCare comes in: a “call-button” technology that picks up these neglected cues and alerts the corresponding staff members, making hospitals more empathetic places.

What it does

Let’s consider a low-income, non-native English speaker who is admitted to an understaffed hospital without family members. The language barrier alone poses a serious threat to communication, one that is amplified if the patient is physically confined to the bed. ExpressCare monitors patients in these situations and picks up distress cues, alerting nurses when an overall distress index surpasses our threshold. The ExpressCare “call-button” device then displays an alert on the nurses’ side with the pain-level index and its duration, signaling a call for help.

Our team considered the varying needs in hospital spaces, including patients who need psychological support rather than immediate medical attention. An elderly patient admitted with dementia who is exhibiting cues of confusion as the primary emotion may therefore trigger the system as well. Our device monitors this and sends a milder message to the corresponding staff members, including hospital volunteers or psychologists who may be better suited to address these circumstances.

Nurses can also monitor their patients under normal circumstances, as ExpressCare displays each patient's levels of discomfort throughout the day.

How we built it

ExpressCare uses the Hume API to analyze facial expressions and vocal outbursts and detect varying emotions. We used the batch API and the streaming API to process these inputs side by side, while isolating the specific negative emotions that resemble discomfort. From these, a distress score is calculated, with anger, pain, confusion, and other emotions as underlying triggers.
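The distress-score step can be sketched as a weighted combination of the negative-emotion scores returned for each channel. This is a minimal illustration, not our exact production logic: the emotion names mirror Hume's 0–1 emotion scores, but the weights, the set of emotions, and the alert threshold shown here are illustrative assumptions.

```python
# Illustrative weights for the negative emotions that feed the distress
# index (assumed values, not the exact ones used in ExpressCare).
DISTRESS_EMOTIONS = {"Anger": 1.0, "Pain": 1.5, "Distress": 1.2, "Confusion": 0.8}
ALERT_THRESHOLD = 0.6  # assumed alert cutoff on the 0-1 index


def distress_index(face_scores: dict, voice_scores: dict) -> float:
    """Combine facial and vocal emotion scores (each 0-1) into one
    weighted distress index in the 0-1 range."""
    total, weight_sum = 0.0, 0.0
    for emotion, weight in DISTRESS_EMOTIONS.items():
        # Take the stronger of the two channels for each emotion, so a
        # cue visible in only one modality still counts.
        score = max(face_scores.get(emotion, 0.0), voice_scores.get(emotion, 0.0))
        total += weight * score
        weight_sum += weight
    return total / weight_sum


def should_alert(face_scores: dict, voice_scores: dict) -> bool:
    """True when the combined index crosses the alert threshold."""
    return distress_index(face_scores, voice_scores) >= ALERT_THRESHOLD
```

Taking the per-emotion maximum across the two channels reflects the side-by-side processing described above: a patient who grimaces silently or cries out off-camera still contributes to the index.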

Challenges we ran into

Getting the API to process audio and video simultaneously was the primary challenge. We tried using PyAudio to capture audio files and process them in time intervals alongside video frames, so that the technology could monitor vocal outbursts and facial expressions at the same time. However, a PyAudio installation error kept it from running on our local devices, so we decided to pare down the features we wanted to include in this iteration.

Accomplishments that we're proud of

Addressing patient distress is an incredibly important need that we've identified in the healthcare space, and we are proud of the contribution we've made to this area.

What we learned

We learned a lot throughout this project that will be useful to us in the future, both technically and in terms of market knowledge of AI and of APIs as a product.

What's next for ExpressCare

In the future, we would like to test this with real patients at a San Francisco public hospital to measure the difference in the length of distress periods with and without ExpressCare, as well as to learn whether patients need anything else from this service.

Additionally, we would like to expand its use case to an iteration that measures loneliness, sadness, and boredom in long-term-stay patients, to see whether they need psychological intervention.
