The increasingly widespread use of voice-based interaction inspired us to create an Amazon Alexa skill that provides preliminary medical diagnoses. Clair, our Alexa skill, not only enables individuals to be better informed about their health, but also frees doctors and nurses to spend more time engaging with patients (rather than slowly typing in symptoms). We also see Clair as an accessibility tool: users with accessibility needs can more easily obtain medical information and communicate with healthcare professionals, friends, and family about their health.
What it does
Clair uses a "20 questions" interaction format to ask users whether they are experiencing a particular symptom. Based on the user's yes/no response, Clair uses binary-search-style elimination to intelligently decide which symptom to ask about next, ultimately providing a preliminary diagnosis of the condition. In the companion Alexa app, informational "cards" are displayed about the user's symptoms, which can be viewed by doctors and nurses in a medical setting.
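The elimination loop can be sketched in a few lines (a minimal illustration with made-up conditions and hypothetical function names, not the actual Clair code): each question is chosen so that either answer cuts the remaining candidate conditions roughly in half.

```python
def pick_next_symptom(candidates):
    """Choose the symptom that splits the remaining candidate
    conditions most evenly, so a yes or a no halves the set."""
    symptom_counts = {}
    for condition, symptoms in candidates.items():
        for s in sorted(symptoms):  # sorted for deterministic tie-breaking
            symptom_counts[s] = symptom_counts.get(s, 0) + 1
    half = len(candidates) / 2
    return min(symptom_counts, key=lambda s: abs(symptom_counts[s] - half))

def narrow(candidates, symptom, has_symptom):
    """Keep only the conditions consistent with the user's answer."""
    return {c: syms for c, syms in candidates.items()
            if (symptom in syms) == has_symptom}

# Toy data, not Clair's dataset:
conditions = {
    "common cold": {"cough", "runny nose", "sore throat"},
    "flu":         {"cough", "fever", "body aches"},
    "migraine":    {"headache", "nausea", "light sensitivity"},
    "allergies":   {"runny nose", "sneezing", "itchy eyes"},
}
question = pick_next_symptom(conditions)        # "cough" splits 2/2
remaining = narrow(conditions, "cough", True)   # common cold, flu
```

Repeating pick-then-narrow until one condition remains gives the "20 questions" flow described above.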
How we built it
The Alexa skill is built with the Alexa Skills Kit and backed by an AWS Lambda function written in Python. We generated our dataset from a list of medical conditions and common symptoms.
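A skeleton of what the Lambda backend for such a skill can look like (a hypothetical handler working with the raw Alexa request/response JSON rather than the ASK SDK; Clair's actual code is not shown in this write-up):

```python
def lambda_handler(event, context):
    """Dispatch on the Alexa request type and return the next question."""
    request_type = event["request"]["type"]
    if request_type == "LaunchRequest":
        return build_response("Welcome to Clair. Are you experiencing a cough?")
    if request_type == "IntentRequest":
        intent = event["request"]["intent"]["name"]
        answered_yes = intent == "AMAZON.YesIntent"
        # ...narrow the candidate conditions here, then ask the next question...
        return build_response("Do you have a fever?")
    return build_response("Goodbye.", end_session=True)

def build_response(speech, end_session=False):
    """Wrap speech text in the Alexa response envelope, including a
    companion-app card that mirrors the spoken text."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech},
            "card": {"type": "Simple", "title": "Clair", "content": speech},
            "shouldEndSession": end_session,
        },
    }
```

The "Simple" card type is what surfaces the symptom information in the companion Alexa app mentioned above.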
Challenges we ran into
We originally planned to use the Muse brain-sensing headband and analyze brain waves to determine a "yes" or "no" response to Alexa's question and eliminate the need to reply out loud. However, it proved impossible to integrate the Muse headband with Alexa in the 36 hours that we had for hacking. So, we decided to focus on verbal interactions with Alexa and discontinued the Muse side of the project.
We were unable to find a thorough, usable dataset mapping medical conditions to their symptoms. As a result, we generated our own dataset by pairing each of the 3,400+ conditions in our curated list with 1-20 randomly selected symptoms.
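That generation step amounts to a few lines of Python (the names below are placeholders; the real curated list contained 3,400+ actual conditions and symptoms):

```python
import random

# Placeholder pools standing in for the curated lists.
symptom_pool = [f"symptom_{i}" for i in range(40)]
condition_list = [f"condition_{i}" for i in range(3400)]

# Each condition gets between 1 and 20 distinct random symptoms.
dataset = {
    condition: random.sample(symptom_pool, k=random.randint(1, 20))
    for condition in condition_list
}
```

Because the pairings are random, the resulting diagnoses are only illustrative, which is why better source data is the top item under "What's next" below.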
Accomplishments that we're proud of
The Clair Alexa skill is fully functional and opens the door to a broad range of uses in interactive, smart health. And, even though it didn't make it into the final product, we figured out how to operate the Muse headband, viewed live signal graphs, and looked into building a classification model.
What we learned
We learned that connecting Muse and Alexa is really hard! We also found out that Alexa already plays "20 Questions," our original idea. However, we're excited about the health-focused skill that we created instead!
What's next for Clair
As the availability of data on medical conditions and their symptoms improves, we would like to populate the skill with better data to deliver increasingly accurate diagnoses. We also want to explore sending the preliminary Clair diagnosis directly (and securely) to medical providers. Finally, as a user logs more time with Clair, the diagnoses should become more personalized to the user's health conditions and history.