Inspiration

We were inspired to create Alexa Sage after researching uses of voice user interfaces (VUIs) and seeing how they could apply to medical contexts. One social problem that hit close to home for one of our team members was elderly care and loneliness, as their grandparents had recently been facing health issues. We identified a need for more scalable forms of health promotion and cognitive decline monitoring among seniors: 15% of Canadians over 65 are in institutional care, and depression rates while in care are around 40%, much higher than among the general population of seniors.

Further, VUIs can be more intuitive than smartphones for people who aren't used to them, and allow more organic communication between people and their devices; this matters especially for seniors, who may be more wary of technology. We also had an interest in wellness, positive psychology, and the medical field, and combined these with our skills in backend programming, data science, and psychology to build an Alexa Skill.

What it does

Alexa Sage has two primary components: emotional resilience building (a three-gratitudes exercise) and cognitive acuity monitoring (sentence repetition and analysis). Both exercises are backed by empirical research: gratitude journaling promotes long-term happiness and resilience against depression, and sentence repetition is a standard cognitive test administered as a dementia measure.

Users are prompted on a periodic basis to engage in an informal conversation with Alexa, where they make voice entries in a gratitude journal. The entries are sent to an R API that analyzes their sentiment and dictates Alexa's responses, and happy memories are stored for later replay. During this process, users are also given a brief test of their mental faculties: confirming their name, and checking whether they can remember a short sentence and repeat it back with proper pronunciation.
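
For illustration, here is a minimal sketch of the kind of sentiment endpoint Voiceflow could call; the route name, parameter, and tone thresholds are hypothetical rather than our exact implementation.

```r
# plumber.R: minimal sketch of a sentiment endpoint for gratitude entries
library(plumber)
library(syuzhet)

#* Score a transcribed gratitude-journal entry
#* @param text The journal entry captured by Alexa
#* @post /sentiment
function(text = "") {
  score <- get_sentiment(text, method = "syuzhet")  # negative < 0 < positive
  tone  <- if (score < -0.5) "concerned" else if (score > 0.5) "celebratory" else "neutral"
  list(score = score, tone = tone)  # tone steers which Alexa response Voiceflow plays
}
```

Running `plumber::plumb("plumber.R")$run(port = 8000)` exposes this over HTTP, so a Voiceflow API block can post the transcribed text to it and branch on the response.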

The results of the gratitude journal analysis and cognitive acuity tests are then stored in a Google spreadsheet, where long-term trends or abrupt shifts in emotional affect are identified via an R API and used to notify a user's primary caregiver through the Twilio API. Should users have particularly negative responses when asked to recount something they're grateful for, Alexa can also offer to call their loved ones for them.
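
A rough sketch of what the caregiver alert could look like, assuming a sheet with date and sentiment columns; the window size, threshold, and phone-number handling are placeholders, not our production values.

```r
library(googlesheets4)
library(twilio)  # reads TWILIO_SID / TWILIO_TOKEN from environment variables

check_and_notify <- function(sheet_url, caregiver_phone, twilio_phone) {
  log <- read_sheet(sheet_url)                # assumed columns: date, sentiment
  if (nrow(log) < 6) return(invisible(NULL))  # not enough history yet

  baseline <- mean(head(log$sentiment, -3))   # long-term average, excluding recent sessions
  recent   <- mean(tail(log$sentiment, 3))    # last three sessions

  # Flag an abrupt negative shift relative to the user's own baseline
  if (recent < baseline - 1) {
    tw_send_message(
      to   = caregiver_phone,
      from = twilio_phone,
      body = "Alexa Sage: recent mood entries have dipped. It may be a good time to check in."
    )
  }
}
```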

How we built it

The user interface and the bulk of our program's structure come from Voiceflow, a visual coding tool for building Alexa skills that can integrate with external APIs. We wrote R scripts on one machine to analyze text inputs and used the plumber package to set up an R API that passes character strings to and from a Google Sheets data store. The numerical output from R (a sentiment score from the syuzhet package) is stored and compared both to the user's baseline sentiment and to an absolute threshold for very negative entries; when an entry crosses these thresholds, Alexa offers the user the option to call a loved one, and should they accept, the Twilio API calls or texts their loved one's phone.
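
The per-session flow could look roughly like the sketch below: score the entry, append it to the sheet, and return a flag telling Voiceflow whether to offer a call. The sheet URL, baseline handling, and thresholds here are illustrative assumptions.

```r
library(syuzhet)
library(googlesheets4)

handle_entry <- function(text, sheet_url, user_baseline) {
  score <- get_sentiment(text, method = "syuzhet")

  # Append this session's score so long-term trends can be tracked
  sheet_append(sheet_url, data.frame(date = Sys.Date(), sentiment = score))

  # Offer to call a loved one if the entry is very negative in absolute terms,
  # or well below this user's usual level
  offer_call <- score < -1 || score < (user_baseline - 0.5)
  list(sentiment = score, offer_call = offer_call)
}
```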

Challenges we ran into

This was the first hackathon for three of our four members, so understanding the norms of the event was a big hurdle, and only one of us had much coding experience.

Some highlights of the challenges we faced:

  • Translating psychological concepts, tests, and exercises into a VUI
  • Coordinating Voiceflow with a Mongo database, then pivoting to a spreadsheet plus R data analysis
  • Integrating the Twilio and R APIs with Voiceflow
  • Getting Voiceflow to export to the physical Alexa device
  • The time constraints of the competition
  • Narrowing our focus to what is feasible

What we're proud of & what we learned

  • Charvi learned to use Voiceflow and more about data science and R
  • Anthony is proud we were able to integrate psychological theory with a social impact focus into a cohesive app
  • Yang learned to use the Twilio API
  • Ethan learned how to create an R API

All of us are proud of what we made, and happy that we learned so much in the process.

What's next for Alexa Sage: Promoting cognitive & mental health for seniors

We want to add further tests and exercises to Alexa Sage, to better build emotional resilience and monitor the cognitive health of seniors. Some possibilities are:

  • more cognitive games such as drawing and memory recall
  • suggesting different activities based on the user's mood and time of day, such as reading a book, calling a friend, or going for a walk
  • integrating with a user's medical exercises and medication schedule, to prompt them to complete these

The Alexa Prize, currently underway in the US to develop Alexa's conversational capabilities, is another opportunity: during those longer conversations, we could also conduct cognitive assessments and promote positive psychology habits.

Link: https://docs.google.com/presentation/d/1GrK3_w8w3Fr8feRcucNr2EoXvo0pl_BCWkYn2zdZhGg/edit?usp=sharing
