Inspiration

Hospital nurse-call systems today typically consist of a wired handset, which is inconvenient for patients to use, especially for those who lack full use of their body. In an emergency, it can be difficult or impossible for a debilitated patient to signal for help on their own. With an automated system, the call data could also be collected and put to good use.

What it does

When a patient needs help, they can ask their caretaker, whether a family member or a nurse, to come assist them. Over the course of care, the caretaker can track and view information about their patients to help assess conditions and identify improvements.

How I built it

We used the Amazon Alexa Skills Kit to detect voice commands, which are processed on AWS and passed to our Django backend. Client data is stored in a database and served through an API, which is consumed by both our analytics website and our mobile app.
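The writeup does not show the actual endpoints or models, so the following is a minimal sketch of how the Django side of such a pipeline could look. The `AssistanceRequest` model, field names, and endpoints are hypothetical assumptions for illustration: one endpoint written to by the Alexa skill's AWS handler, one read by the analytics website and mobile app.

```python
# Hypothetical sketch only: model, fields, and endpoints are assumptions,
# not the project's actual schema.

# models.py
from django.db import models

class AssistanceRequest(models.Model):
    """A single call for help raised by a patient via the Alexa skill."""
    patient_id = models.CharField(max_length=64)        # assumed patient identifier
    message = models.TextField(blank=True)              # transcribed request, if any
    created_at = models.DateTimeField(auto_now_add=True)
    acknowledged = models.BooleanField(default=False)   # set once a caretaker responds


# views.py
import json
from django.http import JsonResponse
from django.views.decorators.csrf import csrf_exempt
from django.views.decorators.http import require_http_methods

@csrf_exempt                      # the AWS-side skill handler posts without a CSRF token
@require_http_methods(["POST"])
def create_request(request):
    """Called when a patient asks for help through the Alexa skill."""
    payload = json.loads(request.body)
    req = AssistanceRequest.objects.create(
        patient_id=payload["patient_id"],
        message=payload.get("message", ""),
    )
    return JsonResponse({"id": req.id, "created_at": req.created_at.isoformat()})

@require_http_methods(["GET"])
def list_requests(request):
    """Consumed by the analytics website and mobile app to show open calls."""
    data = [
        {"id": r.id, "patient_id": r.patient_id, "created_at": r.created_at.isoformat()}
        for r in AssistanceRequest.objects.filter(acknowledged=False)
    ]
    return JsonResponse({"requests": data})
```

In a setup like this, the website and mobile app would only ever talk to the API, so the same data backs both clients without duplication.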

Challenges I ran into

Learning the Amazon Alexa APIs was one of the harder parts of the project because of our unfamiliarity with the documentation.

Accomplishments that I'm proud of

We are proud of building a complete platform for our service, taking into account all user roles: administrators, patients, caretakers, and so on. As a result, we were able to deliver a fully featured product.

What I learned

One of the most important things we learned was how to design functionality for a very specific audience. We brainstormed and weighed which features would serve our users best.

What's next for Ear-Out

We ultimately aim to provide a complete system for patients under care. The voice commands could be extended to give patients more options, and the service could be tailored to specific facilities, with detailed maps and visualisations.
