Inspiration
Our team member, Anton, works at a retirement home, where residents get the attention of staff by pressing a button on their wristbands. That single button means healthcare aides have no idea whether they're rushing over to bring a glass of milk or to save a life. We set out to change that - because precious seconds can change lives.
What it does
Anson lets assisted living residents use Google Assistant to contact staff - after all, voice can be far friendlier than a complicated touch interface. Anson also uses Azure Cognitive Services to assess the urgency of each message, and presents staff with a clean interface for viewing resident messages, sorted by priority.
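The urgency assessment and priority sorting could work roughly like this. This is only a sketch: the threshold values and function names below are our illustrative assumptions, not Anson's actual code. It assumes an Azure Text Analytics sentiment score between 0.0 (most negative) and 1.0 (most positive), with more negative messages treated as more urgent.

```javascript
// Sketch: map a sentiment score (0.0 = most negative, 1.0 = most positive)
// to an urgency bucket. Thresholds here are illustrative assumptions.
function urgencyFromSentiment(score) {
  if (score < 0.3) return { level: "high", rank: 0 };   // likely distress
  if (score < 0.7) return { level: "medium", rank: 1 }; // neutral request
  return { level: "low", rank: 2 };                     // routine / positive
}

// The staff dashboard sorts by urgency rank, then by arrival time,
// so the most urgent (and then oldest) messages appear first.
function sortByPriority(messages) {
  return [...messages].sort(
    (a, b) => a.rank - b.rank || a.timestamp - b.timestamp
  );
}
```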
How we built it
- We used React on the front end, which subscribes to a Google Cloud Firestore database of messages.
- We used Actions on Google and Dialogflow to build Anson, a Google Assistant Action that relays messages to staff.
- We used Azure's Cognitive Services API to analyze the urgency of messages.
- We used a Node.js REST API to tie it all together.
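The glue between these pieces can be sketched as a single handler. The field names, urgency thresholds, and injected helpers (`analyzeUrgency`, `saveMessage`) are illustrative assumptions standing in for the Azure Cognitive Services call and the Firestore write, not Anson's actual code; `queryResult.queryText` is the field where a Dialogflow v2 webhook request carries the resident's transcribed speech.

```javascript
// Sketch of the Node.js glue: a Dialogflow webhook handler that scores a
// resident's message and stores it for the staff dashboard. The helpers
// in `deps` are injected stand-ins (assumptions) for the real Azure and
// Firestore clients, which keeps the flow testable without the network.
async function handleResidentMessage(req, deps) {
  const text = req.queryResult.queryText;          // resident's transcribed speech
  const score = await deps.analyzeUrgency(text);   // e.g. Azure sentiment, 0..1
  const doc = {
    text,
    urgency: score < 0.3 ? "high" : score < 0.7 ? "medium" : "low",
    timestamp: deps.now(),
  };
  await deps.saveMessage(doc);                     // e.g. a Firestore collection add
  // Response text spoken back to the resident by Google Assistant.
  return { fulfillmentText: "Okay, a staff member is on the way." };
}
```

In the real app, `saveMessage` would write to the Firestore collection that the React front end subscribes to, so new messages appear on the staff dashboard in real time.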
Challenges we ran into
- We came up with this idea at 1:00 AM after spending 12 hours on an idea that went nowhere.
- We built this in seven hours after a frustrating start to the hackathon.
Accomplishments that we're proud of
- Pivoting late and still shipping a working project.
What we learned
- Unity is hard, AR makes it harder.
- Real-life maps in AR are even harder, especially with poor documentation.
- You can't actually build an iOS app if you don't have a paid developer account.
- Google's documentation could be better.
What's next for Anson.ai
- Support for Amazon Alexa
- Integration with wearables
Built With
- actions
- ai
- azure
- dialogflow
- firebase
- javascript
- node.js
- react
