Inspiration
Any skill presented on a Google Home Mini comes with a wow factor. Our focus was to build an interaction with the device that is light-hearted and uplifting.
What it does
Dr Feel Good uplifts your mood when you're feeling down and saves you from boredom by reading posts aloud. The skill has three functions (the fourth is an ASU easter egg):
- If you're feeling down or sad, it reads out motivational/cheering quotes.
- If you're bored, the Doctor reads posts from the subreddit r/TIL to keep you entertained.
- If you're happy, then you should probably stop annoying the Doc.
- If you feel the feeling of the #1 thing that ASU has, well :)
How we built it
We used Python scripts to fetch the data from r/TIL and stored it in JSON format. Meanwhile, the other devs were busy using JavaScript and Dialogflow to plan the interaction flow of the Dr Feel Good skill, which meant spending a lot of time in Dialogflow, Google's Dialogflow tutorials, and Stack Overflow. Once the data was stored, it was cleaned with Linux sed and other command-line tools, then hardcoded as the response for when the user feels bored. The data for the sad emotion was motivational quotes found on Google, again hardcoded via Dialogflow. The interaction with the Assistant follows the pattern: talk to the Assistant -> default welcome intent -> select an emotion -> respond with the action for that emotion.
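As a rough illustration of the fetch-and-clean step above, here is a minimal sketch in Python. It assumes Reddit's public JSON listing endpoint for r/todayilearned; the function names, the exact prefixes stripped, and the phrasing are our own illustrative choices, not the exact scripts we ran (our cleanup actually used sed).

```python
import json
import urllib.request

def fetch_til_posts(limit=25):
    """Fetch post titles from r/todayilearned via Reddit's public JSON listing.
    (Sketch only; a User-Agent header is required or Reddit rejects the request.)"""
    url = f"https://www.reddit.com/r/todayilearned/top.json?limit={limit}"
    req = urllib.request.Request(url, headers={"User-Agent": "dr-feel-good/0.1"})
    with urllib.request.urlopen(req) as resp:
        listing = json.load(resp)
    return [child["data"]["title"] for child in listing["data"]["children"]]

def clean_title(title):
    """Strip the leading 'TIL' prefix and rephrase so the Assistant
    reads a natural sentence (the role sed played in our pipeline)."""
    for prefix in ("TIL:", "TIL that", "TIL"):
        if title.startswith(prefix):
            title = title[len(prefix):].strip()
            break
    if not title:
        return title
    return "Today I learned " + title[0].lower() + title[1:]
```

The cleaned titles can then be dumped with `json.dump` into a file and pasted into Dialogflow as hardcoded responses.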
Challenges we ran into
The original plan was to have the skill fetch data live from the Reddit API so it could serve new posts every day. Integrating the API through the online editor available in Dialogflow proved very difficult, and after getting nowhere we changed the plan to hardcoding the data. For when the user feels sad, the original plan was to play the song "Happy" by Pharrell Williams, but embedding the audio also proved very hard, given it was our first time using a Home Mini and developing a Google Assistant skill.
Accomplishments that we're proud of
We somehow managed to get the (hardcoded) data into the skill. We faced some initial challenges while setting up the device, but our mentor Ben Wison from Google helped us out. Understanding Dialogflow, its interactions, and its terminology was no easy feat. We got the skill deployed and working in under an hour.
What we learned
No matter how good you are at adapting, some things cannot be accomplished under a time limit; you have to pick the battles you can win. On the other hand, deployment turned out much easier and faster than expected.
What's next for Google Home Mini Dr Feel Good assistant
Adding more emotions (currently four), with scope for hungry, envy, anxiety, etc.; embedding audio for certain emotions to play songs; and, definitely, using an API on the back-end to pull the latest posts from different subreddits.
Built With
- dialogflow
- google-home-mini
- intent
- javascript
- r/til
