Inspiration
I've been working on the Peaberry iOS app for about a year, and (even though it's a massive distraction) I couldn't resist building an Echo and Echo Show companion skill.
What it does
Calculates your water and coffee quantities, shows you brewing guides, gives step-by-step instructions, and more.
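At its core, the quantity feature is a brew-ratio calculation. A minimal sketch, assuming a 1:16 coffee-to-water ratio and a 240 g cup, which are illustrative defaults rather than the skill's actual values:

```javascript
// Hypothetical helper: compute coffee and water amounts for a brew.
// The 1:16 ratio and 240 g per cup are assumed defaults, not the
// skill's real configuration.
function brewQuantities(cups, ratio = 16, gramsPerCup = 240) {
  const waterGrams = cups * gramsPerCup;
  const coffeeGrams = Math.round(waterGrams / ratio);
  return { coffeeGrams, waterGrams };
}

// brewQuantities(2) → { coffeeGrams: 30, waterGrams: 480 }
```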
How I built it
I built it using the amazing Ask CLI and the Alexa Node.js SDK on Lambda. I have a REST API on Heroku that I had partially built for a mobile app and extended for this Alexa skill.
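In the Alexa Node.js SDK, each intent maps to a handler object with `canHandle` and `handle` methods. A minimal sketch of that shape; the `BrewGuideIntent` and `method` slot names are hypothetical, and the real skill would fetch guide data from the Heroku API rather than hardcode it:

```javascript
// Sketch of an intent handler in the Alexa Node.js SDK style.
// 'BrewGuideIntent' and the 'method' slot are hypothetical names.
const BrewGuideIntentHandler = {
  canHandle(handlerInput) {
    const request = handlerInput.requestEnvelope.request;
    return request.type === 'IntentRequest'
      && request.intent.name === 'BrewGuideIntent';
  },
  handle(handlerInput) {
    const method = handlerInput.requestEnvelope.request.intent.slots.method.value;
    // In the real skill this step would call the REST API on Heroku.
    return handlerInput.responseBuilder
      .speak(`Let's get started with your ${method} brew. First, boil your water.`)
      .reprompt('Say next when you are ready for the next step.')
      .getResponse();
  },
};
```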
Challenges I ran into
Designing for voice is a completely different challenge from designing for a screen. Building time delays into the skill was a challenge, and I'm not 100% satisfied with my current solution (playing an audio track). I think in the future I'll probably try to do an informational video or audio track, or perhaps a notification system.
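The audio-track workaround leans on SSML's `<audio>` tag: Alexa plays a clip inline, which doubles as a timer between brewing steps. A sketch with a placeholder clip URL (Alexa requires HTTPS-hosted audio and caps combined inline audio length per response):

```javascript
// Build SSML that inserts an audio clip as a time delay between steps.
// The clip URL is a placeholder; in practice the clip would be, say,
// a 30-second brew timer hosted over HTTPS.
function timedStepSsml(instruction, clipUrl, nextPrompt) {
  return '<speak>'
    + `${instruction} `
    + `<audio src="${clipUrl}"/> `
    + nextPrompt
    + '</speak>';
}
```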
Deploying and testing the skill had historically been a pain, until the Ask CLI was released.
Accomplishments that I'm proud of
Getting the skill to work smoothly across both voice-only (Echo Dot) and screen (Echo Show) devices, along with some of the conversational flows, left me feeling fairly accomplished.
What I learned
I learned a lot about the way that Alexa conversations work, switching between intents, maintaining state, etc.
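State between conversation turns lives in session attributes: each intent reads the current state and writes the next. A minimal sketch of that pattern, with hypothetical intent and state names:

```javascript
// Minimal state machine over Alexa session attributes.
// Intent and state names are illustrative, not the skill's actual ones.
const transitions = {
  IDLE:     { StartBrewIntent: 'BLOOMING' },
  BLOOMING: { NextStepIntent:  'POURING'  },
  POURING:  { NextStepIntent:  'DONE'     },
};

function advanceState(sessionAttributes, intentName) {
  const current = sessionAttributes.brewState || 'IDLE';
  // Unknown intents leave the state unchanged.
  const next = (transitions[current] || {})[intentName] || current;
  // The SDK would persist this via attributesManager.setSessionAttributes.
  return { ...sessionAttributes, brewState: next };
}
```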
What's next for Peaberry Assistant
More brewing tutorials, and smarter conversations with more awareness of the current state.
Built With
- amazon-alexa
- lambda
- node.js