It started as a test to see if I could use Express to access and serve APIs using browser location. Then it became my first attempt at an Alexa skill. Now I've added a visual component using APL to complement the voice response.
What it does
Tells you where you can buy a gyro
How I built it
The skill is currently live without a visual component, running on a Node.js Lambda backend. After several failed re-certification attempts (and to experiment with the newly available Ruby runtime for Lambda), I rewrote the Lambda functionality in Ruby and added APL support.
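A minimal sketch of what that Ruby Lambda handler looks like: a plain handler method that builds an Alexa JSON response and attaches an `Alexa.Presentation.APL.RenderDocument` directive only when the device reports APL support. The helper name, document body, and speech text here are illustrative, not the skill's actual code.

```ruby
# Hypothetical sketch: `build_apl_directive` and the inline APL document
# are illustrative stand-ins for the skill's real rendering logic.
def build_apl_directive(text)
  {
    type: 'Alexa.Presentation.APL.RenderDocument',
    token: 'gyroToken',
    document: {
      type: 'APL',
      version: '1.0',
      mainTemplate: {
        items: [{ type: 'Text', text: text, fontSize: '40dp' }]
      }
    }
  }
end

def handler(event:, context:)
  speech = 'The nearest gyro spot is Example Gyros.' # placeholder text
  response = {
    version: '1.0',
    response: {
      outputSpeech: { type: 'PlainText', text: speech },
      shouldEndSession: true
    }
  }
  # Attach the APL directive only when the device supports APL.
  interfaces = event.dig('context', 'System', 'device', 'supportedInterfaces') || {}
  if interfaces.key?('Alexa.Presentation.APL')
    response[:response][:directives] = [build_apl_directive(speech)]
  end
  response
end
```

Gating on `supportedInterfaces` matters because the same skill serves both screenless Echo devices and Echo Show-style devices; sending an APL directive to a device that doesn't support it produces an error.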
Challenges I ran into
- Alexa can struggle to interpret my invocation phrase ("I need a gyro") correctly, often hearing "gyro" as "euro". This appears to have been a significant issue with certification attempts this time around.
- APL took me some time to grok and manipulate. I wish I had access to a more intuitive API for working with APL.
Accomplishments that I'm proud of
Quickly developing a set of Ruby tools to handle Alexa interactions from a Ruby Lambda deployment.
What I learned
What's next for I Need A Gyro
- Extracting the device- and APL-specific logic into a gem to ease future skill development
- Prompting users for their address if they haven't granted the device-address permission to the skill
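The permission prompt above could be sketched as follows: when the request context lacks a consent token, respond with an `AskForPermissionsConsent` card asking for the device-address scope. The field names follow the Alexa Skills Kit response format, but the wiring and speech text are an assumption about how the feature would look, not implemented behavior.

```ruby
# Hedged sketch of the planned fallback for missing address permission.
ADDRESS_PERMISSION = 'read::alexa:device:all:address'

def address_permission_granted?(event)
  # A consent token is present only when the user has granted permission.
  !event.dig('context', 'System', 'user', 'permissions', 'consentToken').nil?
end

def build_response(event)
  if address_permission_granted?(event)
    { version: '1.0',
      response: {
        outputSpeech: { type: 'PlainText',
                        text: 'Looking up gyros near you.' } } }
  else
    # Renders a card in the Alexa app prompting the user to grant permission.
    { version: '1.0',
      response: {
        outputSpeech: { type: 'PlainText',
                        text: 'Please grant the address permission in the Alexa app.' },
        card: { type: 'AskForPermissionsConsent',
                permissions: [ADDRESS_PERMISSION] } } }
  end
end
```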