It started as a test to see if I could use Express to access and serve APIs using browser location. Then it became my first attempt at an Alexa skill. Now I've added a visual component using APL to complement the voice response.

What it does

Tells you where you can buy a gyro

How I built it

The skill is currently live without a visual component, running on a Node.js Lambda backend. After several failed re-certification attempts (and to experiment with the newly available Ruby runtime for Lambda), I rewrote the Lambda functionality in Ruby and added APL support.
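A Ruby Lambda handler for an Alexa skill boils down to reading the intent out of the request event and returning a response hash in the shape Alexa expects. A minimal sketch, with a hypothetical intent name and speech text standing in for the skill's actual logic:

```ruby
# Minimal Alexa request handler for a Ruby Lambda deployment.
# 'FindGyroIntent' and the speech strings are illustrative placeholders,
# not the skill's actual code.
def handler(event:, context:)
  intent = event.dig('request', 'intent', 'name')

  speech =
    case intent
    when 'FindGyroIntent' # hypothetical intent name
      'The closest gyro is at the corner deli.'
    else
      "Sorry, I didn't catch that."
    end

  # Alexa expects this envelope shape in the Lambda's return value.
  {
    'version' => '1.0',
    'response' => {
      'outputSpeech' => { 'type' => 'PlainText', 'text' => speech },
      'shouldEndSession' => true
    }
  }
end
```

With the JavaScript SDK this envelope is generated for you; in Ruby it has to be assembled by hand, which is what most of the rewrite amounted to.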

Challenges I ran into

  1. There is no robust official Alexa SDK for Ruby like the one I had used for JavaScript, so I had to build much of the request/response handling myself.
  2. Alexa can struggle to correctly interpret my wake phrase ("I need a gyro"); in particular, it often hears "gyro" as "euro". This appears to have been a significant issue with certification attempts this time around.
  3. APL took me some time to grok and manipulate. I wish I had a more intuitive API to work with APL.
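Without SDK helpers, adding an APL visual means hand-building an `Alexa.Presentation.APL.RenderDocument` directive as a nested hash. A sketch of what that looks like, with an illustrative document body (the token and text content are placeholders, not the skill's real template):

```ruby
# Hand-built APL RenderDocument directive of the kind the JavaScript SDK
# would otherwise assemble. The document contents here are illustrative.
def apl_directive(title)
  {
    'type' => 'Alexa.Presentation.APL.RenderDocument',
    'token' => 'gyroToken', # arbitrary token used to reference this document
    'document' => {
      'type' => 'APL',
      'version' => '1.0',
      'mainTemplate' => {
        'items' => [
          { 'type' => 'Text', 'text' => title, 'fontSize' => '40dp' }
        ]
      }
    }
  }
end
```

The directive then goes into the response's `directives` array alongside the voice output; devices without a screen simply ignore it.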

Accomplishments that I'm proud of

Quickly developing a set of Ruby tools to handle Alexa interactions from a Ruby Lambda deployment.

What I learned

I really appreciate the Alexa-specific SDK tools Amazon offers in JavaScript, and hope they offer the same for Ruby soon.

What's next for I Need A Gyro

  1. Extracting the device- and APL-specific logic into a gem to ease future skill development
  2. Prompting users for their address if they haven't granted the skill device-address permission
