Prior to attending HackGT, I had experimented with AWS Lambda by building a simple Alexa skill that helped users plan their shopping lists at local supermarkets, automatically looking up the cheapest and healthiest options. Coming into this competition, I knew I wanted to keep growing my knowledge of skill development, but I did not know exactly what I would create. So when I saw the Amazon sponsor challenge being offered, it felt like a dream come true: I was not only able to pursue my passion and single-handedly build a functioning Alexa skill from scratch in a very short time, but I also got to tackle issues that relate directly to me and the entire Georgia Tech student body on a personal level: our daily lives and the hardships we face under their academic and extracurricular demands.

In accordance with the prompt specified by Amazon, I designed my skill around four tangible aspects of a typical Georgia Tech student's routine: the never-ending search for internships and job opportunities, the struggle to manage a sleep schedule and stay well rested for the days ahead, the challenge of managing finances and making decisions that pay off in the long run, and the initial worry of navigating Tech's very large campus (especially common among freshmen). To support these four capabilities, I needed to learn how to build a dynamic intent schema in the Alexa developer console, since users can invoke the skill in more than one way. I therefore used a single phrase, "Student Skill," as the base invocation, after which Alexa prompts the user for more details and determines which intent and slot combination to invoke.
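The routing described above can be sketched as a Lambda handler that dispatches on the intent name Alexa resolves from the user's follow-up. This is a minimal illustration, not the project's actual code; the intent names (`CareerIntent`, etc.) and prompts are assumptions.

```python
def build_response(speech_text, end_session=True):
    """Wrap speech text in the JSON envelope Alexa expects back from Lambda."""
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": speech_text},
            "shouldEndSession": end_session,
        },
    }

def handle_career(slots):
    # Placeholder: a real handler would query a jobs API using the slot values.
    return build_response("Here is the latest internship posting.")

def lambda_handler(event, context):
    """Route incoming Alexa requests to the matching intent handler."""
    request = event["request"]
    if request["type"] == "LaunchRequest":
        # "Student Skill" was invoked bare; keep the session open and prompt.
        return build_response(
            "Welcome to Student Skill. Ask about jobs, sleep, "
            "finances, or getting around campus.", end_session=False)
    if request["type"] == "IntentRequest":
        name = request["intent"]["name"]
        if name == "CareerIntent":
            return handle_career(request["intent"].get("slots", {}))
        # SleepIntent, FinanceIntent, and CampusIntent would be routed here too.
        return build_response("Sorry, I didn't catch that. Please try again.")
```

The single-entry design works because Alexa sends a `LaunchRequest` for the bare invocation and an `IntentRequest` once the user supplies enough detail to match an intent.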

For the student career lookup, I made calls to the Muse API, which returns job and internship opportunities in a well-structured JSON format. I parsed this data and split it into categories based on how recently a position was posted, where it is located, and what skill level it requires. Users can therefore filter out options irrelevant to their search goals simply by specifying a few key details to Alexa. Next, for the Sleep Management and Financial Advice features, I used Alexa's speech recognition to capture important details from the user and pass them to the Lambda function, such as hours of sleep per week, annual budget, important goals, and how rested they generally feel. Based on these characteristics, Alexa offers each user specific recommendations, grounded in standard medical guidance and published best practices, to help them optimize their well-being.
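The filtering step might look like the sketch below. The field names (`publication_date`, `locations`, `levels`) follow the public Muse jobs endpoint's response shape as I understand it, but treat them as assumptions rather than a spec; the function is written as a pure filter over already-fetched results.

```python
from datetime import datetime, timedelta, timezone

def filter_jobs(results, location=None, level=None, max_age_days=30):
    """Keep only postings matching the user's spoken slot values.

    results: the "results" list from a Muse API jobs response.
    """
    cutoff = datetime.now(timezone.utc) - timedelta(days=max_age_days)
    matches = []
    for job in results:
        # Drop stale postings (recency filter).
        posted = datetime.fromisoformat(
            job["publication_date"].replace("Z", "+00:00"))
        if posted < cutoff:
            continue
        # Drop postings that don't match the requested location or level.
        if location and location not in {loc["name"] for loc in job["locations"]}:
            continue
        if level and level not in {lvl["name"] for lvl in job["levels"]}:
            continue
        matches.append(job["name"])
    return matches
```

In the skill, the `location` and `level` arguments would come straight from the slot values Alexa resolves from the user's utterance.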

Finally, for the intent specific to Georgia Tech students, I collected the GPS coordinates of all the bus stops on Georgia Tech's campus, the opening and closing times of Tech's major dining halls and restaurants, and general campus happenings and events drawn from the official calendar, Twitter, and Reddit (pulled in as dependencies through Python Flask). When the user invokes this intent and asks for general advice, their location is obtained from their personal device through the Google Maps Location Services API. The distances to all the bus stops are computed efficiently with Python's numpy library, and the nearest stop is returned to the user, along with the best dining option and campus event to attend given the user's schedule and free time during the day.
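The nearest-stop lookup can be sketched with a vectorized haversine distance in numpy; the stop names and coordinates below are illustrative placeholders, not the project's actual campus data.

```python
import numpy as np

def nearest_stop(user_lat, user_lon, stops):
    """stops: list of (name, lat, lon) tuples.

    Returns the closest stop's name and its great-circle distance in
    meters, computed with the haversine formula across all stops at once.
    """
    names = [s[0] for s in stops]
    lat = np.radians(np.array([s[1] for s in stops]))
    lon = np.radians(np.array([s[2] for s in stops]))
    ulat, ulon = np.radians(user_lat), np.radians(user_lon)
    dlat, dlon = lat - ulat, lon - ulon
    a = np.sin(dlat / 2) ** 2 + np.cos(ulat) * np.cos(lat) * np.sin(dlon / 2) ** 2
    dist = 2 * 6_371_000 * np.arcsin(np.sqrt(a))  # Earth radius ~6371 km
    i = int(np.argmin(dist))
    return names[i], float(dist[i])
```

Because the trigonometry is applied to whole arrays, one call covers every stop on campus without an explicit Python loop.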

It was quite a challenge to integrate both Python and JavaScript dependencies into the same Alexa skill. However, by reading through the documentation and asking questions at the Amazon help table, I figured out how to configure the developer console and the Lambda function so that code in both languages could make up the backend logic. Overall, I had a great time developing this Alexa skill, and I am happy to report that it works with a level of accuracy far beyond my initial expectations. Please use this skill if you get a chance, and I hope you enjoy using it as much as I enjoyed making it!
