We were inspired by the idea of giving a brand new voice-based interface to the existing (and awesome) BlackRock Aladdin API, one that BlackRock's end users can benefit from. Amazon's Alexa provides an ideal platform to implement this: a state-of-the-art speech recognition system backed by Amazon's robust cloud services.

With this skill deployed on Amazon Alexa, users can perform some of the most useful operations they previously did through the BlackRock web service: for example, they can get quotes for the stocks they own, or find the top-performing funds of the day. They can also ask how their portfolios performed over the last year, month, or day. In short, it gives a hands-free, very convenient interface for interacting with the BlackRock APIs through natural-language commands.
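As a rough illustration, a quote-lookup command like the ones above could map to an Alexa interaction model along these lines (the intent name, slot name, and sample utterances here are hypothetical examples, not the skill's actual configuration):

```json
{
  "interactionModel": {
    "languageModel": {
      "invocationName": "aladdin",
      "intents": [
        {
          "name": "GetStockQuoteIntent",
          "slots": [
            { "name": "symbol", "type": "AMAZON.SearchQuery" }
          ],
          "samples": [
            "what is the quote for {symbol}",
            "how is {symbol} doing today"
          ]
        }
      ]
    }
  }
}
```

When a user says one of the sample utterances, Alexa resolves the spoken stock name into the `symbol` slot and forwards the intent to the skill's backend.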

We built it using the Aladdin APIs provided by BlackRock to fetch all the relevant data from BlackRock's database. The backend was developed in Node.js and deployed on AWS Lambda, a serverless compute service that scales its resources on demand. The Alexa skill invokes this Lambda function to process the user's requests and return responses as speech. All the Alexa configuration, describing the various intents, slots, and utterances, was done in the Amazon Developer console.

The biggest challenge we faced was the unavailability of a real Echo device to test our service, as the text-to-speech simulator was not enough to show the real usability of the application. Other challenges were our limited knowledge of the BlackRock Aladdin API and how to use it to solve end users' problems, and our inexperience with AWS Lambda and Node.js.

We simplified getting financial information from a complex data-analysis task into a simple, fun question-and-answer exercise. The user just asks Aladdin on Alexa a question in plain English, and it churns through the data to return a meaningful response that is easy to interpret. It also opens a whole new avenue for developing speech-based apps through which financial institutions like BlackRock can serve their end users. Personally, learning to use a financial institution's API like BlackRock's to serve users and surface useful information was fun! We also learned a whole new framework for developing Alexa skills and picked up essential tools like Node.js and AWS Lambda.

What's next? Maybe Jasmine :P No :), we have thought of great end-user-centered use cases for Aladdin, like adding or dropping stocks from portfolios, or suggesting hot stocks based on market conditions and the user's profile. The user could ask BlackRock for its top recommendations to maximize their earnings.
