I thought about what a sales manager looks for and it boiled down to revenue. Seeing what leads are open, getting their value, and knowing what the month will look like are critical business operations. I was also inspired to make something visually appealing so it gets the point across in a way that words do not. Since the premise is that the user should stay in the Cisco Spark platform, I wanted to make it easy for users to access their Pardot information either by voice or by typing a quick command.
What it does
The user authenticates with Cisco Spark and Pardot. They are then able to get a monthly forecast of the estimated value of open leads, see the top 5 open leads, and know the probability of those leads closing within the month.
As an added benefit, the user can also have Alexa read the messages aloud, create new rooms, and add new team members, all hands-free.
The information is also shown as an infographic with real-time data from Salesforce Pardot.
There are two prerequisites:
- You must have a Cisco Spark account.
- You must have a Pardot account.
Go to https://emilytlam.com/pardot.html to authenticate your Cisco Spark account and Pardot information.
You can type the following commands:
- /opportunities to get your top 5 leads for this month, ranked by the probability of those leads closing.
- /forecast to get the total number of open leads and the sum of their value for this month.
Or you can type "Tell me about my opportunities" and "What's my forecast?"
Then view the infographic to see the pie chart and bar graph that are dynamically rendered.
Easter egg: type yay!
To use with Alexa:
- You must link your account with the Sal the Bot Skill in the Alexa app.
You can say the following after "Alexa...": "list my rooms," "read my messages," "create a new room," "get my teams," or "create a new team."
If you get lost at any time, you can say "menu", "help", or "stop".
How I built it
I used an OAuth flow to authenticate the user with both Cisco Spark and Pardot. Their email address and access tokens are stored in a DynamoDB table. A webhook is registered with Cisco Spark that points to an AWS Lambda function exposed through API Gateway, so the function receives a POST request whenever a new message is created. The function fetches the user's information from DynamoDB, makes the necessary calls to the Pardot API, and posts a markdown-formatted reply back to Cisco Spark. From there, the user can click a link that renders the information as an infographic.
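The flow above can be sketched roughly as follows. This is a minimal illustration, not the production code: the table name, payload field names, and the empty lead list are assumptions, and the DynamoDB lookup and Pardot/Spark HTTP calls are stubbed out as comments.

```python
# Sketch of the Lambda webhook handler. Table and field names are
# illustrative assumptions; network calls are stubbed as comments.
import json

USERS_TABLE = "spark_pardot_users"  # hypothetical DynamoDB table name


def format_forecast(leads):
    """Render the markdown reply for the /forecast command."""
    total = sum(lead["value"] for lead in leads)
    return ("**Forecast for this month**\n"
            f"- Open leads: {len(leads)}\n"
            f"- Estimated value: ${total:,.2f}")


def handler(event, context):
    """Invoked by API Gateway when Cisco Spark POSTs a webhook event."""
    message = json.loads(event["body"])["data"]
    # 1. Look up the sender's Spark and Pardot tokens in DynamoDB
    #    (boto3 call omitted in this sketch).
    # 2. Query the Pardot API for this month's open leads.
    # 3. POST a markdown reply back to the Spark room via the
    #    Spark /messages endpoint, using the stored access token.
    reply = format_forecast([])  # leads would come from Pardot here
    return {"statusCode": 200, "body": reply}
```

The pure `format_forecast` helper keeps the message rendering separate from the AWS plumbing, which makes it easy to test locally.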
Challenges I ran into
I would have liked to render the SVG file directly in the Cisco Spark platform, but it was not one of the file types supported by the API.
I also had difficulty setting up a bot, but once I understood the concept of webhooks it clicked for me. Then I wasn't entirely sure I needed a bot after all because of its limitations (the bot has to be in the chat room, has to be mentioned, etc.). The distinction between an integration, an application, and a bot blurred for me as I moved between Alexa and my integration, which turned into an application but acted like a bot. As a result, debugging access tokens was difficult because in some cases I wasn't sure whether the problem was with Pardot, Cisco, API Gateway, or Lambda, or whether an access token had simply expired.
Accomplishments that I'm proud of
I'm proud of building an application with many moving parts, calling multiple APIs to not only process information but render it in a visually pleasing manner. I also focused on a vision and implemented a solution around that concept rather than cobbling together what I could based on what I knew. I wanted my application to be practical, but also visual. And I'm happy to say that when you type in yay!, you will see a smiley face. It's there to remind me that, in the end, the effort is worth it because I learned to become a better developer.
What I learned
I learned a great amount about webhooks, setting up an API to handle POST requests from Cisco Spark via API Gateway, debugging authentication errors (Pardot tokens are only valid for one hour...), and how to use DynamoDB to read and write table entries. Using CloudWatch to inspect log messages was crucial for debugging. I also learned to deploy Lambda functions with Apex, which was a huge time saver.
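Since Pardot tokens are only valid for one hour, a simple expiry guard avoids a whole class of confusing authentication errors. Below is a hedged sketch of that idea; the field names and storage shape are assumptions, and the actual re-login request is left as a comment.

```python
# Sketch of a token-expiry guard for Pardot's one-hour API keys.
# Field names and the stored-record shape are illustrative assumptions.
import time

TOKEN_TTL_SECONDS = 60 * 60  # Pardot API keys expire after one hour


def token_expired(issued_at, now=None, ttl=TOKEN_TTL_SECONDS):
    """Return True when a token issued at `issued_at` has expired."""
    now = time.time() if now is None else now
    return (now - issued_at) >= ttl


def get_api_key(stored):
    """Return a usable API key, re-authenticating when needed."""
    if not token_expired(stored["issued_at"]):
        return stored["api_key"]
    # Re-login here: POST the user's credentials to Pardot's login
    # endpoint, then persist the fresh api_key and a new issued_at
    # timestamp back to DynamoDB (request omitted in this sketch).
    raise NotImplementedError("re-authentication call goes here")
```

Checking the timestamp before every batch of Pardot calls means an expired token triggers a re-login instead of a cryptic API error.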
What's next for Salazar the Spark Bot
- Making the design flow look and feel more similar to the Cisco Spark authentication flow.
- Having Salazar make Pardot requests through Alexa.
- Incorporating more natural language processing into the Salazar application, especially adding more commands and intents.
- Finishing the Watson sentiment analysis for the Alexa skill, which is currently in development mode.