We wanted to bring strong recipe search to a voice interface, since almost all recipe skills on Alexa had to start with ingredients, which is not how most of us think about cooking.
What it does
It searches all major cooking sites for recipes. Once a result matches the user's query, the website is parsed dynamically at runtime and voice content is created on the fly, along with any image or video source (if available, for display on devices with screens). This gives us access to almost 5 million recipes on the web, and you can ask Alexa to read out any of them.
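As a rough illustration of that on-the-fly step, a parsed recipe page could be flattened into text for Alexa to read aloud. This is a minimal sketch with made-up names (`Recipe`, `toSpeech`), not the skill's actual code:

```java
import java.util.List;

public class RecipeToVoice {

    // Common shape that every site-specific parser would produce
    // (image URL kept for devices with screens).
    record Recipe(String title, List<String> ingredients, List<String> steps,
                  String imageUrl) {}

    // Flatten the structured recipe into a single spoken string.
    static String toSpeech(Recipe r) {
        StringBuilder sb = new StringBuilder();
        sb.append(r.title()).append(". You will need: ");
        sb.append(String.join(", ", r.ingredients())).append(". ");
        for (int i = 0; i < r.steps().size(); i++) {
            sb.append("Step ").append(i + 1).append(": ")
              .append(r.steps().get(i)).append(" ");
        }
        return sb.toString().trim();
    }

    public static void main(String[] args) {
        Recipe r = new Recipe("Pancakes",
                List.of("flour", "milk", "eggs"),
                List.of("Mix everything.", "Fry until golden."),
                null);
        System.out.println(toSpeech(r));
    }
}
```

In the real skill the input comes from parsing a live web page rather than a hand-built record, but the voice output is assembled dynamically in the same spirit.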
How we built it
We use the Bing Search API to search for keywords on targeted websites, and we wrote parsing logic for each website in Java. This lets us create content dynamically: all text and media content (images and videos) is parsed at runtime. We rank every user's searches (not linked to an individual user) in Elasticsearch to provide alternative display options, taking into account both the most popular searches and each user's most recent ones. We use PHP and MySQL for all backend operations.
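Restricting the search to sites we have parsers for can be done with Bing's `site:` operator. The sketch below builds such a request URL against the documented Bing Web Search v7 endpoint; the site list and method names are illustrative, and a real call would also send the `Ocp-Apim-Subscription-Key` header:

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;
import java.util.List;
import java.util.stream.Collectors;

public class BingQueryBuilder {

    // Documented Bing Web Search v7 endpoint.
    static final String ENDPOINT = "https://api.bing.microsoft.com/v7.0/search";

    // Combine the spoken query with OR-ed site: filters so results only
    // come from websites we know how to parse.
    public static String buildQuery(String spoken, List<String> sites) {
        String siteFilter = sites.stream()
                .map(s -> "site:" + s)
                .collect(Collectors.joining(" OR "));
        String q = spoken + " (" + siteFilter + ")";
        return ENDPOINT + "?q=" + URLEncoder.encode(q, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // Example site list is hypothetical, not the skill's actual targets.
        System.out.println(buildQuery("chicken tikka masala",
                List.of("allrecipes.com", "bbcgoodfood.com")));
    }
}
```

The top matching URL from the response would then be routed to that site's Java parser.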
Challenges we ran into
The biggest challenge was getting the skill approved; once a few issues were resolved, it was full steam ahead. It was also the first time we used Elasticsearch as an AI engine rather than just for analytics. With this skill, Elasticsearch is no longer a passive system: it actively participates in providing content.
What we learned
We learnt more about APL (Alexa Presentation Language) and its general dos and don'ts, which was great. We are also trying to improve our video-making process so that we can promote our skills more easily in future.
What's next for Cook Buddy
We plan to keep adding websites whose content suits this skill, which will make more recipes available to users. We will also act on user feedback, fix any bugs, and make sure the skill stays in good working order.