Voice is the first language we learn to interact with the world. So why can't we use our voice to manage our finances?

What it does

We built a voice command application, Money Penny, on Google Home to manage your finances. Get personal recommendations and use your voice like a fingerprint to protect your valuable financial data. Money Penny also learns your personality to recommend a financing plan suited to your needs.

How we built it

We built the application using Google Home and Google API.AI. For authentication we use Microsoft Cognitive Services on Azure, backed by a Python server. The frontend is built with jQuery, HTML, and CSS.
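The pipeline above can be sketched as a small fulfillment handler: API.AI parses the spoken command into an intent, and the Python backend maps that intent to a reply that Google Home reads aloud. The intent name, parameter keys, and balance data below are illustrative assumptions, not our actual implementation.

```python
# Minimal sketch of an API.AI (Dialogflow v1) webhook fulfillment handler.
# The "check_balance" intent and FAKE_BALANCES are hypothetical examples.

FAKE_BALANCES = {"checking": 1250.40, "savings": 8300.00}

def handle_webhook(request_json):
    """Map an API.AI webhook request to a spoken response for Google Home."""
    result = request_json.get("result", {})
    intent = result.get("metadata", {}).get("intentName", "")
    params = result.get("parameters", {})

    if intent == "check_balance":
        account = params.get("account", "checking")
        balance = FAKE_BALANCES.get(account)
        if balance is None:
            speech = "Sorry, I don't know the account {}.".format(account)
        else:
            speech = "Your {} balance is {:.2f} francs.".format(account, balance)
    else:
        speech = "Sorry, I didn't understand that."

    # In the v1 webhook response format, "speech" is what the device reads aloud.
    return {"speech": speech, "displayText": speech}
```

A real deployment would wrap this function in a small web server (e.g. Flask) and add the speaker-verification check before answering any balance question.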

Challenges we ran into

The biggest challenge was building an intuitive user experience. Since voice as a UI is relatively new, we conducted a lot of user testing to make our voice application intuitive and easy to use.

Accomplishments that we're proud of

Showing the vision of banking 3.0. By leveraging voice across several use cases, we show that it can deliver a better experience. And by using Azure Cognitive Services, we can learn the user's personality and recommend the right financing solution.

What we learned

Voice applications need a lot of planning and user testing to get right. Since we can rely only on sound, we have to think differently when designing a voice application.

What's next for Money Penny

The next step would be to connect Money Penny to a real transaction system (e.g. TWINT) to transfer real money with your voice.
