In our busy day-to-day lives, we can sometimes lose sight of the little things that bring us joy. Our assistant will remind you. Through your Amazon Echo or Google Assistant, you can check in with an intelligent AI that tracks trends in your mood and suggests simple but effective things to do when you are feeling down.

What it does

The user checks in once a day. The check-in can be a verbal diary entry or just a sentence or two about the day. If the AI detects increasingly negative emotions over several consecutive days, it suggests a small self-care activity to the user.
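The streak check described above can be sketched as a small helper. This is a minimal illustration, not our exact code: the function name, tone labels, and default streak length are assumptions (the write-up only specifies that several consecutive negative check-ins trigger a suggestion).

```javascript
// Tones we treat as negative for streak-counting purposes (illustrative set).
const NEGATIVE_TONES = new Set(["sadness", "anger", "fear"]);

// Returns true when the most recent check-ins form an unbroken
// run of negative tones of the required length.
function shouldSuggestSelfCare(recentTones, streakLength = 3) {
  const recent = recentTones.slice(-streakLength);
  return (
    recent.length === streakLength &&
    recent.every((tone) => NEGATIVE_TONES.has(tone))
  );
}
```

For example, `shouldSuggestSelfCare(["joy", "sadness", "anger"])` stays `false` because the streak is broken, while three negative tones in a row return `true`.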

How we built it

The sentiment analysis for this project uses IBM Watson's Tone Analyzer API. We connected it to Voiceflow and, using the Voiceflow platform together with our custom JavaScript code, arrived at our final product. Each time the Tone Analyzer API returns an emotion, we log it in our database. If the user has three consecutive negative conversations with the assistant, we suggest a self-care activity drawn at random from our database.
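The two post-processing steps above can be sketched as follows. The response shape matches the Tone Analyzer's documented `{ document_tone: { tones: [...] } }` format; the function names and the activity list are hypothetical, not our actual code.

```javascript
// Pick the strongest tone from a Tone Analyzer response.
// Watson scores each detected tone from 0 to 1.
function dominantTone(response) {
  const tones = response.document_tone.tones;
  if (tones.length === 0) return null;
  return tones.reduce((best, t) => (t.score > best.score ? t : best)).tone_id;
}

// Choose a random self-care activity, as we do from our database.
function randomSuggestion(activities) {
  return activities[Math.floor(Math.random() * activities.length)];
}
```

In our flow, the dominant tone is what gets logged per conversation, and `randomSuggestion` stands in for the random database lookup.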

Challenges we ran into

The biggest hurdles came at the very beginning. The IBM Tone Analyzer requires authentication, but Voiceflow does not have an authentication field in its API Block, so we had to find a workaround. We also fixated for far too long on minor implementation issues that we should have abandoned from the start.

Accomplishments that we're proud of

Before starting this project we knew nothing about APIs, let alone the Voiceflow API Block and the Tone Analyzer API. Now, we are proud to announce that we know a little bit more than nothing about APIs. We are also proud to have gained a deeper understanding of voice assistants and how they can be designed.

What we learned

We learned the fundamentals of APIs, got comfortable with the specific skills required to use Voiceflow and the Tone Analyzer, and learned that it is important to settle on an idea and a plan B well before the hacking period begins. We also became more familiar with the design process of a multi-faceted software product.

What's next

In the future, the assistant could grow more intelligent and better connected to other APIs. For example, when you are feeling down it could suggest playing your favorite song by connecting to your Spotify account, or suggest ordering your favorite meal by connecting to, say, UberEats. These features would require building a companion application where the user can customize their self-care suggestions. The responses to negative emotions could also be refined: the Watson Tone Analyzer API can identify more specific emotions, such as fear, anger, and sadness, which could be used to generate more mood-appropriate responses.

Built With

ibm-watson
javascript
voiceflow