Dasha uses ML models to listen, understand, and answer just like a human does. That gave us the idea of using Dasha AI during a mental health crisis, when another person may or may not be around to help.

What it does

Dasha has a call feature in which it asks you questions and gauges your emotions from the way you speak. It helps you take the stress off by talking with you and checking in until you feel better.

How we built it

We built an app called DashiMotion that detects your emotions and has Dasha call you when you are stressed or sad. This is where Dasha becomes your emotional-support friend BayMax: it stays on the line and listens to you until you feel better. We added custom intents for friendly conversation between the user and BayMax, and digressions to keep the conversation smooth and flowing for a more human-like interaction. Understanding human emotions is complex, but with BayMax powered by Dasha AI you'll never feel that no one understands you. Remember, there's always someone who listens to you (it may or may not be human, but it's still your friend).
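The trigger logic described above can be sketched roughly as follows. This is a minimal illustration, not the app's actual code: the keyword list, scoring rule, and threshold are all assumptions, and the real app would hand the decision off to Dasha's call API rather than just return a boolean.

```python
# Hypothetical sketch of DashiMotion's trigger step: score a user's
# message for stress-related words and decide whether to start a
# support call. Keywords and threshold are illustrative assumptions.

STRESS_KEYWORDS = {"stressed", "anxious", "sad", "overwhelmed", "tired", "alone"}

def stress_score(message: str) -> float:
    """Return the fraction of words that match a stress keyword."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    if not words:
        return 0.0
    hits = sum(1 for w in words if w in STRESS_KEYWORDS)
    return hits / len(words)

def should_call(message: str, threshold: float = 0.2) -> bool:
    """Decide whether to trigger a support call for this message."""
    return stress_score(message) >= threshold

print(should_call("I feel so stressed and alone today"))  # → True
print(should_call("Had a great day at the park"))         # → False
```

In the real app this decision would kick off a Dasha conversation, where the custom intents and digressions mentioned above take over the dialogue.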

Challenges we ran into

Using Dasha's text feature in the frontend.

Accomplishments that we're proud of

Finishing the project within the given time.

What we learned

How to use Dasha AI's call features.

What's next for Dashimotion

Implementing Dasha's text feature in the app, and exposing both the text and call features directly from the web app to simplify the process for users.
