Table Number

358

Inspiration

As international students, many of us on the TheraAI team have traveled thousands of miles to build a future for ourselves. In the process, we often become so busy that we can't spend as much time with our parents, who miss us deeply as they grow older. This personal experience inspired us to create a solution using what we do best: coding. We wanted to help people like our parents, who need constant companionship and emotional support.

What it does

TheraAI enables users to interact with an AI companion that uses advanced facial and audio recognition technologies to assess their emotions and respond accordingly, improving their mood over time.

How we built it

On the frontend, we used React and Streamlit for webpage creation and data visualization, with interfaces designed in Figma. On the backend, Flask handled API requests, OpenCV captured video frames as visual input for the HUME API to identify emotions, and Pandas and NumPy processed the emotion data shown on the Streamlit dashboard.
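As a rough illustration of the dashboard side of this pipeline, the sketch below shows how per-frame emotion scores might be aggregated with Pandas before plotting. The emotion names, score values, and dict shape are hypothetical stand-ins for what the backend assembles from the HUME API response, not the actual API output.

```python
import pandas as pd

# Hypothetical per-frame emotion scores, shaped like the dicts our
# backend could assemble from an emotion-recognition response.
frames = [
    {"t": 0, "joy": 0.2, "sadness": 0.7, "calmness": 0.1},
    {"t": 1, "joy": 0.3, "sadness": 0.5, "calmness": 0.2},
    {"t": 2, "joy": 0.6, "sadness": 0.2, "calmness": 0.2},
    {"t": 3, "joy": 0.7, "sadness": 0.1, "calmness": 0.2},
]

df = pd.DataFrame(frames).set_index("t")

# Dominant emotion per frame, e.g. to label the live feed.
dominant = df.idxmax(axis=1)

# A short rolling mean smooths frame-to-frame jitter before charting.
smoothed = df.rolling(window=2, min_periods=1).mean()

print(dominant.tolist())
```

A Streamlit dashboard could then chart `smoothed` directly (e.g. with `st.line_chart`), while `dominant` drives a simple text readout of the user's current mood.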

Challenges we ran into

We faced challenges in integrating audio and visual inputs simultaneously, dynamically updating the dashboard with real-time data, and pipelining data between Streamlit and React.

Accomplishments that we're proud of

Two of our team members participated in their first hackathon; working on the front end, they rapidly picked up React and Figma to deliver the best user experience possible. We were also proud to use OpenCV in conjunction with the HUME API to detect emotions in real time. Finally, although none of us had met before, we worked together incredibly well.

What we learned

Some team members learned React while others learned Streamlit for the first time. We also learned how to smoothly integrate OpenCV and the HUME API, and how to connect the backend to the frontend.

What's next for TheraAI - Personal Therapist

Our next steps include addressing the challenges we faced and deploying TheraAI to the public to support the emotional well-being of elderly individuals.
