What it does

PulseAI is a web-based computer vision app that analyzes children's drawings for emotional cues such as anger, fear, sadness, or happiness. Using image classification, it flags strong emotions so an adult can step in appropriately when a child signals a need for emotional support.

How we built it

We trained a Convolutional Neural Network (CNN) on a dataset of children's drawings labeled by emotion. The app uses:

Python

TensorFlow/Keras for the model

Streamlit for the user interface

OpenCV for feature extraction from images
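A minimal sketch of what such a CNN can look like in Keras. The layer sizes, the 64×64 single-channel input, and the four-class softmax output are illustrative assumptions, not the exact architecture we shipped:

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_model(input_shape=(64, 64, 1), num_classes=4):
    """Hypothetical CNN for classifying emotions in children's drawings.

    Two conv/pool blocks followed by a small dense head; the real
    PulseAI architecture may differ in depth and layer sizes.
    """
    model = keras.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(num_classes, activation="softmax"),  # one score per emotion
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```

The model is then fit on the labeled drawings with `model.fit(...)` and saved for the Streamlit app to load at startup.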

Challenges we faced

Dataset collection and quality

Verifying correct emotion labeling

Making the app user-friendly for non-technical users

Managing image preprocessing and model input compatibility
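The preprocessing-and-compatibility challenge boils down to getting every uploaded drawing into the exact shape and value range the model expects. A minimal sketch, written in plain NumPy rather than OpenCV so it stays self-contained (the real app would use `cv2.cvtColor` and `cv2.resize`); the 64×64 grayscale target and the function name are illustrative assumptions:

```python
import numpy as np

def preprocess_drawing(img: np.ndarray, size: int = 64) -> np.ndarray:
    """Convert an RGB drawing array to a model-ready input.

    Hypothetical pipeline: grayscale -> nearest-neighbour resize
    -> scale pixel values to [0, 1] -> add a channel axis.
    """
    # Luminance-weighted grayscale conversion
    gray = img[..., :3] @ np.array([0.299, 0.587, 0.114])
    # Nearest-neighbour resize to size x size
    h, w = gray.shape
    rows = np.arange(size) * h // size
    cols = np.arange(size) * w // size
    resized = gray[rows[:, None], cols]
    # Normalize to [0, 1] and add the channel axis the CNN expects
    return (resized / 255.0)[..., None].astype(np.float32)
```

Keeping this step in one function made it easier to apply the identical transform at training and inference time, which is where most compatibility bugs crept in.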

Achievements

Successfully trained an image classification model

Deployed a working Streamlit app with an intuitive UI

Integrated smart interpretation of results for parents
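The parent-facing interpretation layer can be sketched as a simple threshold rule on the classifier's softmax output: flag the drawing when the top emotion is negative and the model is confident. The label order, the 0.6 threshold, and the guidance texts below are illustrative assumptions, not PulseAI's actual values:

```python
import numpy as np

EMOTIONS = ["anger", "fear", "sadness", "happiness"]  # illustrative label order

GUIDANCE = {  # hypothetical parent-facing messages
    "anger": "The drawing suggests frustration; consider a calm check-in.",
    "fear": "The drawing suggests anxiety; gentle reassurance may help.",
    "sadness": "The drawing suggests sadness; an open conversation may help.",
    "happiness": "The drawing suggests a positive mood.",
}

def interpret_prediction(probs, threshold=0.6):
    """Turn a softmax vector into a parent-friendly result.

    Returns (label, confidence, flagged): `flagged` is True when the
    top emotion is negative and above the confidence threshold --
    the cue for an adult to check in with the child.
    """
    probs = np.asarray(probs, dtype=float)
    idx = int(probs.argmax())
    label = EMOTIONS[idx]
    confidence = float(probs[idx])
    flagged = label != "happiness" and confidence >= threshold
    return label, confidence, flagged
```

For example, `interpret_prediction([0.05, 0.1, 0.7, 0.15])` returns `("sadness", 0.7, True)`, and the Streamlit UI can then display `GUIDANCE[label]` alongside the confidence.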

What's next

Add multilingual support

Provide a chat-based parent assistant (powered by LLMs)

Create PDF reports with interpretation and recommended next steps

Built With

python · tensorflow · keras · streamlit · opencv
