Inspiration
Caregiving relationships often rely on verbal communication, yet many care recipients struggle to clearly express how they feel — whether due to emotional fatigue, cognitive load, stress, or simply not wanting to burden their caregiver.
We were inspired by the idea that well-being is multi-dimensional: how someone says they feel, how they appear emotionally, and how their body responds physiologically can all tell different parts of the same story. CareConnect was built to bridge this gap by giving caregivers a more complete, respectful, and human-centered understanding of the people they care for.
What it does
CareConnect is a web-based platform that enables multimodal emotional and health tracking between a care recipient and a caregiver.
Recipients can express how they feel through:
- simple emoji-based self-reporting,
- real-time facial emotion recognition via webcam, and
- heart rate monitoring (currently simulated, designed for Fitbit integration).
Caregivers receive this information automatically through a calendar-style dashboard, where emotional check-ins appear alongside scheduled tasks. This allows caregivers to quickly identify emotional patterns, potential stress periods, and moments where additional support may be needed — without requiring constant verbal check-ins.
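The simulated heart-rate modality mentioned above can be sketched as a small stand-in generator. This is an illustrative sketch, not the actual CareConnect code: the function name, `base_bpm`/`jitter` parameters, and field names are all assumptions.

```python
import random
import time

def simulated_heart_rate(base_bpm: int = 72, jitter: int = 6) -> dict:
    """Return a plausible resting heart-rate sample.

    A placeholder for the planned Fitbit integration: produces readings
    in a realistic band around base_bpm and tags them as simulated so
    the dashboard can distinguish them from real device data.
    """
    bpm = base_bpm + random.randint(-jitter, jitter)
    return {"bpm": bpm, "timestamp": time.time(), "source": "simulated"}
```

Tagging each sample with its `source` keeps the caregiver view honest about where a number came from, which matters once real Fitbit data is mixed in.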
How we built it
We built CareConnect as a full-stack web application:
- Frontend: Vanilla HTML, CSS, and JavaScript for clarity, accessibility, and low cognitive load.
- Backend: Flask (Python) providing RESTful APIs for emotions, tasks, and physiological data.
- Computer Vision: OpenCV for face detection and DeepFace with TensorFlow/Keras for facial emotion recognition.
- Architecture: A modular design separating emotion analysis, physiological monitoring, and calendar/task management.
- Design: Privacy-first, minimalist UI with clear visual distinctions between emotion sources.
The system integrates multiple input modalities into a single caregiver-facing timeline, emphasizing interpretability over raw data.
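The merge of modalities into a single caregiver-facing timeline can be sketched roughly as below. The function and field names are hypothetical, chosen for illustration; the real prototype's data shapes may differ.

```python
from typing import Iterable

def build_timeline(emotions: Iterable[dict],
                   heart_rates: Iterable[dict],
                   tasks: Iterable[dict]) -> list[dict]:
    """Merge per-modality entries into one chronological timeline.

    Each entry is tagged with its source so the UI can visually
    distinguish self-reports, physiological data, and scheduled tasks
    (hypothetical sketch; "time" is any sortable timestamp).
    """
    entries = (
        [{"source": "self-report", **e} for e in emotions]
        + [{"source": "heart-rate", **h} for h in heart_rates]
        + [{"source": "task", **t} for t in tasks]
    )
    return sorted(entries, key=lambda e: e["time"])
```

Sorting a single tagged list, rather than exposing three separate feeds, is what lets the calendar show emotional check-ins alongside tasks in one view.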
Challenges we ran into
- Environment compatibility: Many ML libraries are sensitive to Python versions and system dependencies, which made deployment non-trivial.
- Balancing privacy and insight: Designing facial emotion recognition that provides useful signals while ensuring no images are stored or transmitted.
- UX simplicity: Making the recipient interface intuitive enough to use during emotional distress without overwhelming the user.
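The privacy constraint in the second challenge can be illustrated with a minimal sketch: the frame exists only in memory, and only the derived label and confidence leave the function. The `analyzer` callable stands in for the DeepFace call used in the prototype; all names here are assumptions.

```python
from typing import Callable

def analyze_frame(frame: bytes,
                  analyzer: Callable[[bytes], tuple[str, float]]) -> dict:
    """Run emotion inference on an in-memory frame, returning only the result.

    The raw frame is never written to disk or transmitted; once this
    function returns, the frame goes out of scope and nothing persists.
    `analyzer` is a stand-in for the real emotion-recognition call.
    """
    label, confidence = analyzer(frame)
    return {"emotion": label, "confidence": round(confidence, 2)}
```

Keeping the inference boundary this narrow makes the "no images stored or transmitted" guarantee easy to audit, whatever model sits behind `analyzer`.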
Accomplishments that we're proud of
- Successfully integrating three different emotional/physiological modalities into a single coherent system.
- Designing a caregiver calendar interface that translates complex data into actionable insight.
- Implementing real-time facial emotion recognition with confidence scoring while maintaining a privacy-first approach.
- Building an end-to-end prototype that demonstrates both technical depth and real-world caregiving relevance.
- Clearly separating recipient and caregiver experiences to respect autonomy and dignity.
What we learned
- Multimodal systems are as much a design challenge as a technical one — how data is presented matters more than how much data is collected.
- ML-powered features must be engineered with deployment constraints in mind from day one.
- In caregiving contexts, trust, consent, and simplicity are just as important as accuracy.
- Sometimes the most impactful innovation comes from connecting existing technologies rather than inventing entirely new models.
What's next for CareConnect
- Replace simulated physiological data with full Fitbit API integration and real-time syncing.
- Add persistent storage, authentication, and support for multiple caregiver–recipient pairs.
- Introduce analytics to detect emotional trends and correlations over time.
- Explore on-device or browser-based emotion inference to improve responsiveness.
- Expand accessibility features and adapt the system for elder care, mental health support, and assisted living contexts.
- Develop AI-assisted caregiver insights to suggest proactive, personalized interventions.