Inspiration

ReMind was inspired by the growing need for compassionate and accessible tools that support cognitive wellness across all stages of life. Memory loss and cognitive challenges affect older adults, individuals recovering from head trauma, people with disabilities, and communities that are often underserved by traditional health technology.

Dementia and cognitive decline disproportionately affect people of color, yet many existing solutions are clinical, intimidating, or inaccessible. We wanted to create something that feels human and inclusive, focusing on memory, identity, and emotional connection rather than diagnosis.

ReMind treats cognitive wellness as a lifelong journey and aims to support users with dignity, empathy, and accessibility at its core.

What it does

ReMind is an accessible, AI-powered companion app that helps people preserve meaningful memories and strengthen cognitive wellness.

Users can:

  • Capture memories using voice, text, or photos
  • Complete daily brain exercises designed to improve recall and focus
  • Perform gentle cognitive check-ins that track trends over time
  • Receive personalized prompts and summaries
  • Share progress with trusted caregivers or family members if desired

ReMind supports older adults, individuals with disabilities or head trauma, young adults training their memory, and anyone interested in long-term brain health.

How we built it

ReMind was designed as an iOS-first application that integrates deeply with the Apple ecosystem.

The core experience is built for iPhone and iPad, with planned Apple Watch integration for lightweight daily check-ins, reminders, and cognitive prompts. This enables short, low-effort interactions throughout the day.

We integrated with Apple Health to support cognitive wellness tracking alongside other health metrics. All health data is opt-in and focused on awareness rather than diagnosis.
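As a sketch of the opt-in model, Apple Health access can be requested explicitly before any data is read or written. The mindful-session category below is an illustrative assumption; the actual HealthKit types depend on what ReMind records:

```swift
import HealthKit

// Request explicit, opt-in authorization before touching any health data.
// .mindfulSession is an illustrative choice; the real app may use other types.
func requestHealthAccess(completion: @escaping (Bool) -> Void) {
    guard HKHealthStore.isHealthDataAvailable(),
          let mindful = HKObjectType.categoryType(forIdentifier: .mindfulSession) else {
        completion(false)  // Health data unavailable on this device
        return
    }
    let store = HKHealthStore()
    store.requestAuthorization(toShare: [mindful], read: [mindful]) { granted, _ in
        completion(granted)
    }
}
```

Because authorization is requested per data type, nothing is tracked unless the user explicitly agrees, which keeps the focus on awareness rather than diagnosis.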

ReMind uses the iOS notification system to deliver gentle reminders that encourage consistency without pressure. Accessibility is supported through native Apple features such as VoiceOver compatibility, dynamic text sizing, haptic feedback, and high-contrast design support.
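A gentle-reminder flow could look roughly like the following UserNotifications sketch; the 9 a.m. time, identifier, and wording are placeholder assumptions, not the shipped values:

```swift
import UserNotifications

// Schedule one gentle, repeating daily check-in reminder.
// The hour, identifier, and copy below are illustrative placeholders.
func scheduleDailyCheckIn() {
    let center = UNUserNotificationCenter.current()
    center.requestAuthorization(options: [.alert, .sound]) { granted, _ in
        guard granted else { return }

        let content = UNMutableNotificationContent()
        content.title = "ReMind"
        content.body = "When you're ready, a short memory check-in is waiting."

        var date = DateComponents()
        date.hour = 9  // one reminder per day, consistency without pressure
        let trigger = UNCalendarNotificationTrigger(dateMatching: date, repeats: true)

        center.add(UNNotificationRequest(identifier: "daily-check-in",
                                         content: content,
                                         trigger: trigger))
    }
}
```

A single calendar-based trigger avoids the pattern of frequent, escalating nudges, matching the "encourage consistency without pressure" goal.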

AI personalization is powered by the Gemini API, which generates reflective prompts, summarizes memory themes, and adapts cognitive check-ins over time. ElevenLabs enables natural, calming voice interactions for memory capture and guidance.

Challenges we ran into

One major challenge was balancing accessibility with meaningful cognitive engagement. Activities needed to be stimulating without causing anxiety or frustration.

We also had to integrate AI carefully so insights felt supportive rather than intrusive. Memory data is deeply personal, so transparency and user control were essential.

Designing for users across different ages, cultures, and cognitive abilities required multiple iterations to ensure inclusivity without oversimplifying the experience.

Accomplishments that we're proud of

  • Built an accessibility-first cognitive wellness app aligned with the Disabilities @ Microsoft: Best Accessibility Hack category
  • Created an inclusive experience that addresses the disparities highlighted by the Best Minority Hack category
  • Implemented voice-first interaction to reduce barriers
  • Integrated AI in a calm, human-centered way
  • Designed a scalable foundation for long-term cognitive trend tracking

What we learned

We learned that accessibility is about empathy as much as interface design. Cognitive tools must consider how users feel, not just how they perform.

Voice-based interaction significantly lowers barriers for older adults and individuals recovering from neurological injuries. We also learned that AI is most effective when it quietly supports users rather than taking control.

Inclusive design leads to better experiences for everyone.

What's next for ReMind

Next, we plan to:

  • Expand Apple Watch functionality for passive and lightweight interactions
  • Introduce multilingual voice support for greater cultural inclusivity
  • Add adaptive difficulty for cognitive exercises
  • Provide deeper long-term cognitive trend insights
  • Enable secure, opt-in data sharing with clinicians or caregivers

We also plan to track cognitive trends over time using a simple aggregated score:

$$ C_t = \frac{1}{n} \sum_{i=1}^{n} s_i $$

Where \( C_t \) is the average cognitive score at time \( t \), \( n \) is the number of tasks in the check-in, and \( s_i \) is each task score. This helps track long-term changes without diagnosing users.
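A direct translation of this formula, as a minimal sketch:

```swift
/// Average cognitive score for one check-in: C_t = (1/n) * Σ s_i.
/// Returns nil when the check-in contains no tasks (n = 0).
func cognitiveScore(_ taskScores: [Double]) -> Double? {
    guard !taskScores.isEmpty else { return nil }
    return taskScores.reduce(0, +) / Double(taskScores.count)
}
```

Returning nil for an empty check-in avoids a divide-by-zero and lets the caller skip that day rather than record a misleading zero.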

Built With

  • iOS (iPhone, iPad, planned Apple Watch)
  • Apple Health
  • Gemini API
  • ElevenLabs
