Inspiration

Mental and physical health matter to all of us. Problems with either can severely impact people's quality of life, and maintaining good health can help offset disease as we age.

Fortunately, we now have powerful tools to assist us: AI, data platforms, and educational resources. MediMind combines all three into a useful health assistant.

What it does

MediMind combines AI, Infermedica's API, and YouTube into an assistant that can listen to people, help diagnose their problems, assist and inspire them, and also recommend relevant YouTube videos.

The AI connects to Infermedica's diagnosis API to walk users through an interview process that assesses their problems.
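A minimal sketch of one turn of that interview loop against Infermedica's `/v3/diagnosis` endpoint. The symptom ID (`s_21`), patient details, and credentials below are placeholders, not MediMind's actual data:

```typescript
// One turn of Infermedica's interview loop (v3 /diagnosis endpoint).
// Each response contains the next question to ask and a should_stop flag.

interface Evidence {
  id: string;                                   // Infermedica concept ID, e.g. "s_21"
  choice_id: "present" | "absent" | "unknown";  // the user's answer
}

interface DiagnosisRequest {
  sex: "male" | "female";
  age: { value: number };
  evidence: Evidence[];
}

// Build the JSON body for one /diagnosis call from the evidence gathered so far.
function buildDiagnosisRequest(
  sex: "male" | "female",
  age: number,
  evidence: Evidence[],
): DiagnosisRequest {
  return { sex, age: { value: age }, evidence };
}

// Send the request with the app credentials Infermedica issues per account.
async function diagnose(body: DiagnosisRequest): Promise<unknown> {
  const res = await fetch("https://api.infermedica.com/v3/diagnosis", {
    method: "POST",
    headers: {
      "App-Id": "YOUR_APP_ID",    // placeholder credential
      "App-Key": "YOUR_APP_KEY",  // placeholder credential
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  return res.json();
}
```

Each answer the user gives is appended to `evidence` and the endpoint is called again, until the response signals the interview should stop.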

Recommended videos range from inspiring or calming music to educational clips; they are suggested during chats and driven by the user's requests.
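A hedged sketch of how such a recommendation lookup might be issued through the YouTube Data API v3 search endpoint. The query terms, key, and parameter choices below are illustrative assumptions:

```typescript
// Build a YouTube Data API v3 search URL for video recommendations.
const YT_SEARCH = "https://www.googleapis.com/youtube/v3/search";

// Keyword search restricted to embeddable videos, so results can be
// played inline during a chat.
function buildSearchUrl(query: string, apiKey: string, maxResults = 5): string {
  const params = new URLSearchParams({
    part: "snippet",          // return title, description, thumbnails
    type: "video",            // required when filtering on videoEmbeddable
    videoEmbeddable: "true",
    q: query,
    maxResults: String(maxResults),
    key: apiKey,              // placeholder API key supplied by the caller
  });
  return `${YT_SEARCH}?${params.toString()}`;
}
```

The assistant would derive `query` from the chat (e.g. "calming piano music") and present the returned video titles and links to the user.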

How we built it

MediMind was built with:

  • TypeScript (programming language)
  • Next.js (web framework)
  • PostgreSQL (database + vector store)
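Assuming the vector store is PostgreSQL with the pgvector extension (the writeup doesn't say which extension is used), a nearest-neighbour lookup might be sketched like this; the table and column names are hypothetical, not MediMind's actual schema:

```typescript
// Render a query embedding as a pgvector literal, e.g. "[0.1,0.2,0.3]",
// suitable for binding as a parameter in a SQL query.
function toVectorLiteral(embedding: number[]): string {
  return `[${embedding.join(",")}]`;
}

// Parameterised SQL for the top-k most similar rows by cosine distance
// (pgvector's <=> operator). "documents"/"embedding" are assumed names.
const SIMILARITY_SQL = `
  SELECT id, content
  FROM documents
  ORDER BY embedding <=> $1::vector
  LIMIT $2;
`;
```

The literal from `toVectorLiteral` would be bound as `$1` and the desired result count as `$2` via any PostgreSQL client.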

APIs:

  • GPT-5-mini and GPT-5-nano LLMs.
  • Infermedica's Engine APIs.
  • YouTube Data API v3 (video search).

Open-source libraries (which I wrote):

These libraries aren't yet published as npm packages, so they are vendored directly in the MediMind repo.

Challenges we ran into

I only found out about the hackathon on Wednesday night (20th Aug), which left very little time before an already tight deadline.

Accomplishments that we're proud of

  • Building a useful AI assistant that seamlessly integrates with both Infermedica and YouTube APIs.
  • Completing the project in time for the deadline.

What we learned

  • APIs can greatly augment what AI can offer users.

What's next for MediMind

  • I'd like to release the software to the general public to gather feedback.
