Inspiration

We put our heads together to think about how we could harness the rising power of AI language models in an application of our own. Daniel's dad had actually worked on a mobile pharmaceutical app a decade prior, which we decided was prime material for AI integration, since ChatGPT itself cannot prescribe medicine or provide official medical advice.

What it does

MedTalk provides users with a simple platform to get the medical answers they need. Our friendly chatbot can answer any medical inquiry the user might have, and to account for the possibility of error it graciously provides cross-references for its lengthier answers. For further information on treatment options, a separate page lets the user search a database of drug information covering a variety of medications.

How we built it

The project boils down to a user interface and an AI language endpoint. The UI was built with SvelteKit and styled with Tailwind, combining various components under a light-blue color scheme for a sleek look. The Go API processes the user's questions, then generates insightful responses with a language model that we trained ourselves on public medical data with the help of Cohere. The search feature is implemented entirely on the front end: a scraped shortlist of common drugs and their side effects, plus a simple script to serve them.

Challenges we ran into

From the very beginning, our journey was riddled with obstacles. Coming up with an idea took several hours of thinking, and committing to MedTalk was a leap of faith. Training our AI language model meant hunting for pools of public medical data, which was not simple: we wanted to use the DrugBank database, only to discover that its public CSV export is locked behind an application process. We got around this by manually scraping and querying drugs.com. On the backend, we were getting rate-limited, but Cohere worked with us to navigate around the problem for our API. The front-end development process was thankfully relatively smooth, and while we had some initial trouble overcoming CORS policies and attaching our backend endpoint to the website, we pulled through in the end.

Accomplishments that we're proud of

  • Training a custom AI language model based on Cohere's framework in <24 hours
  • Scouring the web for data specific to medical and pharmaceutical concerns
  • Designing a sleek UI and UX with simplicity as the key to the design
  • Programming the chatbot interface and message scrolling animations
  • Attaching a homegrown backend to an attractive frontend to complete the product

What we learned

Before we walked through Aspiration Dome's doors yesterday, Ryan had never done a hackathon, and Daniel & Ryan arrived together while Mia came solo, so the experience was figuring things out on the fly from the very beginning. We formed a team while looking for a seat, and after hours of brainstorming we took the leap of faith and began designing the application structure. We learned how to coordinate parallel frontend and backend development, along with the importance of taking calculated risks and communicating with one another. We strengthened our understanding of UI design and SvelteKit, and polished our styling skills in Tailwind. We scoured the internet for data, came to appreciate the importance of free information, and worked to develop our language processing endpoint. We brought these pieces together into a formidable final product, and we had a ton of fun along the way!

What's next for MedTalk

MedTalk is theoretically a complete proof-of-concept, but in the future we would like to see it gain:

  • A more robust data set, from DrugBank for example, to improve our language model and search capabilities
  • Proper deployment of our language model on a full-time server
  • User security when interacting with the chatbot

Built With

go · svelte · tailwindcss · cohere