Inspiration
During medical appointments, the doctor or nurse takes notes as the patient talks about their problems, but there is no record of the conversation apart from whatever notes they may have written. Things may be missed or forgotten by both parties, especially if there are barriers to communication such as a disability or a language mismatch.
What it does
Health Chat records conversations between medical professionals and their patients and provides further information by running AI analysis on what is said. Doctors, nurses, and patients each see a different display in the app based on how much access they require. Doctors, for example, can see a patient's symptoms and possible diagnoses (with detailed clinical information) derived from those symptoms combined with the patient's age and gender. Patients, on the other hand, get simplified information that is easy for them to understand.
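The role-based displays described above can be sketched as a simple filter over the diagnosis data. All type and function names here are illustrative, not the actual Health Chat implementation:

```swift
// Sketch of role-based information filtering (hypothetical names).
enum Role {
    case doctor, nurse, patient
}

struct Diagnosis {
    let name: String
    let clinicalDetail: String   // detailed information for clinicians
    let plainSummary: String     // simplified text for patients
}

/// Returns the text a given user role should see for a diagnosis.
func displayText(for diagnosis: Diagnosis, role: Role) -> String {
    switch role {
    case .doctor, .nurse:
        return "\(diagnosis.name): \(diagnosis.clinicalDetail)"
    case .patient:
        return "\(diagnosis.name): \(diagnosis.plainSummary)"
    }
}
```

Keeping the filtering in one function makes it easy to audit exactly what each role is allowed to see.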
How we built it
We created an iOS app in which users register with their role (doctor/nurse/patient) and, if they are a patient, their personal information (age/gender). We store all data in Firebase, a real-time database, so that conversations appear live on every device of the users involved. Using the Houndify API, we turn speech into text and then analyze it to determine what counts as a "symptom." We then pass the symptoms into Priaid's symptom checker API to get more information on each one and to come up with possible diagnoses for the patient. With Microsoft Cognitive Services, we track the sentiment of the patient's speech over time to see how they are feeling.
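The symptom-extraction step in the pipeline above (transcript in, symptom list out, before the call to the symptom-checker API) could look roughly like this. The keyword list and function name are assumptions for illustration, not the production code:

```swift
import Foundation

// Hypothetical vocabulary of symptom keywords to scan for in a transcript.
let knownSymptoms: Set<String> = ["headache", "fever", "nausea", "cough", "fatigue"]

/// Extracts symptom keywords from a transcribed utterance, preserving
/// the order in which they were spoken and dropping duplicates.
func extractSymptoms(from transcript: String) -> [String] {
    let words = transcript.lowercased()
        .components(separatedBy: CharacterSet.alphanumerics.inverted)
    var seen = Set<String>()
    var found: [String] = []
    for word in words where knownSymptoms.contains(word) && !seen.contains(word) {
        seen.insert(word)
        found.append(word)
    }
    return found
}
```

Each extracted symptom would then be sent to the symptom-checker API along with the patient's age and gender.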
Challenges we ran into
Natural language processing was challenging, particularly determining the intent of a given part of the conversation, such as whether or not it is a question.
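To make the difficulty concrete, here is a minimal heuristic for the question-detection part of the intent problem, checking only punctuation and leading interrogative words. Real NLP handles far more (spoken questions often lack question marks and standard word order), which is exactly why this was hard:

```swift
import Foundation

// Leading words that usually signal a question in English.
let questionStarters: Set<String> = [
    "what", "why", "how", "when", "where", "who",
    "do", "does", "did", "is", "are", "can", "could", "should"
]

/// A naive guess at whether an utterance is a question, based on a
/// trailing "?" or an interrogative first word.
func looksLikeQuestion(_ utterance: String) -> Bool {
    let trimmed = utterance.trimmingCharacters(in: .whitespacesAndNewlines)
    if trimmed.hasSuffix("?") { return true }
    let firstWord = trimmed.lowercased()
        .components(separatedBy: " ").first ?? ""
    return questionStarters.contains(firstWord)
}
```

A heuristic like this misclassifies indirect questions ("Tell me where it hurts") and rhetorical statements, so a proper intent classifier is needed for conversational speech.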
Accomplishments that we're proud of
The app is designed to be fully accessible, so patients with disabilities can easily use it to communicate with their doctors and nurses. It supports a screen reader and dynamic font sizes for elderly or visually impaired users, dictation for people with physical disabilities who cannot type, and keyboard input for those who cannot speak.
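The dictation-versus-keyboard choice above can be sketched as a small policy function. The types here are hypothetical stand-ins; on iOS the real signals would come from system accessibility settings (e.g. VoiceOver status and Dynamic Type):

```swift
// Hypothetical model of a user's communication abilities.
struct AccessibilityNeeds {
    let canSpeak: Bool
    let canType: Bool
}

enum InputMode { case dictation, keyboard }

/// Prefers dictation for users who cannot type but can speak,
/// and falls back to the keyboard otherwise.
func preferredInputMode(for needs: AccessibilityNeeds) -> InputMode {
    if !needs.canType && needs.canSpeak { return .dictation }
    return .keyboard
}
```

Defaulting to the keyboard keeps the app usable for users who can neither speak comfortably nor rely on dictation accuracy.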
What's next for Health Chat
Next, we hope to encrypt the data and give patients control over which data can be stored, for greater privacy. We also plan to add real-time translation: if the patient speaks Spanish, for example, the dictation will record their speech and display it in English so an English-speaking doctor can understand them almost instantaneously. In return, the information the patient receives about their diagnosis will be in Spanish so they know how to proceed.
Built With
- firebase
- google-cloud
- houndify
- microsoft-cognitive-services
- swift