Currently, getting medical information from your healthcare provider is intimidating and can be a real pain for many people. Hoping to solve this problem, we created an app that makes the process easy and natural while also providing comprehensive information on personal health.
What it does
hospital.ity "makes the hospital hospitable" – it enables people from all walks of life to retrieve information from their healthcare providers and inform themselves about their personal health with ease. Users can access everything from blood oxygen levels to prescriptions to medical encounter history. Furthermore, with its chatbot interface, hospital.ity is extremely user-friendly, allowing people to find the information they want with a simple question.
How we built it
We built the app in Objective-C, queried the Wolfram Alpha and Human APIs for personal wellness and medical data via REST calls, and trained the natural language processing (NLP) model using wit.ai. Using image recognition, the app can also provide nutritional information for food in a photograph.
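As a sketch of the data-retrieval side, here is roughly how an authenticated REST call and its JSON response might be handled. The endpoint path, resource names, and response shape below are illustrative assumptions, not the documented Human API surface:

```python
import json
from urllib.request import Request

# Hypothetical base path and token -- illustrative assumptions,
# not the real Human API endpoints.
API_BASE = "https://api.humanapi.co/v1/human"

def build_request(resource: str, access_token: str) -> Request:
    """Build an authenticated GET request for a wellness resource
    (e.g. "blood_oxygen" or "medications")."""
    return Request(
        f"{API_BASE}/{resource}",
        headers={"Authorization": f"Bearer {access_token}"},
    )

def latest_reading(payload: str) -> dict:
    """Pick the most recent measurement out of a JSON array of readings."""
    readings = json.loads(payload)
    return max(readings, key=lambda r: r["timestamp"])

# Example with a canned response body instead of a live call:
sample = ('[{"timestamp": "2016-01-16T09:00:00Z", "value": 97},'
          ' {"timestamp": "2016-01-17T09:00:00Z", "value": 98}]')
print(latest_reading(sample)["value"])
```

The chatbot then formats values like this into a conversational reply rather than showing raw JSON to the user.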
Challenges we ran into
We actually changed our focus halfway through the hackathon, from food and nutrition to overall health, after discovering the Human API. Because we were unfamiliar with the API, which retrieves medical information on the user's behalf, we initially struggled with the authentication required to access user data. We also had difficulty handling the dozens of possible queries in wit.ai: each potential question needed many NLP training cases, each of which could take considerable time to learn, before the model responded reliably to conversational speech. And of course, parsing the API data and producing the proper response on screen was a challenge of its own.
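The authentication hurdle boiled down to the OAuth2-style exchange that health-data APIs typically require: after the user grants access, the app trades an authorization code for an access token. A minimal sketch of assembling that exchange, with a placeholder endpoint and credentials rather than the real Human API values:

```python
from urllib.parse import urlencode

# Placeholder token endpoint -- not the real Human API URL, just the
# shape of an OAuth2 authorization-code exchange.
TOKEN_URL = "https://auth.example-health-api.com/oauth/token"

def token_request_body(client_id: str, client_secret: str, code: str) -> str:
    """Assemble the form-encoded body that exchanges an authorization
    code (obtained after the user grants access) for an access token."""
    return urlencode({
        "client_id": client_id,
        "client_secret": client_secret,
        "code": code,
        "grant_type": "authorization_code",
    })

body = token_request_body("my-app-id", "my-app-secret", "abc123")
print("grant_type=authorization_code" in body)
```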
Accomplishments that we're proud of
Making the process of finding health information friendly and accessible for the average person. On the technical side, building robust intent identification with NLP and writing the algorithms that parse API data into chatbot responses.
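The intent-identification and parsing work can be sketched as a dispatch from a wit.ai-style NLP result to a data handler. The intent names, handler replies, and payload shape here are our own illustrative assumptions, not wit.ai's exact output format:

```python
# Each handler turns parsed health data into a conversational reply;
# the canned strings stand in for real parsed API values.

def handle_blood_oxygen(entities: dict) -> str:
    return "Your latest blood oxygen level is 98%."

def handle_medications(entities: dict) -> str:
    return "You have 2 active prescriptions."

HANDLERS = {
    "get_blood_oxygen": handle_blood_oxygen,
    "get_medications": handle_medications,
}

def respond(nlp_result: dict) -> str:
    """Route the highest-confidence recognized intent to its handler,
    falling back to a clarification prompt for unknown intents."""
    intents = sorted(nlp_result.get("intents", []),
                     key=lambda i: i["confidence"], reverse=True)
    for intent in intents:
        handler = HANDLERS.get(intent["name"])
        if handler:
            return handler(nlp_result.get("entities", {}))
    return "Sorry, I didn't catch that. Could you rephrase?"

print(respond({"intents": [{"name": "get_blood_oxygen", "confidence": 0.93}]}))
```

Keeping the handlers separate from the NLP layer is what let us add trained queries incrementally as training cases matured.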
What we learned
We learned how to train an NLP model via semantic analysis and how to connect it to multiple APIs to create a functioning chatbot.
What's next for hospital.ity
We didn't have time to build handlers for every query we trained in wit.ai, so there's definitely more to add there. Beyond implementing more queries and strengthening the NLP model, hospital.ity could let users perform actions such as scheduling appointments. This would ease users' interactions with their healthcare providers even more, furthering our goal of making personal health approachable and understandable for all.