Not everyone has the latest smartwatch with a heart rate sensor, but almost everyone has a webcam and a microphone.
In a digital world reshaped by COVID-19, telemedicine is more necessary than ever. The pandemic has caused an influx of hospital cases, and the limited supply of hospital beds has people wondering whether their symptoms are severe enough to warrant a doctor's appointment. Meanwhile, others experience ailments but cannot afford a visit to the doctor because they have poor health coverage or none at all. We urgently need free, practical telehealth tools for staying on top of our medical needs.
What is Checkup?
Checkup is an app that allows users to regularly check up on their physical and mental well-being without needing to visit a doctor.
We analyze your health using just two inputs: your audio and your video. We perform all of the following to get a complete read on your health:
- Video Blood Pulse Scan
- Video Facial Recognition
- Video Emotion Detection
- Audio-to-Text Sentiment Analysis
- Symptom Matching
A further explanation of how each of these works can be found in the Engineering section below.
For user privacy, we also encrypt all data sent over the internet and store none of it. We have no database, so users can feel completely safe using our product.
1. Go through onboarding for a guide on how to use Checkup.
2. Answer questions while we perform live analysis on your audio and video.
3. Take a look at your results! We provide an Overall Health Score, Mental Health Score, Physical Health Score, Heart Rate, Emotion Analysis, and Illness Prognosis.
4. Get matched with a doctor near you! We use your location and your results to find the best medical expert for you.
UI / UX
Checkup was designed using the double diamond design process, a model popularized by the British Design Council. We divided our design process into its four stages: Discover, Define, Develop, and Deliver.
To really focus on our users, we completed as many steps of the design process as we could over the course of a weekend: we conducted need-finding, created user personas, mapped out the user flow, and designed both low-fidelity and high-fidelity prototypes. For more information, view the sub-sections below.
UI / UX Flow
Video Blood Pulse Scan
The Fourier Transform used to find a user's heart rate
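The core of the pulse scan can be sketched in a few lines. This is a minimal example, assuming we already have a per-frame trace of the average green-channel brightness of the face region (the function and parameter names are illustrative, not our production code): we remove the DC component, take the FFT, and pick the dominant frequency within the plausible human heart-rate band.

```python
import numpy as np

def estimate_heart_rate(signal, fps):
    """Estimate beats per minute from a mean-pixel-intensity trace.

    `signal` is the average green-channel brightness of the face
    region per frame; `fps` is the camera frame rate.
    """
    signal = np.asarray(signal, dtype=float)
    signal = signal - signal.mean()                 # remove DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    spectrum = np.abs(np.fft.rfft(signal))
    # Restrict to the plausible human range: 0.7-4.0 Hz (42-240 bpm)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak = freqs[band][np.argmax(spectrum[band])]
    return peak * 60.0                              # Hz -> bpm
```

In practice the raw brightness trace is noisy, so detrending and band-pass filtering before the FFT would improve robustness; the sketch keeps only the essential frequency-domain step.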
Video Facial Recognition
Video Emotion Detection
Emotion detection was also done using Face API. We used a custom open-source model that we found online to help us process users' emotions. We then extracted the detected emotions from the API and connected them to timers on the frontend, giving us a hash table that records how long the user displayed each emotion over the course of the video recording.
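The timer bookkeeping behind that hash table is simple. Here is a minimal sketch (in Python for illustration; our actual implementation lives in the frontend JavaScript), assuming the dominant emotion label arrives each time a frame is processed, with timestamps in seconds:

```python
from collections import defaultdict

class EmotionTimer:
    """Accumulate how long each emotion was displayed during a session."""

    def __init__(self):
        self.durations = defaultdict(float)  # emotion -> seconds
        self._last = None                    # (emotion, timestamp)

    def record(self, emotion, timestamp):
        # Attribute the elapsed time since the previous frame to the
        # emotion that was active during that interval.
        if self._last is not None:
            prev_emotion, prev_time = self._last
            self.durations[prev_emotion] += timestamp - prev_time
        self._last = (emotion, timestamp)

    def summary(self):
        return dict(self.durations)
```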
Audio-to-Text Sentiment Analysis
We handled audio-to-text using a combination of two methods. First, we utilized the Web Speech API, a relatively new, experimental browser technology. This gave us real-time audio-to-text parsing, which powered the Live Transcript. We then ran our own sentiment analysis algorithm on this text to determine whether a user's response to a question equated to a "yes" or a "no".
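The yes/no decision can be illustrated with a simple keyword-polarity sketch. This is not our exact algorithm; the word lists below are hypothetical and only show the general idea of scoring affirmative against negating terms in the transcript:

```python
# Hypothetical keyword lists -- the real algorithm may differ.
AFFIRM = {"yes", "yeah", "yep", "definitely", "often", "sometimes"}
NEGATE = {"no", "nope", "never", "not", "don't", "haven't", "rarely"}

def classify_response(transcript):
    """Map a free-form spoken answer to 'yes', 'no', or 'unclear'."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    score = sum(w in AFFIRM for w in words) - sum(w in NEGATE for w in words)
    if score > 0:
        return "yes"
    if score < 0:
        return "no"
    return "unclear"
```

An "unclear" result could prompt the app to repeat the question rather than guess.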
We matched symptoms using ApiMedic's Symptom Checker API. This was done in the backend using Python and Flask.
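A sketch of how the backend assembles such a request is shown below. The endpoint URL and parameter names are assumptions based on our reading of ApiMedic's public documentation and may not match exactly; the helper builds the query the Flask route would then send:

```python
import json

# Assumed ApiMedic endpoint -- verify against the official documentation.
APIMEDIC_URL = "https://healthservice.priaid.ch/diagnosis"

def build_diagnosis_request(symptom_ids, gender, year_of_birth, token):
    """Build the URL and query parameters for an ApiMedic diagnosis call.

    `symptom_ids` are ApiMedic's numeric symptom identifiers; the
    parameter names here are assumptions, not verified values.
    """
    params = {
        "symptoms": json.dumps(sorted(symptom_ids)),  # e.g. "[9, 15]"
        "gender": gender,                              # "male" / "female"
        "year_of_birth": year_of_birth,
        "token": token,
        "format": "json",
        "language": "en-gb",
    }
    return APIMEDIC_URL, params
```

Keeping the request construction in a pure helper like this makes it easy to test without hitting the network.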
A lot of challenging components went into building this project, and we made sure to take care of every detail.
The map component was created with Mapbox GL, with multiple Three.js layers overlaid on top and rendered using WebGL.
The backend was built using the Python framework Flask to make calls to various APIs and organize the data in a way that is presentable to the users. The backend also computes the health scores based on the symptom analysis, weighing "red flag" (or higher urgency) symptoms more heavily than others.
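The red-flag weighting can be sketched as follows. The weights and the saturation point are hypothetical placeholders, not our exact scoring model; the sketch only shows how higher-urgency symptoms pull the score down faster than ordinary ones:

```python
# Hypothetical weights -- the real scoring model may differ.
RED_FLAG_WEIGHT = 3.0
NORMAL_WEIGHT = 1.0

def physical_health_score(symptoms, red_flags):
    """Score from 0 (many urgent symptoms) to 100 (no symptoms).

    `symptoms` is the list of detected symptom names; `red_flags` is
    the set of names treated as higher-urgency.
    """
    if not symptoms:
        return 100.0
    penalty = sum(
        RED_FLAG_WEIGHT if s in red_flags else NORMAL_WEIGHT
        for s in symptoms
    )
    # Map the penalty onto 0-100, saturating at a penalty of 10.
    return max(0.0, 100.0 * (1 - min(penalty, 10.0) / 10.0))
```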
Our system architecture
What We Learned
This project was a particular achievement for us because it required more individual web pages than our typical hackathon projects do, which meant creating many additional front-end components to finish the project. We are also particularly proud that we were able to include such a vast array of features for the user, which we feel makes Checkup a genuinely serviceable form of telemedicine. Lastly, we think the potential impact of our project is a significant accomplishment in itself. In a digital age of remote day-to-day communication, this could really be a product that people find useful!
Checkup already has many useful features, but we'd love to expand the project further to handle direct 1:1 video calls with certified doctors! That way, our users could quickly discover any symptoms they are experiencing without needing to schedule an appointment, and then get prompt assistance from a doctor to analyze and confirm any ailments. Overall, we hope that one day this project can be widely used in the medical community as a practical and accurate resource for patients and doctors alike.