Not everyone has the latest smartwatch with a heart rate sensor, but almost everyone has a webcam and microphone.

In a digital world affected by COVID-19, telemedicine is more necessary than ever. The pandemic has caused an influx of hospital cases, and the limited supply of hospital beds has people wondering whether their symptoms are severe enough to warrant a doctor's appointment. Meanwhile, others experience ailments but cannot afford a visit to the doctor due to poor or nonexistent health coverage. We are in alarming need of free, practical telehealth tools for keeping on top of our medical needs.

Checkup logo

What is Checkup?

Checkup is an app that allows users to regularly check up on their physical and mental well-being without needing to visit a doctor.

We analyze your health using just two inputs - your audio and your video. We perform all of the following to get a complete read of your health.

  • Video Blood Pulse Scan
  • Video Facial Recognition
  • Video Emotion Detection
  • Audio-to-Text Sentiment Analysis
  • Symptom Matching

A further explanation of how each of these works can be found in the Engineering section below.

For user privacy, we also encrypt all data sent over the internet and store none of it. We have no database, so users can feel completely safe using our product.


How It Works
1. Go through onboarding for a guide on how to use Checkup. Step 1

2. Answer questions while we perform live analysis on your audio and video. Step 2

3. Take a look at your results! We provide an Overall Health Score, Mental Health Score, Physical Health Score, Heart Rate, Emotion Analysis, and Illness Prognosis. Step 3

4. Get a doctor recommendation near you! We use your location and your results to find the best medical expert for you. Step 4

Creation Process


Checkup was designed using the double diamond design process, a model popularized by the British Design Council. We divided our design process into four stages:

  • Discover
  • Define
  • Develop
  • Deliver

To really focus on our users, we completed as many steps of the design process as we could over the course of a weekend. We conducted need-finding, created user personas, mapped the user flow, and designed low-fidelity and high-fidelity prototypes. For more information, view the sub-sections below.

User Personas

User Personas

UI / UX Flow

User flow

Low-Fidelity Prototypes

Low-Fidelity Prototypes of all pages

Visual Design

Visual Design

High-Fidelity Prototypes

High-Fidelity Prototypes of all pages


Engineering
System Architecture

Data Analysis

Video Blood Pulse Scan

We get the user's heart rate by taking advantage of Mayer waves: oscillations in arterial pressure that occur in conscious subjects. Using these, we determine your heart rate by monitoring tiny fluctuations in the color of the forehead. This is done by taking the average pixel values of the forehead region and performing a Fourier Transform to convert the signal into a sum of frequencies, the most prominent of which corresponds to the user's heart rate. All of this runs on the frontend, making it tremendously fast; to our knowledge, we are the first to implement this on the web using JavaScript.
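
As a sketch of the idea (not our exact implementation), a naive discrete Fourier transform over the plausible heart-rate band is enough to pull the dominant frequency out of a series of average forehead pixel values:

```javascript
// Estimate heart rate from a time series of average forehead pixel values.
// Naive DFT sketch: search only the plausible heart-rate band of
// 0.7–4 Hz (42–240 bpm) and return the strongest bin, converted to bpm.
function dominantFrequencyBpm(signal, fps) {
  const n = signal.length;
  const mean = signal.reduce((a, b) => a + b, 0) / n;
  const centered = signal.map((v) => v - mean); // remove the DC component
  const kMin = Math.ceil((0.7 * n) / fps);
  const kMax = Math.floor((4.0 * n) / fps);
  let bestK = kMin;
  let bestPower = -Infinity;
  for (let k = kMin; k <= kMax; k++) {
    let re = 0;
    let im = 0;
    for (let t = 0; t < n; t++) {
      const angle = (2 * Math.PI * k * t) / n;
      re += centered[t] * Math.cos(angle);
      im -= centered[t] * Math.sin(angle);
    }
    const power = re * re + im * im;
    if (power > bestPower) {
      bestPower = power;
      bestK = k;
    }
  }
  return ((bestK * fps) / n) * 60; // bin index -> Hz -> beats per minute
}
```

For a 10-second window sampled at 30 fps containing a clean 1.2 Hz pulse, this picks out 72 bpm; a production version would use an FFT and a longer, windowed signal for finer frequency resolution.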

The Fourier Transform used to find a user's heart rate

Video Facial Recognition

Facial recognition is done using Face API, a JavaScript library that uses TensorFlow.js to run face recognition in the browser. We fed in a custom model to generate the information we wanted, extracted it, normalized the bounding box data, and drew it onto a canvas element. This canvas element was laid directly on top of a video element receiving the webcam feed, giving the appearance of bounding boxes following the user's face.
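
The bounding-box normalization step amounts to rescaling detection coordinates from the model's input resolution to the on-screen video size, so the canvas overlay tracks the face correctly. A minimal sketch, with illustrative field names:

```javascript
// Scale a detection bounding box from the resolution the model saw
// (srcSize) to the resolution the video is displayed at (dstSize).
// Box fields {x, y, width, height} are illustrative, not a specific API.
function scaleBox(box, srcSize, dstSize) {
  const sx = dstSize.width / srcSize.width;
  const sy = dstSize.height / srcSize.height;
  return {
    x: box.x * sx,
    y: box.y * sy,
    width: box.width * sx,
    height: box.height * sy,
  };
}
```

In the browser, each scaled box is then drawn every animation frame with the canvas 2D context (`strokeRect`) onto the transparent canvas sitting above the `<video>` element.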

Video Emotion Detection

Emotion detection was also done using Face API. We used a custom open-source model that we found online to process users' emotions. We then extracted the detected emotions from the API and connected them to timers on the frontend. This gave us a hashtable recording how long the user displayed each emotion over the duration of the video recording.
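
The timer bookkeeping can be sketched as a simple tally, assuming the detector reports one dominant emotion at a fixed polling interval (the emotion labels here are illustrative):

```javascript
// Accumulate how long each emotion was displayed over a recording.
// `frames` is the sequence of dominant-emotion labels reported by the
// detector, one per polling tick of `intervalSeconds`.
function tallyEmotions(frames, intervalSeconds) {
  const totals = {}; // emotion label -> seconds observed
  for (const emotion of frames) {
    totals[emotion] = (totals[emotion] || 0) + intervalSeconds;
  }
  return totals;
}
```

At the end of a session, `totals` is exactly the hashtable described above and feeds directly into the Emotion Analysis result.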

Audio-to-Text Sentiment Analysis

We handled audio-to-text using a combination of two methods. First, we used a relatively new, experimental technology, the Web Speech API, which gave us real-time audio-to-text parsing for the Live Transcript. We then ran our own sentiment analysis algorithm on this text to determine whether a user's response to a question equated to a "yes" or a "no".
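
Our actual sentiment algorithm isn't shown here, but a minimal keyword-matching sketch of the yes/no classification step might look like this (the word lists are illustrative assumptions):

```javascript
// Classify a transcribed answer as "yes", "no", or "unclear" by counting
// affirmative and negative keywords. Word lists are illustrative only.
const YES_WORDS = new Set(["yes", "yeah", "yep", "definitely", "sometimes", "often"]);
const NO_WORDS = new Set(["no", "nope", "never", "not"]);

function classifyResponse(transcript) {
  const words = transcript.toLowerCase().match(/[a-z']+/g) || [];
  let yes = 0;
  let no = 0;
  for (const w of words) {
    if (YES_WORDS.has(w)) yes++;
    if (NO_WORDS.has(w)) no++;
  }
  if (yes > no) return "yes";
  if (no > yes) return "no";
  return "unclear"; // tie or no keywords: ask the question again
}
```

In the browser, the `transcript` string would come from a `SpeechRecognition` result event; an "unclear" classification is a natural cue to re-prompt the user.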

Symptom Matching

We matched symptoms using ApiMedic's Symptom Checker API. This was done in the backend using Python and Flask.


Frontend
The frontend was built with a wide variety of web technologies. At its core, we used React, Sass, JavaScript, HTML, and CSS. Ant Design served as a component library, helping us quickly create progress circles, bar graphs, carousels, and buttons.

There were a lot of challenging components in this project, and we made sure to take care of every minor detail.

The map component was created with Mapbox GL, with multiple Three.js layers overlaid on top using WebGL.

Tech stack


Backend
The backend was built using the Python framework Flask, which makes calls to various APIs and organizes the data in a way that is presentable to users. The backend also computes the health scores based on the symptom analysis, weighing "red flag" (higher-urgency) symptoms more heavily than others.
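
The red-flag weighting can be sketched as follows (the penalty values are illustrative assumptions, and the sketch is in JavaScript for consistency with the other examples here; the real computation lives in our Python/Flask backend):

```javascript
// Compute a 0-100 health score from matched symptoms, penalizing
// "red flag" (higher-urgency) symptoms more heavily than ordinary ones.
// The 15/5 weights are illustrative, not the project's exact values.
function healthScore(symptoms) {
  const penalty = symptoms.reduce(
    (sum, s) => sum + (s.redFlag ? 15 : 5), // red flags cost 3x more
    0
  );
  return Math.max(0, penalty > 100 ? 100 : penalty) === 0
    ? 100
    : Math.max(0, 100 - penalty);
}
```

For example, one ordinary symptom plus one red-flag symptom would yield a score of 80 under these weights.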

Our system architecture


References
Julien, C. (2006). The enigma of Mayer waves: Facts and models. Cardiovascular Research, 70(1), 12–21.

van Gent, P., Farah, H., van Nes, N., & van Arem, B. (2018). Analysing Noisy Driver Physiology Real-Time Using Off-the-Shelf Sensors: Heart Rate Analysis Software from the Taking the Fast Lane Project. 10.13140/RG.2.2.24895.56485.

Wang, K., & Luo, J. (2016). Detecting Visually Observable Disease Symptoms from Faces. EURASIP Journal on Bioinformatics and Systems Biology, 2016. 10.1186/s13637-016-0048-7.


What We Learned

Checkup was an ambitious project for us because it involves a great number of features and required doing medical research beforehand to ensure the accuracy of our results. The initial research paid off, though: we found that building on existing symptom-detection groundwork let us bring that capability straight to the user. Additionally, we learned how to integrate more advanced engineering knowledge into a web application through our implementation of a Fourier Transform in JavaScript. All in all, this project challenged us to apply our research and mathematical proficiency in a way that is functional for the average user.


Accomplishments
This project was a particular achievement for us because it required more individual web pages than our typical hackathon projects, which meant creating many additional components on the frontend to reach a complete product. We are also particularly proud that we were able to include such a vast array of features for the user, which we feel makes Checkup a genuinely serviceable form of telemedicine. Lastly, we think the potential impact of our project is a significant accomplishment. Especially in a digital age of remote day-to-day communication, this could really be a product that people find useful!

What's Next

Checkup already has many useful features, and we'd love to expand the project further to handle direct 1:1 video calls with certified doctors! That way, users could quickly surface any symptoms they are experiencing without scheduling an appointment, then get prompt assistance from a doctor to analyze and confirm any ailments. Overall, we hope that one day this project can be widely used in the medical community as a practical and accurate resource for patients and doctors alike.
