Introduction

Recently, the need for safe healthcare has skyrocketed, and many hospitals are at or near capacity. Nurses have to monitor several patients at once and in person, which is time-consuming and potentially unsafe. A nurse spends over 5 minutes on average checking the vitals of a single patient, and a single nurse can be asked to care for 8 or more patients at a time. That's 40 minutes or more of nothing but checking vitals! This process can be optimized, and that is why we created Insights.MD.

Project Features & Technologies

(Figure: Insights.MD data flow diagram)

Insights.MD is a portable, wearable device that automatically collects key vitals from the patient, including heart rate, blood oxygen level, and body temperature. It lets nurses and doctors accurately keep track of several patients at the same time, saving precious time and resources while reducing their exposure to potentially deadly viruses and bacteria.

All of the data is collected through an Arduino attached to a glove the patient wears while at the clinic or hospital. The Arduino is connected to a Raspberry Pi server, which sends the data to our backend for identification and visualization.

If the device is disconnected or lifted from the patient's finger, we cease data transmission so we never collect or send false data.
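In production this gating lives in the NodeJS/SerialPort layer on the Raspberry Pi; the Python sketch below (using pyserial) illustrates the same idea. The port name, frame format, and contact thresholds are illustrative assumptions, not our shipped values.

```python
import json

import serial  # pyserial

PORT = "/dev/ttyACM0"        # hypothetical Arduino port on the Raspberry Pi
CONTACT_SPO2_FLOOR = 70      # below this, the oximeter has likely lost contact
CONTACT_TEMP_FLOOR_C = 30.0  # room-temperature skin reads mean a lifted sensor

def is_on_finger(reading: dict) -> bool:
    """Heuristic contact check: implausible vitals mean the glove was lifted."""
    return (reading["spo2"] >= CONTACT_SPO2_FLOOR
            and reading["temp_c"] >= CONTACT_TEMP_FLOOR_C
            and reading["bpm"] > 0)

def stream(send_to_backend) -> None:
    with serial.Serial(PORT, 9600, timeout=2) as arduino:
        while True:
            line = arduino.readline().decode(errors="ignore").strip()
            if not line:
                continue
            try:
                reading = json.loads(line)  # e.g. {"bpm": 72, "spo2": 98, "temp_c": 36.6}
            except json.JSONDecodeError:
                continue  # drop malformed frames
            if is_on_finger(reading):
                send_to_backend(reading)
            # else: sensor lifted/disconnected -> send nothing rather than false data
```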

Feature Breakdown

  • Complete hardware sensor suite to stream real-time biometric data to the Insights.MD backend. The sensors are wired to an Arduino, and data gathered from the Arduino is relayed through a Wi-Fi-enabled Raspberry Pi to the Insights.MD backend. Hardware connectivity built with NodeJS and SerialPort.
  • Camera-based facial emotion analysis, complete with our own Emotional Index Score, to monitor patient mental state and help prioritize nurse-patient checkups. API built with Google Cloud Functions, Python, and Google Vision AI.
  • Keyword analysis of transcribed doctor-patient interactions, extracting important terms and entities to help doctors take notes effectively and accurately (see the sketch after this list). The extracted keywords are automatically entered into the patient's profile for easy referencing and anonymized data exporting for public APIs. API built with Google Cloud Functions, Python, and Google Natural Language.
  • Audio recordings and facial recognition images are discarded immediately after processing and are never persisted in any database, ensuring we retain no sensitive media beyond what processing requires.
  • Patient records are stored in Google Firestore, accessible only to authorized hospital staff.
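As a rough sketch of the keyword-analysis flow referenced above, the Cloud Function boils down to an entity-analysis call like the one below. The salience cutoff and the shape of the returned records are illustrative assumptions, not our exact production code.

```python
from google.cloud import language_v1

def extract_keywords(transcript: str) -> list[dict]:
    """Pull salient entities out of a doctor-patient transcript (sketch)."""
    client = language_v1.LanguageServiceClient()
    document = language_v1.Document(
        content=transcript,
        type_=language_v1.Document.Type.PLAIN_TEXT,
    )
    response = client.analyze_entities(document=document)
    # Keep only reasonably salient entities; these become the keywords
    # written to the patient's profile.
    return [
        {
            "name": entity.name,
            "type": language_v1.Entity.Type(entity.type_).name,
            "salience": entity.salience,
        }
        for entity in response.entities
        if entity.salience >= 0.01  # hypothetical cutoff
    ]
```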

Technologies Used

High-throughput Databases

Our team originally planned to use Google Cloud Bigtable for an append-only database format to efficiently manage time-series data points. Specialized databases like Bigtable are fantastic for ingesting tremendous amounts of data rapidly (e.g. pushing a patient's biometric scans onto the database on a second-by-second basis). While we managed to get the code working, it was sadly too cost-prohibitive to keep online for the duration of this hackathon, so we fell back to Google Firestore's free tier to keep costs at $0. Without GCP credit constraints, we would stick with Bigtable for biometric logging and use Firestore exclusively for modifiable records like a patient's name and ailments.

Google Bigtable would also have let us more efficiently export and analyze the large batches of data needed for ML training, and it is the database we would choose for an industry-grade deployment.
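For reference, a minimal sketch of the write path we would restore, assuming a `biometrics` table with a `vitals` column family and a `patientID#reversed-timestamp` row key (all three are our design assumptions, not shipped code):

```python
import time

from google.cloud import bigtable

def log_vitals(project_id: str, instance_id: str, patient_id: str, vitals: dict) -> None:
    """Append one row per reading; rows are only ever added, never updated."""
    client = bigtable.Client(project=project_id)
    table = client.instance(instance_id).table("biometrics")  # hypothetical table

    # Reversed, zero-padded timestamp keeps the newest readings first when
    # a patient's rows are scanned lexicographically.
    reversed_ts = 2**63 - 1 - int(time.time() * 1000)
    row = table.direct_row(f"{patient_id}#{reversed_ts:019d}")
    for metric, value in vitals.items():  # e.g. {"bpm": 72, "spo2": 98, "temp_c": 36.6}
        row.set_cell("vitals", metric.encode(), str(value).encode())
    row.commit()
```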

Emotional Index Score (EIS)

(Figure: emotion data graph)

(Figure: Emotional Index Score equation)

The x-axis represents the net emotional score generated from all raw emotions; the y-axis represents the scaled Emotional Index Score. Figure plotted with Wolfram Alpha; equation written in LaTeX.
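The equation image doesn't reproduce here, but based on the description below it is presumably of this form (the positive-minus-negative net score follows our write-up; the logistic sigmoid is the standard squashing choice and an assumption on our part):

```latex
x = w_{\text{joy}} - \left( w_{\text{sorrow}} + w_{\text{anger}} + w_{\text{surprise}} \right)
\qquad
\mathrm{EIS}(x) = \frac{1}{1 + e^{-x}}
```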

The four core emotions generated from Google Vision AI facial analysis are weighted on a scale from 1 to 5. A score of 1 represents an unlikely classification. A score of 5 represents a very likely classification. For example, an image with a score of "Joy": 5, "Sorrow": 1 represents a very joyful person who is very unlikely to be displaying any sorrow.

I grouped these four weights into two categories: positive and negative emotions. Positive emotions comprise joy; negative emotions comprise sorrow, anger, and surprise. From there, I devised an algorithm that gives positive emotions a positive net score and negative emotions a negative net score. The Emotional Index Score is this net score passed through a sigmoid function, scaling the output between 0.0 and 1.0 for classification.
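A compact Python sketch of that computation, mapping Vision AI's face-likelihood values onto our 1-to-5 weights (the exact enum-to-weight mapping and the absence of any normalization constant are illustrative assumptions):

```python
import math

# Google Vision likelihood -> our 1-to-5 weight (assumed mapping)
LIKELIHOOD_WEIGHT = {
    "VERY_UNLIKELY": 1, "UNLIKELY": 2, "POSSIBLE": 3, "LIKELY": 4, "VERY_LIKELY": 5,
}

def emotional_index_score(face: dict) -> float:
    """Sigmoid-scaled net score in (0.0, 1.0); higher means a more positive state."""
    positive = LIKELIHOOD_WEIGHT[face["joy"]]
    negative = sum(LIKELIHOOD_WEIGHT[face[e]] for e in ("sorrow", "anger", "surprise"))
    net = positive - negative
    return 1 / (1 + math.exp(-net))

# A very joyful face with no negative signals:
print(emotional_index_score({
    "joy": "VERY_LIKELY", "sorrow": "VERY_UNLIKELY",
    "anger": "VERY_UNLIKELY", "surprise": "VERY_UNLIKELY",
}))  # ~0.88
```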

By scaling these emotions, a nurse is able to see at a glance the emotional state of any patient under their care, and prioritize patients exhibiting strong negative emotions for a more empathetic and caring healthcare system.

What's next for Insights.MD

  • Reimplement the Bigtable connection (see the High-throughput Databases section)
