Inspiration

Improving the patient experience with personalized, preventive care, as well as the overall hospital experience, will require faster networks. Telehealth, AI imaging software, and mobile applications can support people throughout their health and wellbeing journey. Bodyhealth uses deep learning algorithms and AI processing running at the edge of a mobile network to provide relevant guidance for tele-consultations with medical experts in remote locations.

Lags in AI processing, patient vital monitoring, and tele-consultations are not only frustrating for users; the resulting poor quality can delay patient care, which could hurt outcomes in the long run. As the use of artificial intelligence (AI), video calling, and Internet of Things (IoT) technologies continues to grow, the amount of data on networks is only expected to increase.

5G technologies combined with Multi-Access Edge Computing (MEC) have the potential to help resolve these challenges.

What it does

Bodyhealth enables doctors and medical experts to hold tele-consultations with a patient. Not everyone needs to be physically present in the room with the patient, so Bodyhealth serves as a telepresence tool. In addition, radiology operations infused with machine learning can be performed in software for quick results; for example, Bodyhealth provides a machine learning model and solution for detecting COVID-19 in X-rays. Patients in a medical facility can also be monitored using wearables, IoT, and mobile devices, yielding vital-sign insights and improved communication between providers, their patients, and electronic medical records, allowing physicians to care for their patients like never before.
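
As a rough illustration of the X-ray workflow, an inference endpoint along the following lines could accept an uploaded chest image and return the model's prediction. The route name, upload field, model filename, and 224x224 input size are assumptions for this sketch, not confirmed details of the Bodyhealth code:

```python
# Hypothetical sketch of the X-ray inference endpoint; the real Bodyhealth
# routes and preprocessing may differ.
import io

import numpy as np
import tensorflow as tf
from flask import Flask, jsonify, request
from PIL import Image

app = Flask(__name__)
model = tf.keras.models.load_model("covid_xray_model.h5")  # assumed filename

@app.route("/predict", methods=["POST"])
def predict():
    # Expect the chest X-ray as a multipart file upload named "image".
    file = request.files["image"]
    img = Image.open(io.BytesIO(file.read())).convert("RGB").resize((224, 224))
    # Raw 0-255 pixels; this sketch assumes the saved model applies its own
    # input rescaling internally.
    batch = np.expand_dims(np.asarray(img, dtype=np.float32), axis=0)
    prob = float(model.predict(batch)[0][0])  # sigmoid output in [0, 1]
    return jsonify({"covid_probability": prob})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```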

Patient data is shared with providers in real-time through Bluetooth-enabled devices and AI processing, allowing for quick access and review during telehealth calls.
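
To give a sense of this data path, here is a minimal, hypothetical platform-side sketch: the Android gateway app forwards the readings it receives from the watch over Bluetooth, and providers fetch the latest values during a call. The routes, payload fields, and in-memory store are illustrative assumptions only:

```python
# Illustrative sketch only; the real Bodyhealth API and storage may differ.
from flask import Flask, jsonify, request

app = Flask(__name__)
latest_vitals = {}  # patient id -> most recent reading

@app.route("/vitals/<patient_id>", methods=["POST"])
def ingest_vitals(patient_id):
    # The Android gateway app POSTs readings relayed from the watch,
    # e.g. {"heart_rate": 72, "timestamp": "2020-11-01T12:00:00Z"}.
    latest_vitals[patient_id] = request.get_json()
    return "", 204

@app.route("/vitals/<patient_id>", methods=["GET"])
def read_vitals(patient_id):
    # Providers poll this during a telehealth call to see current vitals.
    return jsonify(latest_vitals.get(patient_id, {}))

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5001)
```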

How we built it

My goal was to build an app that takes advantage of the high bandwidth and low latency provided by a 5G and MEC network. That goal helped me narrow the use cases down to one of the industries that needs such a solution most: in medical emergencies, every second counts. From there, I researched what would transform the healthcare experience for providers and patients, and which technologies were relevant.

I used native development for the Android companion app and created a Tizen wearable app on a Samsung Galaxy Watch that provides data to the Android app, which acts as a gateway for sending data to the Bodyhealth platform hosted on an EC2 instance in the Atlanta Wavelength Zone. The Bodyhealth platform consists of a Flask/Python/TensorFlow app for performing inference on the patient chest images it receives, a React frontend, and a backend NodeJS signaling server for connecting doctors and medical experts. The NodeJS backend also handles authentication and serves the user interface.

The Flask app's COVID-19 detection model used transfer learning and a limited dataset of patient chest X-ray images, covering both COVID and non-COVID cases. In another approach, I trained a model on a larger dataset using a Jupyter notebook. The transfer learning approach performed better in terms of prediction accuracy; a rough sketch of that setup appears below.
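
The base network (MobileNetV2 here), image size, and training parameters in this sketch are assumptions; the dataset layout is likewise invented for illustration:

```python
# Rough sketch of the transfer-learning setup; the actual base model,
# input size, and hyperparameters used in Bodyhealth are assumptions.
import tensorflow as tf

# Pretrained ImageNet features, with the classification head removed.
base = tf.keras.applications.MobileNetV2(
    input_shape=(224, 224, 3), include_top=False, weights="imagenet"
)
base.trainable = False  # freeze the pretrained weights

model = tf.keras.Sequential([
    tf.keras.Input(shape=(224, 224, 3)),
    tf.keras.layers.Rescaling(1.0 / 127.5, offset=-1.0),  # pixels to [-1, 1]
    base,
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dropout(0.2),
    tf.keras.layers.Dense(1, activation="sigmoid"),  # COVID vs. non-COVID
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),
    loss="binary_crossentropy",
    metrics=["accuracy"],
)

# Small labeled X-ray dataset, organized as train_dir/covid and
# train_dir/non_covid subfolders (layout assumed for illustration).
train_ds = tf.keras.utils.image_dataset_from_directory(
    "train_dir", image_size=(224, 224), batch_size=16, label_mode="binary"
)
model.fit(train_ds, epochs=10)
model.save("covid_xray_model.h5")
```

Freezing the pretrained backbone and training only a small head is what lets the model learn from a limited X-ray dataset without overfitting as badly as training from scratch.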

I used WebRTC for the video calling and telehealth part of the solution. WebRTC is an open-source project that enables real-time communication of audio, video, and data in web and native apps. I avoided Twilio, Nexmo, and other video SDKs because the locations of their servers are not known, and a Wavelength Zone, being a private environment, might not support connectivity to other clouds. In addition, a WebRTC solution that runs locally on the edge avoids the latency and bandwidth issues that can arise when connecting to other networks. A sketch of the signaling pattern follows.
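
The actual Bodyhealth signaling server is NodeJS; purely to illustrate the pattern (and to keep these sketches in one language), here is the same relay logic in Python using the websockets library. The room concept and message fields are assumptions. The server only relays SDP offers/answers and ICE candidates between peers; once negotiation completes, the WebRTC media flows peer to peer:

```python
# Minimal signaling-relay sketch; message shapes and room handling assumed.
import asyncio
import json

import websockets

rooms = {}  # room id -> set of connected peer sockets

async def handle(ws):
    joined = None
    try:
        async for raw in ws:
            msg = json.loads(raw)
            if msg["type"] == "join":
                joined = msg["room"]
                rooms.setdefault(joined, set()).add(ws)
            else:
                # Relay offers, answers, and ICE candidates to the other
                # peers in the same room.
                for peer in rooms.get(joined, set()):
                    if peer is not ws:
                        await peer.send(raw)
    finally:
        if joined:
            rooms[joined].discard(ws)

async def main():
    async with websockets.serve(handle, "0.0.0.0", 8765):
        await asyncio.Future()  # run until cancelled

if __name__ == "__main__":
    asyncio.run(main())
```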

The mobile and wearable apps for monitoring patient vitals were created using HTML and CSS (for the Tizen wearable app) and Android Java (for the companion app).

Accomplishments

I was able to create the design and turn the results of my research into a working prototype that can be extended further.

What we learned

I learned a lot about Verizon 5G and AWS networking, from Wavelength Zones to the Virtual Private Cloud. I also learned a great deal about WebRTC, wearable development, and mobile development.

What's next for Bodyhealth

I would like to expand the research and development further by integrating cameras, robots, and augmented reality glasses with the solution for use in surgical operating rooms. I am also planning to add virtual rooms and clinics using virtual reality.
