Inspiration

Problem

In the US, 80% of seniors suffer from at least one chronic condition. Seniors with chronic illnesses require regular health check-ins with their caregivers; however, caregivers are unable to reach them in person due to the current pandemic. Vulnerable populations, including seniors, are advised to self-isolate at home during the COVID-19 outbreak. They may feel lonely and cut off from their friends and family, leading to potential mental health struggles.

Solution

The idea is to connect seniors suffering from chronic illnesses with caregivers through a two-way health management platform. The platform collects accurate health data via wearable devices, an AI chatbot, and facial and speech recognition; analyses the data using trained machine learning models; and presents it to caregivers in a data analytics dashboard. Health monitoring for the elderly becomes easy, simple, effective and, more importantly, trouble-free.

Market research

During our initial market research, we discovered that the market is saturated with remote medical monitoring devices and care apps. Most of the apps are tailored to the general public, and most of the devices operate automatically, focusing solely on vitals. Some use geo-tracking; others use AI to monitor motion and analyse device readings. Very few use any type of facial recognition software. We saw this opportunity and delved deeper into its possibilities.

Knowing already that we wanted our target users to be elderly persons with chronic diseases, we shifted part of our focus towards how we could connect the two. Currently, medical facial recognition software is being used to map and track patients, detect diseases in early childhood, and read pain, particularly in babies.

For the purposes of our app, facial recognition would primarily be used to map and track patients. But the plan is to establish a firm basis in facial recognition technology so that we will be at the forefront of 'early disease recognition'. Modern medical knowledge allows doctors to diagnose many issues from features such as moles, conjunctiva color, and skin color. It's only a matter of time until AI technology catches up.

The impact of our app will be felt not only by the elderly population of America, but also by lower- and middle-class families around the world who either can't afford healthcare or live in remote locations where travel is time-consuming.

But our app is not just about facial recognition and remote medical care; it's also about checking in with the patient on an emotional level. Research shows that depression is common amongst the elderly, so we made sure to prompt our patients with a simple "Are you feeling down?" and follow it up with an option to talk to someone. This in turn automatically notifies medical personnel and prioritises contact with that individual. Research shows that a simple "Are you okay? Do you want to talk?" can have a real impact on both physical and mental health.
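As a rough sketch of that check-in flow (all class and method names here are hypothetical, for illustration only, not our actual code):

```csharp
// Hypothetical sketch of the emotional check-in flow described above.
using System;
using System.Collections.Generic;

public class CheckInPrompt
{
    // Queue of patient IDs waiting for a prioritised call from medical personnel.
    private readonly Queue<string> _priorityContacts = new Queue<string>();

    // Called after the app asks "Are you feeling down?" and "Do you want to talk?".
    public void HandleResponse(string patientId, bool feelingDown, bool wantsToTalk)
    {
        if (!feelingDown)
            return;

        if (wantsToTalk)
        {
            NotifyMedicalPersonnel(patientId);      // alert the care team immediately
            _priorityContacts.Enqueue(patientId);   // move this patient up the contact list
        }
    }

    private void NotifyMedicalPersonnel(string patientId)
    {
        // Placeholder: a real system would push a notification to the caregiver dashboard.
        Console.WriteLine($"Follow-up requested for patient {patientId}");
    }
}
```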

So although our MVP currently focuses on quarantined elderly persons with chronic illnesses, its potential benefits can easily extend globally across all classes, ages, and cultures.

What it does

We have built a proof-of-concept prototype with two interfaces: a specially designed interface with enlarged text and buttons for seniors, and another interface for caregivers to easily visualise the health conditions of their care recipients.

Senior (patient) interface:

Login

Report feelings

Report any identified symptoms

Take photo for analysis

Caregiver interface:

Login

Dashboard with patients' conditions & alerts for follow-up

Detailed patient record & contact details

How we built it

We built the Progressive Web App using the .NET framework.

In order to implement the idea, it was necessary to create a robust back end to support information flow between the patient and the caregiver. The data structures were designed so that each caregiver has a list of patients with information such as their symptoms, medications, mood, and overall medical status. The most exciting part of the app to develop was the connection to the mood recognition API. This API allows the caregiver to view the patient's mood, whether or not the patient is in pain, and even whether the patient is tired. All of this information is made available to the front end through a Face object, which stores the mood information without storing the image itself, thereby avoiding any ethical concerns.
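To give a feel for the shape of these data structures, here is a minimal sketch (class and property names are illustrative; the real models differ in detail):

```csharp
// Minimal sketch of the back-end data model (illustrative, not the exact implementation).
using System;
using System.Collections.Generic;

// Mood attributes derived from the mood recognition API.
// Only the derived scores are kept; the source image is never stored.
public class Face
{
    public string DominantMood { get; set; }   // e.g. "happy", "sad", "neutral"
    public bool InPain { get; set; }
    public bool Tired { get; set; }
    public DateTime AnalysedAt { get; set; }
}

public class Patient
{
    public string Name { get; set; }
    public string ContactDetails { get; set; }
    public List<string> Symptoms { get; set; } = new List<string>();
    public List<string> Medications { get; set; } = new List<string>();
    public Face LatestMood { get; set; }
    public string OverallStatus { get; set; }  // e.g. "stable", "needs follow-up"
}

public class Caregiver
{
    public string Name { get; set; }
    public List<Patient> Patients { get; set; } = new List<Patient>();
}
```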

In order to leave room for growth, we ensured that all of the data and functions are accessible through a few simple function calls (sketched below), so that the app can be implemented on many different platforms.
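For example, the front ends only need to know about a handful of calls along these lines (again a sketch with illustrative names, building on the model above):

```csharp
// Sketch of the narrow interface the front ends call into (illustrative signatures).
using System.Collections.Generic;

public interface ICareService
{
    // Senior (patient) side
    void ReportFeeling(string patientId, string feeling);
    void ReportSymptoms(string patientId, IEnumerable<string> symptoms);
    Face AnalysePhoto(string patientId, byte[] photo);   // photo is analysed, then discarded

    // Caregiver side
    IReadOnlyList<Patient> GetPatients(string caregiverId);
    IReadOnlyList<Patient> GetPatientsNeedingFollowUp(string caregiverId);
}
```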

Challenges we ran into

We faced some challenges in front-end development using HTML and Bootstrap, and only managed to develop 1.5 of the 11 main screens.

Accomplishments that we're proud of

We have completed the UI design, the proof-of-concept MVP, and the entire back end for the app, which is fully functional.

What we learned

We learnt a lot about UX/UI design and mockups in Figma, video editing with Screencastify and iMovie, calling APIs, and HTML coding.

What's next for Tele-Love

Development of the progressive web app

  1. Front-end development of both the caregiver and senior interfaces

  2. Development of chatbot to streamline user flow and allow voice-based interactions

  3. User Testing

  4. Training of machine learning models for facial recognition

  5. Development of features to allow synchronisation with wearable devices (e.g. smart watches) for health data collection

  6. Research and development of offline interfaces and data storage for persons with limited internet access
