Demo of our depth-perception program!
Azure Logic App Implementation (JSON code hidden)
Some fancy linear algebra equations implemented :)
The purpose of our project is to improve senior care as it exists today. As our older population grows, more attention is being drawn to senior independence and senior health. People are living longer than ever before, which means health conditions that once affected only a small portion of the population are becoming increasingly common. Falling is the leading cause of injury among senior citizens: tens of thousands die every year from fall-related injuries, and 3 million are hospitalized. Existing technology like security cameras and LifeAlert proves inefficient, and motion-detecting wearable technology proves expensive. These facts became the basis for our project: the OmiCloud.
Passion & Solution
It is during ordinary day-to-day activities that senior citizens suffer their most serious injuries. Falls can worsen previous injuries and cause new, more serious ones, and fewer than half of the falls suffered by elderly people are ever reported. Moved by these statistics, and by the stories we shared with each other about our grandparents, our team was driven to create a solution. We decided to create the OmiCloud: an innovative technology that analyzes point-cloud data using the paper Fast Sampling Plane Filtering [Biswas, 2012] and utilizes the cloud platform Azure to revolutionize senior health.
How we built it
We used the Microsoft Kinect as a depth sensor, reading (x, y, z)-coordinate point clouds through the open-source OpenKinect robotics library. Working in Ubuntu with the little-known Processing IDE, we implemented the mathematical transformations from "Fast Sampling Plane Filtering", a robotics article published by Carnegie Mellon researchers in 2011. We calibrated the floor by finding planes of best fit via eigenvalue decomposition of covariance matrices, applied a set of transformations to the point clouds to detect any humans in the room, and then performed a series of calculations to derive a set of equations that define a fall. Once a fall is detected, the program sends a signal to both Microsoft Azure's Logic Apps and Twilio to connect with the senior's relative or caretaker.
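As a sketch of the floor-calibration step: the plane of best fit through a patch of points has, as its normal, the eigenvector of the points' covariance matrix with the smallest eigenvalue. The Java sketch below illustrates the idea; the class and method names, and the shifted power-iteration shortcut used in place of a full eigensolver, are our own illustration, not code from the FSPF paper.

```java
public class PlaneFit {
    // Estimate the normal of the plane of best fit for a set of 3D points.
    // The normal is the eigenvector of the covariance matrix C with the
    // smallest eigenvalue. Instead of a general eigensolver, we run power
    // iteration on (trace(C)*I - C), whose dominant eigenvector is exactly
    // that smallest-eigenvalue direction (C is positive semi-definite).
    static double[] planeNormal(double[][] pts) {
        int n = pts.length;
        // Centroid of the points.
        double[] c = new double[3];
        for (double[] p : pts)
            for (int i = 0; i < 3; i++) c[i] += p[i] / n;
        // 3x3 covariance matrix of the centered points.
        double[][] C = new double[3][3];
        for (double[] p : pts)
            for (int i = 0; i < 3; i++)
                for (int j = 0; j < 3; j++)
                    C[i][j] += (p[i] - c[i]) * (p[j] - c[j]) / n;
        // Shifted matrix M = trace(C)*I - C.
        double t = C[0][0] + C[1][1] + C[2][2];
        double[][] M = new double[3][3];
        for (int i = 0; i < 3; i++)
            for (int j = 0; j < 3; j++)
                M[i][j] = (i == j ? t : 0) - C[i][j];
        // Power iteration: v converges to M's dominant eigenvector,
        // i.e. the plane normal.
        double[] v = {1, 1, 1};
        for (int k = 0; k < 100; k++) {
            double[] w = new double[3];
            for (int i = 0; i < 3; i++)
                for (int j = 0; j < 3; j++) w[i] += M[i][j] * v[j];
            double norm = Math.sqrt(w[0]*w[0] + w[1]*w[1] + w[2]*w[2]);
            for (int i = 0; i < 3; i++) v[i] = w[i] / norm;
        }
        return v; // unit normal (sign is arbitrary)
    }
}
```

For points lying on the floor, the recovered normal points straight up (or down), which is all the calibration step needs.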
We chose a more complicated and sophisticated approach to this problem, using raw point clouds, in order to make our data safer during detection. Kinect ships with "skeleton detection" APIs already built in: we could simply have located a "shoulder" and declared a fall whenever that shoulder stayed near the ground plane for a period of time. However, skeleton detection might miss someone with amputations, or fail if the shoulder is obscured by a piece of furniture. Our code is safe and fair: it works regardless of a person's disability, and it operates locally on the system before sending any signal to the network. Performing the bulk of the computation locally also lets the system scale easily and affordably.
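To illustrate the kind of fall criterion the point-cloud approach enables (the thresholds and helper names below are hypothetical stand-ins, not our actual equations): a frame "looks fallen" when nearly all of a person's points lie close to the calibrated floor plane, and a fall is declared only after several consecutive such frames.

```java
public class FallCheck {
    // Distance from point p to the plane through point c with unit normal n.
    static double dist(double[] p, double[] c, double[] n) {
        double d = 0;
        for (int i = 0; i < 3; i++) d += n[i] * (p[i] - c[i]);
        return Math.abs(d);
    }

    // A frame "looks fallen" when at least 90% of the person's points sit
    // within maxHeight of the floor plane. (The 90% and maxHeight values
    // are illustrative, not tuned constants from our program.)
    static boolean frameLooksFallen(double[][] body, double[] c, double[] n,
                                    double maxHeight) {
        int low = 0;
        for (double[] p : body)
            if (dist(p, c, n) < maxHeight) low++;
        return low >= 0.9 * body.length;
    }

    // Declare a fall only after `needed` consecutive fallen-looking frames,
    // so a person briefly crouching doesn't trigger an alert.
    static boolean isFall(boolean[] frames, int needed) {
        int run = 0;
        for (boolean f : frames) {
            run = f ? run + 1 : 0;
            if (run >= needed) return true;
        }
        return false;
    }
}
```

Because the check uses the whole body's point cloud rather than one named joint, it behaves the same whether or not a "shoulder" is visible.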
As a team, and as individuals working on our delegated tasks, we experienced plenty of roadblocks. Issues ranged from installing software correctly, to learning new languages and formats like Processing and JSON, to implementing high-level mathematics and point-cloud data analysis. While we were successful in networking all of these components and platforms together, we were not able to accomplish full detection of a fallen senior: a lack of time hindered us from implementing all APIs and methods in our Processing-based program.
Through these struggles we learned to navigate and understand Microsoft's cloud platform Azure in the time given, worked extensively with sensor hardware, and explored multiple endpoint calls to Azure and Twilio from a Java file. We are proud to have over 200 lines of working code in the language Processing.
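A minimal sketch of what such an endpoint call from Java can look like, assuming a Logic App configured with an HTTP request trigger; the URL and JSON payload here are placeholders, not our actual endpoint.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class AlertSender {
    // Build a POST carrying the fall event as JSON. The real endpoint URL
    // comes from the Azure portal when the HTTP trigger is created; the
    // one passed in here is purely a placeholder.
    static HttpRequest buildAlert(String endpoint, String json) {
        return HttpRequest.newBuilder(URI.create(endpoint))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }

    // Fire the request; the Logic App then handles the Twilio/notification
    // side on Azure.
    static void send(HttpRequest req) throws Exception {
        HttpResponse<String> resp = HttpClient.newHttpClient()
                .send(req, HttpResponse.BodyHandlers.ofString());
        System.out.println("Logic App responded: " + resp.statusCode());
    }
}
```

Keeping the network call this thin is what lets the detection logic stay entirely local until an alert actually needs to go out.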
The Future for OmiCloud
With the new Kinect v3 coming out, we can implement OmiCloud to work specifically with Azure for a smoother process and transition through platforms, and further utilize location services to deliver the most accurate emergency response. We would also add a reply option: texting back "HELP" routes the alert to an emergency phone number for further assistance, while "OKAY" dismisses it. In addition, we as a team would love to develop our own detection device, called the OmiCloud, and launch a full deployment of the OmiCloud company. Our business model would be to start with a few customers (likely systems for assisted care homes) as we ensure our scalability and improve our UI. We would then negotiate agreements with life insurance companies to reach full scalability and gain access to an even broader market.