Many pets die every year from injuries we don't know about. Pets can't tell us when they're hurt, so injuries often go unnoticed. How do we provide proactive care?

Our solution is PetApps, a system that speeds up diagnosis and simplifies domestic animal care. The mobile app and its wearable hardware companion track your pet's motion to determine whether it is injured or in distress. The app then guides you through the process of getting your pet the care it deserves.

Frontend / Mobile App

The mobile application alerts the pet owner when their pet needs attention. The app receives up-to-date information via Google Cloud Messaging, and a personalized push notification is displayed when a limp is detected.

The app guides the user through the next stages of animal care: it finds nearby vet hospitals and lets the user request an Uber, call the hospital, or navigate there via Google Maps directly from the app.

A persistent notification keeps the app one tap away while the user navigates to the vet hospital.

The app also runs on Android Wear, which mirrors data from the phone, displays it, and lets the user take action from the watch.

Backend / Machine Learning

Our machine learning pipeline starts when the wearer takes their first step after putting on the device. Accelerometer data for every step is transmitted from the Thalmic Myo over Bluetooth to a Node.js server, which forwards it over UDP to a compute server written in Python. Each step's accelerations are resampled into 64-sample blocks using Secret Rabbit Code (libsamplerate), making the features invariant to step length. A fast Fourier transform is then applied to each block, and the Fourier-transformed step samples are classified by a pre-trained scikit-learn logistic regressor as either limping steps or normal steps.
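A minimal sketch of that per-step classification, assuming one step's accelerometer samples arrive as an (N, 3) array and a logistic regressor has already been trained. The function names are ours, scipy.signal.resample stands in for libsamplerate here, and collapsing the three axes to an acceleration magnitude before the FFT is our assumption, not necessarily the project's exact feature extraction.

```python
import numpy as np
from scipy.signal import resample
from sklearn.linear_model import LogisticRegression

BLOCK_SIZE = 64  # every step is resampled to this many samples

def step_features(step_accel):
    """Turn one step's raw (x, y, z) accelerations into a fixed-length
    frequency-domain feature vector (64 FFT magnitudes)."""
    mag = np.linalg.norm(step_accel, axis=1)   # per-sample acceleration magnitude
    fixed = resample(mag, BLOCK_SIZE)          # step-length invariance
    return np.abs(np.fft.fft(fixed))           # 64-dimensional spectrum

def classify_step(model: LogisticRegression, step_accel):
    """Return True if the pre-trained model labels this step as a limp."""
    return bool(model.predict([step_features(step_accel)])[0])
```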

To visualize what the machine learning algorithm is doing, principal component analysis (PCA) was used to reduce the 64-dimensional features to two dimensions that can be easily plotted. The normal walking (blue) and limping (red) data form two distinct clusters, with the steps from the normal-walking GIF shown in pale green. For these images and a deeper dive into the data, head to our Imgur gallery at http://imgur.com/gallery/1FGKC.
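A rough sketch of how such a projection could be produced, assuming the 64-dimensional feature vectors are already collected into one array per class; the function and variable names are illustrative rather than the project's actual plotting script.

```python
import numpy as np
import matplotlib.pyplot as plt
from sklearn.decomposition import PCA

def plot_step_clusters(normal_features, limping_features):
    """Project per-step feature vectors onto two principal components and plot them."""
    X = np.vstack([normal_features, limping_features])
    reduced = PCA(n_components=2).fit_transform(X)   # 64-D features -> 2-D points
    n = len(normal_features)
    plt.scatter(reduced[:n, 0], reduced[:n, 1], c="blue", label="normal")
    plt.scatter(reduced[n:, 0], reduced[n:, 1], c="red", label="limping")
    plt.legend()
    plt.title("Steps projected onto the first two principal components")
    plt.show()
```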

If a certain percentage of limping steps is detected, a message is sent over Google Cloud Messaging to the Android app, which alerts the pet owner to the state of their pet. The owner can alternatively be alerted via SMS on any phone using Twilio. We chose the Intel Edison to run the machine learning algorithm because it has enough processing power and is small enough to eventually be integrated into a dog collar.
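A sketch of that alerting step, assuming a hypothetical limp-fraction threshold and placeholder keys, tokens, and phone numbers; the push uses GCM's legacy HTTP endpoint via the requests library, and the SMS fallback uses Twilio's Python client.

```python
import requests
from twilio.rest import Client

LIMP_THRESHOLD = 0.3   # fraction of limping steps that triggers an alert (assumed)

def maybe_alert(limp_flags, gcm_api_key, device_token,
                twilio_sid, twilio_token, from_number, to_number):
    """Send a GCM push (and an SMS fallback) when enough limping steps are seen."""
    limp_fraction = sum(limp_flags) / len(limp_flags)
    if limp_fraction < LIMP_THRESHOLD:
        return

    # Push notification to the Android app via Google Cloud Messaging.
    requests.post(
        "https://gcm-http.googleapis.com/gcm/send",
        headers={"Authorization": "key=" + gcm_api_key,
                 "Content-Type": "application/json"},
        json={"to": device_token,
              "data": {"message": "Possible limp detected",
                       "limp_fraction": limp_fraction}},
    )

    # SMS fallback via Twilio.
    Client(twilio_sid, twilio_token).messages.create(
        to=to_number, from_=from_number,
        body="PetApps: your pet may be limping, please check on them.")
```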

Contact Us

If you would like more information about the project or would like to get in touch, you can reach us by email at admin@maxbareiss.com or on Twitter at @chrisfreder (Christopher Frederickson) or @handnf (Nick Felker).
