Arjo Chakravarty

Inspiration
For some time it has been possible to detect a person's pulse from minute fluctuations in the color of their forehead. A few projects using this technique have been built at MIT CSAIL. However, as far as I know, no one to date has ported their algorithms to the phone. I decided to develop an app that performs the same task. Given the time constraints, the algorithm used here is much simpler, but it can still produce pulse rates.
What it does
This app uses a phone's camera to detect the pulse of multiple people from a distance. It could potentially be used in gyms and exercise halls to monitor users' heart rates. For now, each person's heart rate is simply displayed directly above their face. NOTE: THIS IS EXTREMELY ALPHA QUALITY, BUILT IN LESS THAN 24 HOURS (more like 16, since I first tried to do something stupid with 74HC ICs but burned too many of them). DO NOT USE FOR MEDICAL DIAGNOSTICS.
How I built it
The app is built in Java on the Android platform. Firebase's ML Kit was used to detect and track faces.
Challenges I ran into
Although Google claims real-time performance for Firebase ML on its phones, I found that in practice a single Firebase ML pass takes at least 400 ms. While that is real-time enough for simple camera applications, pulse detection requires a much higher rate: since a user's pulse can reach 180 bpm (3 Hz), the minimum sampling rate has to be 6 frames per second by Nyquist's theorem. I therefore had to create an elaborate system that runs the detector more slowly on a separate thread. At the same time I needed to tag users in real time, so there were two threads running detection: one that produces a rough face area for rendering the text, and another that actually calculates the pulse. The pulse itself was also hard to pick up; I had to use a bandpass filter and a very crude peak-counting algorithm. Given more time I would use an FFT-based detector to improve the accuracy of the pulse estimate.
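To illustrate the signal-processing half, here is a minimal sketch in plain Java of how a per-face brightness trace could be turned into a bpm estimate: a crude detrend-and-smooth in place of a proper bandpass filter, followed by naive peak counting. All names and window sizes here are my own assumptions for illustration, not the app's actual code.

```java
public class PulseEstimator {
    // Estimate pulse in bpm from a brightness trace sampled at `fps` frames/s.
    // Per Nyquist, fps must be at least twice the highest pulse frequency
    // (180 bpm = 3 Hz, so fps >= 6).
    public static double estimateBpm(double[] brightness, double fps) {
        int n = brightness.length;

        // Remove the DC component (crude high-pass stage of the "bandpass").
        double mean = 0;
        for (double v : brightness) mean += v;
        mean /= n;

        // 5-sample moving average of the detrended signal (crude low-pass stage).
        int w = 5;
        double[] s = new double[n];
        for (int i = 0; i < n; i++) {
            double sum = 0;
            int count = 0;
            for (int j = Math.max(0, i - w / 2); j <= Math.min(n - 1, i + w / 2); j++) {
                sum += brightness[j] - mean;
                count++;
            }
            s[i] = sum / count;
        }

        // Naive peak counting: local maxima above zero.
        int peaks = 0;
        for (int i = 1; i < n - 1; i++) {
            if (s[i] > 0 && s[i] > s[i - 1] && s[i] >= s[i + 1]) peaks++;
        }

        double seconds = n / fps;
        return peaks * 60.0 / seconds;
    }
}
```

On a clean synthetic signal this recovers the dominant frequency; on real camera data the peak counter is easily fooled by noise, which is why an FFT-based detector would be the better choice.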
Accomplishments that I'm proud of
I am proud of the pipeline architecture I came up with to create a perceived real-time effect despite the slow detector.
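The pipeline described above can be sketched roughly like this: the camera frame callback never blocks, the slow detector runs at its own pace on a background thread, and the fast path always reuses the most recent face box. This is a minimal, testable skeleton; the class, the field names, and `detectFace` (a sleeping stand-in for the ~400 ms Firebase ML call) are all my own inventions, not the real API.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.concurrent.atomic.AtomicReference;

public class TwoRatePipeline {
    private final ExecutorService detectorThread = Executors.newSingleThreadExecutor();
    private final AtomicBoolean detectorBusy = new AtomicBoolean(false);
    public final AtomicReference<int[]> latestFaceBox = new AtomicReference<>(); // {x, y, w, h}
    public final AtomicInteger detectorRuns = new AtomicInteger();

    // Called once per camera frame; never blocks on the slow detector.
    public void onFrame(byte[] frame) {
        // Kick off a detection only if one is not already in flight.
        if (detectorBusy.compareAndSet(false, true)) {
            detectorThread.execute(() -> {
                try {
                    latestFaceBox.set(detectFace(frame)); // slow path
                    detectorRuns.incrementAndGet();
                } finally {
                    detectorBusy.set(false);
                }
            });
        }
        int[] box = latestFaceBox.get(); // fast path reuses the last known box
        if (box != null) {
            // ... render the bpm text at `box`, sample forehead pixels, etc. ...
        }
    }

    // Stand-in for the slow Firebase ML face-detection call.
    private int[] detectFace(byte[] frame) {
        try {
            Thread.sleep(10);
        } catch (InterruptedException ignored) {
        }
        return new int[]{0, 0, 64, 64};
    }

    public void shutdown() throws InterruptedException {
        detectorThread.shutdown();
        detectorThread.awaitTermination(1, TimeUnit.SECONDS);
    }
}
```

The key property is that many frames share one detection result, so the overlay updates every frame even though the detector only finishes every few hundred milliseconds.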
What I learned
I learned a good deal about parallel programming and signal processing. I also learned to use Google's Firebase ML API.
What's next for AB - FacePulse
There is a scaling issue that I was unable to fix due to time constraints. Beyond that, I think a move toward the algorithms described in the MIT CSAIL work mentioned above could significantly improve the performance of the detector.