Inspiration

We decided we wanted to try running our own computer vision. After playing around with a few ideas, we realised that our software would be good at literally tracking feet. We then brainstormed the things foot tracking could be useful for; this led us to the concept of analysing behaviour for applications such as retail store optimisation, by finding the areas with the highest density of footfall.

What it does

Using OpenCV on Raspberry Pis, Footfall detects live movement against a background. This movement is analysed to separate out individual bodies. Assuming a good vantage point, the bottom of each body is the feet, which always lie in the same plane, given a flat floor. The image is then skewed (perspective-transformed) to produce a top-down view of the floor. The foot position data, along with timestamps and a persistent ID for each set of feet, is uploaded to a Firebase instance.
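
The sketch below shows roughly how such a pipeline can be wired together with OpenCV 4 and the Firebase Realtime Database REST API. It is a minimal illustration, not our exact code: the homography corner points, the Firebase URL, the blob-size threshold, and the tracker-assigned ID are all placeholder assumptions.

```python
import time
import cv2
import numpy as np
import requests  # assumption: pushing to Firebase via its REST API

# Hypothetical homography: maps four floor points seen in the image to a
# top-down floor coordinate system (e.g. centimetres from one corner).
IMG_CORNERS   = np.float32([[120, 460], [520, 460], [620, 300], [40, 300]])
FLOOR_CORNERS = np.float32([[0, 0], [400, 0], [400, 300], [0, 300]])
H = cv2.getPerspectiveTransform(IMG_CORNERS, FLOOR_CORNERS)

FIREBASE_URL = "https://<project>.firebaseio.com/footfall.json"  # placeholder

subtractor = cv2.createBackgroundSubtractorMOG2(detectShadows=True)
cap = cv2.VideoCapture(0)  # Pi camera / webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break

    # Foreground mask: anything moving against the learned background.
    mask = subtractor.apply(frame)
    mask = cv2.threshold(mask, 200, 255, cv2.THRESH_BINARY)[1]
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < 1500:   # ignore small blobs / noise
            continue
        x, y, w, h = cv2.boundingRect(c)
        foot_px = np.float32([[[x + w / 2, y + h]]])  # bottom-centre = feet

        # Skew the foot point into the top-down floor plane.
        foot_floor = cv2.perspectiveTransform(foot_px, H)[0][0]

        requests.post(FIREBASE_URL, json={
            "t": time.time(),
            "x": float(foot_floor[0]),
            "y": float(foot_floor[1]),
            "id": 0,  # a persistent per-person ID would come from a tracker
        })
```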

A Beaker notebook is then used to fetch the data back and run various analyses on it.
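
As a rough idea of what one of those analysis cells might look like (assuming the record layout and placeholder URL from the sketch above), the snippet below pulls the stored points and plots a footfall-density heatmap over the floor plane:

```python
import requests
import numpy as np
import matplotlib.pyplot as plt

FIREBASE_URL = "https://<project>.firebaseio.com/footfall.json"  # placeholder

# Firebase returns the pushed records as a dict keyed by auto-generated IDs.
records = requests.get(FIREBASE_URL).json() or {}
xs = [r["x"] for r in records.values()]
ys = [r["y"] for r in records.values()]

# 2D histogram over the floor plane: brighter cells = denser footfall.
heat, xedges, yedges = np.histogram2d(xs, ys, bins=40)
plt.imshow(heat.T, origin="lower", cmap="hot",
           extent=[xedges[0], xedges[-1], yedges[0], yedges[-1]])
plt.colorbar(label="visits")
plt.title("Footfall density")
plt.show()
```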

Accomplishments that we're proud of

Real-time remote image tracking, done on a Raspberry Pi.

What we learned

  • How to detect features in and manipulate webcam feeds.

  • How to use Beaker notebook and Firebase.

What's next for Footfall

  • Add more analysis tools to the Beaker notebook section of the project.
  • Allow our Raspberry Pi system to detect more things and increase the scope of the project: multiple rooms, more cameras, objects such as cars, and the ability to track interactions by proximity in a conference setting.

Built With

OpenCV, Raspberry Pi, Firebase, Beaker notebook
