Inspiration

Footfall Tracker was developed to address the growing need for intelligent space utilisation and real-time occupancy monitoring. Inspired by inefficiencies observed in public spaces such as retail stores, transport hubs, and campuses, the project turns raw human movement into actionable data through AI-driven analytics.

How we built it

The system is built on computer vision techniques: a YOLO deep-learning model performs real-time human detection and tracking, video streams are processed with OpenCV, and data pipelines extract, filter, and analyse movement patterns. The application is written primarily in Python, with machine learning frameworks such as TensorFlow and PyTorch integrated to enable accurate, scalable inference.

What we learned

Throughout development we gained practical experience in computer vision, model optimisation, and real-time data processing. We also explored how AI can be applied beyond traditional use cases to solve real-world operational and sustainability challenges.

Challenges we ran into

One of the main challenges was maintaining high detection accuracy under varying lighting conditions and in crowded scenes while keeping latency low enough for real-time performance. Optimising the model to run efficiently on limited hardware (edge devices) also required careful balancing of speed against accuracy.

Overall, Footfall Tracker demonstrates how AI-powered systems can enhance decision-making by providing meaningful insights into human behaviour, enabling smarter, more efficient, and scalable environments.
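The pipeline described above — detect people in each frame, then aggregate the detections into occupancy metrics — can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the YOLO/OpenCV detection step is mocked with precomputed bounding boxes, and `OccupancyTracker` and its 0.5 confidence threshold are hypothetical names chosen for the example.

```python
from dataclasses import dataclass, field

@dataclass
class OccupancyTracker:
    """Aggregate per-frame person counts into simple occupancy metrics."""
    counts: list = field(default_factory=list)

    def update(self, detections):
        # `detections` is a list of (x1, y1, x2, y2, confidence) boxes
        # produced upstream (e.g. by a YOLO model); keep confident hits only.
        people = [d for d in detections if d[4] >= 0.5]
        self.counts.append(len(people))

    @property
    def current(self):
        return self.counts[-1] if self.counts else 0

    @property
    def peak(self):
        return max(self.counts, default=0)

    def average(self):
        return sum(self.counts) / len(self.counts) if self.counts else 0.0

# Mocked per-frame detections; real code would read frames with OpenCV,
# run the YOLO model on each one, and pass its boxes to update().
frames = [
    [(10, 10, 50, 120, 0.92), (60, 15, 100, 130, 0.88)],
    [(12, 11, 52, 121, 0.95)],
    [(10, 10, 50, 120, 0.91), (60, 15, 100, 130, 0.40), (200, 20, 240, 140, 0.77)],
]
tracker = OccupancyTracker()
for boxes in frames:
    tracker.update(boxes)
```

In the real system the aggregated counts would feed dashboards or alerts; keeping the analytics layer decoupled from the detector like this makes it easy to swap detection models without touching the reporting logic.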

Built With

Python, OpenCV, YOLO, TensorFlow, PyTorch
