As someone with no walking disability, I get frustrated walking from aisle to aisle looking for a certain product in a big supermarket. We can only imagine how hard it must be for someone who has a walking disability (WD). We want businesses to be aware when people with WDs enter the store, so that they can send a member of staff to help them find a product quickly.
What it does
If a customer with a walking disability enters the store, our program detects it. This output can be used to notify a member of staff (the notification feature itself is planned for the future).
How we built it
We used Python Flask to build our backend, Wrnch AI's API to detect the alignment of the joints, and various Python libraries to visualize the time-series data, which helped us draw conclusions on how to approach the problem.
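As a rough sketch of what such a backend could look like: a Flask endpoint that accepts per-frame knee angles (already extracted from the pose API) and flags a stiff, low-range gait. The endpoint name, payload shape, and the 20-degree threshold are illustrative assumptions, not our actual implementation.

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def looks_like_walking_disability(angles):
    """Toy heuristic: a very small knee-angle range across the clip
    suggests a stiff gait. The 20-degree threshold is a placeholder."""
    return (max(angles) - min(angles)) < 20.0

@app.route("/analyze", methods=["POST"])
def analyze():
    payload = request.get_json(force=True)
    angles = payload.get("knee_angles", [])
    if not angles:
        return jsonify({"error": "no angle data"}), 400
    return jsonify({"disability_detected": looks_like_walking_disability(angles)})

if __name__ == "__main__":
    app.run(debug=True)
```

In a real deployment the angles would come from frames processed through the pose-estimation API rather than being posted directly.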
Challenges we ran into
This was our first time working on a computer vision problem. We had to record ourselves walking towards the camera in various ways to determine how the angles between the joints vary.
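The core quantity in this kind of analysis is the angle at a joint, computed from three 2D keypoints (for example hip-knee-ankle). A minimal, self-contained version of that calculation might look like this; the specific joint triple is just an example.

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by points a-b-c.
    Each point is an (x, y) tuple, e.g. hip, knee, ankle."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    if n1 == 0 or n2 == 0:
        raise ValueError("two keypoints coincide; angle is undefined")
    # Clamp to guard against floating-point drift outside [-1, 1]
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))
```

A fully extended leg (hip, knee, ankle collinear) gives 180 degrees; tracking how far this angle dips per stride is what the recordings helped us calibrate.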
Accomplishments that we're proud of
All the data visualization really helped (thank you, Matplotlib), allowing us to achieve satisfactory preliminary results. Even though our parameters can be tuned and optimized further, our program works for the basic case, i.e. a single person with a walking disability walking towards the camera.
What we learned
How to approach computer vision problems, back-end development and implementation of Wrnch AI's API.
What's next for Let me bring that for you
It would be great if we could extend our app to public transit (mainly metros), so that someone trying to board the train can walk towards the doors worry-free, without the fear of being squashed.