We read an article about how we shouldn't walk where runners have recently passed, to avoid accidentally inhaling airborne particles left in their wake, and we thought, "how long do we have to wait until it's okay to walk where runners have passed?" Hence this idea was born.
What it does
Manifest COVID helps visualize the potential presence of airborne particles by generating "particulate" visuals in the wake of a person's movement that disappear over time.
How I built it
We used GCP's Video Intelligence API to analyze the video and built the animation using p5.js.
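The fading "particulate" wake can be modeled as particles that spawn along a detected person's path and lose opacity as they age. A minimal plain-JavaScript sketch of that idea (class and function names here are illustrative assumptions, not the project's actual code; in the real project the particles would be drawn each frame with p5.js):

```javascript
// Particle model for the fading "wake" effect. alpha() returns a value
// you could feed into a p5.js fill() alpha channel each frame.
class WakeParticle {
  constructor(x, y, lifespanMs = 5000) {
    this.x = x;
    this.y = y;
    this.lifespanMs = lifespanMs; // how long before fully faded
    this.ageMs = 0;
  }

  // Advance the particle's age by the elapsed frame time.
  update(dtMs) {
    this.ageMs += dtMs;
  }

  // Opacity from 1 (fresh) down to 0 (expired).
  alpha() {
    return Math.max(0, 1 - this.ageMs / this.lifespanMs);
  }

  expired() {
    return this.ageMs >= this.lifespanMs;
  }
}

// Each frame: age all particles, then drop the ones that have faded out.
function stepParticles(particles, dtMs) {
  particles.forEach((p) => p.update(dtMs));
  return particles.filter((p) => !p.expired());
}
```

Tuning `lifespanMs` is effectively answering the original question: how long the wake should stay "dangerous" before it disappears.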
Challenges I ran into
- GCP was a learning curve for us all, from setting up billing for the project to using the Video Intelligence API
- tying everything together in p5.js: generating the box coordinates, interpolating a person's location between detections, and overlaying the animation onto the video
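Since person detections arrive only at sampled timestamps, drawing a smooth overlay means interpolating box positions between them. A hedged sketch of that step, assuming normalized `left`/`top`/`right`/`bottom` box fields (illustrative names, not necessarily the exact API response shape we consumed):

```javascript
// Linear interpolation between two scalar values.
function lerp(a, b, t) {
  return a + (b - a) * t;
}

// Interpolate between two bounding boxes detected at timeA and timeB,
// for an intermediate playback time t (all times in the same unit).
function interpolateBox(boxA, boxB, timeA, timeB, t) {
  // Clamp t into [timeA, timeB] so we never extrapolate past a detection.
  const clamped = Math.min(Math.max(t, timeA), timeB);
  const f = timeB === timeA ? 0 : (clamped - timeA) / (timeB - timeA);
  return {
    left: lerp(boxA.left, boxB.left, f),
    top: lerp(boxA.top, boxB.top, f),
    right: lerp(boxA.right, boxB.right, f),
    bottom: lerp(boxA.bottom, boxB.bottom, f),
  };
}
```

The interpolated box can then be scaled by the video's pixel dimensions before spawning wake particles along its path.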
Accomplishments that I'm proud of
Not using GCP's AutoML tooling for this project (though it was fun to explore!). AutoML would've done something similar to what we accomplished, except we would've had less control and we wouldn't have had the chance to understand the environment through the GCP console and CLI. Past us might've opted for that route because it's relatively easier and less intimidating, but today we're going to be bold and say: we can do it either way.
What I learned
We learned more about GCP's ability to analyze data as if we had the backing of hundreds of machine learning professionals and data scientists. We also learned about p5.js, a very powerful visualization and digital art tool, which also works well with ml5 (for applying machine learning in the future).
We also performed a video capture of our desktop for the first time, and in finding a freemium app (an ice-cream-themed one) to record a specific section of our desktop, we learned that its logo appears in the recording. We promise we worked on this project - not ice cream apps!
What's next for Manifest COVID
Some applications include:
- implementing collision detection and keeping a history of when collisions happen, so we can later analyze which areas tend to see activity. This information may be useful for people in operations or urban planning to better understand user behavior and the impact of environmental design
- implementing this at the augmented-reality level - e.g., while playing Pokémon GO, flashing the screen red to warn you when you're about to step into the wake of someone's path too soon on your way to the next gym. A wearable like Google Glass could use this, too
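The collision-detection idea above could be sketched as a simple box-overlap check plus a timestamped log for later analysis. This is a hypothetical outline, not an implementation we built; `detectCollisions`, `wakeRegions`, and the field names are all assumed for illustration:

```javascript
// True when two axis-aligned boxes overlap (coordinates share one system).
function boxesOverlap(a, b) {
  return a.left < b.right && b.left < a.right &&
         a.top < b.bottom && b.top < a.bottom;
}

// Append a timestamped record for every wake region the walker's box
// currently overlaps, so hot spots can be analyzed afterwards.
function detectCollisions(walkerBox, wakeRegions, nowMs, log) {
  for (const region of wakeRegions) {
    if (boxesOverlap(walkerBox, region.box)) {
      log.push({ timeMs: nowMs, box: region.box });
    }
  }
  return log;
}
```

Aggregating the log by location over time would give the activity heatmap that operations or urban-planning folks could act on.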