Inspiration
We believe there is a better way to experience walking navigation. Google Maps only provides a top-down map view, but our app provides a first-person augmented reality view, a much better fit for the typical pedestrian.
What it does
Open the mobile app and enter where you want to walk. Google Maps determines the walking directions. A computer vision and machine learning algorithm then determines where the street or path indicator should be, rendered as a line on top of the real-time camera view. This indicator guides the user and continues to update using snapshots of the user's street view, taken every 5 seconds. A rough sketch of that snapshot-and-overlay loop is below.
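Here is a minimal sketch of the 5-second snapshot loop on the client side. The `/predict` endpoint, `captureFrame` callback, and `PathPrediction` shape are illustrative assumptions, not the project's actual API:

```typescript
// Illustrative types: the server is assumed to return a screen-space
// polyline to draw on top of the camera view.
type Point = { x: number; y: number };
type PathPrediction = { polyline: Point[] };

async function requestPathOverlay(
  captureFrame: () => Promise<string>, // returns a base64-encoded camera snapshot
  serverUrl: string
): Promise<PathPrediction> {
  const imageBase64 = await captureFrame();
  const res = await fetch(`${serverUrl}/predict`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ image: imageBase64 }),
  });
  return (await res.json()) as PathPrediction;
}

// Re-run the prediction on a fixed 5-second cadence and hand the
// resulting line to whatever component draws the AR overlay.
function startOverlayLoop(
  captureFrame: () => Promise<string>,
  serverUrl: string,
  onUpdate: (p: PathPrediction) => void
): () => void {
  const id = setInterval(async () => {
    try {
      onUpdate(await requestPathOverlay(captureFrame, serverUrl));
    } catch {
      // keep the previous overlay if a snapshot round-trip fails
    }
  }, 5000);
  return () => clearInterval(id); // call the returned function to stop the loop
}
```

Polling on a fixed interval keeps the client simple at the cost of some latency between what the camera sees and where the line is drawn.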
How we built it
We took a React Native app and rendered a camera view, sent photos to the machine learning algorithm, and built predictive models for detecting street, hallway, and stair edges. The server is set up on Node and interacts with the Wolfram API to drive the machine learning process (sketched below). We also needed lots of sleep and human fuel to build this out.
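As a sketch of the server side, here is what a Node relay between the app and a Wolfram Cloud endpoint could look like. The endpoint URL, route name, and response shape are placeholders, not our real deployment:

```typescript
// Minimal Node/Express relay: accepts a base64 snapshot from the app
// and forwards it to a deployed Wolfram Cloud API for prediction.
import express from "express";

const app = express();
app.use(express.json({ limit: "10mb" })); // snapshots arrive as base64 JSON

// Placeholder URL for a Wolfram Cloud instant API.
const WOLFRAM_ENDPOINT =
  "https://www.wolframcloud.com/obj/example-user/path-classifier";

app.post("/predict", async (req, res) => {
  // Forward the image to the Wolfram endpoint, which runs the
  // edge-detection / classification model and returns the path line.
  const wolframRes = await fetch(WOLFRAM_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/x-www-form-urlencoded" },
    body: new URLSearchParams({ image: req.body.image }),
  });
  res.json(await wolframRes.json()); // e.g. { polyline: [...] }
});

app.listen(3000, () => console.log("prediction relay listening on :3000"));
```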
Challenges we ran into
Getting the machine learning classification models started was tough at first, and making the augmented reality camera experience feel real-time was hard.
Accomplishments that we're proud of
Our team slept more than expected, and we used state-of-the-art machine learning technology: deep neural networks with over 40 hidden layers.
What we learned
The Wolfram Language is a better development experience than Java, and it is really powerful for building predictive models. React Native's flexbox styling is responsive and feels like the future of frontend styling. Edge detection works well for figuring out the path of streets, stairs, hallways, and more; a generic sketch of the idea follows.
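To illustrate the edge-detection idea, here is a generic Sobel filter sketch. This is not the Wolfram model we actually trained; our pipeline used Wolfram's built-in image processing rather than hand-rolled filters:

```typescript
// Sobel edge detection over a grayscale image stored as a flat array.
function sobelEdges(
  gray: Float32Array, // grayscale pixels, row-major, values in [0, 1]
  width: number,
  height: number
): Float32Array {
  const out = new Float32Array(width * height);
  // Sobel kernels approximate the horizontal and vertical intensity gradients.
  const gx = [-1, 0, 1, -2, 0, 2, -1, 0, 1];
  const gy = [-1, -2, -1, 0, 0, 0, 1, 2, 1];
  for (let y = 1; y < height - 1; y++) {
    for (let x = 1; x < width - 1; x++) {
      let sx = 0;
      let sy = 0;
      for (let ky = -1; ky <= 1; ky++) {
        for (let kx = -1; kx <= 1; kx++) {
          const px = gray[(y + ky) * width + (x + kx)];
          const k = (ky + 1) * 3 + (kx + 1);
          sx += gx[k] * px;
          sy += gy[k] * px;
        }
      }
      // Gradient magnitude: large values mark likely edges
      // (street boundaries, stair lips, hallway walls).
      out[y * width + x] = Math.hypot(sx, sy);
    }
  }
  return out;
}
```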
What's next for Footsteps
Get acquired by Google.