We noticed how difficult it is for someone with a temporary or permanent leg injury to travel independently from home to school, so we wanted to build a project that makes life easier in such a situation.

What it does

We use the Google Streetview API to grab photos of a college campus, which we then analyze with the Azure Computer Vision API to determine whether each photo shows an accessible scene. We then use the data from those location photos to determine the most accessible route to the destination a user searches for on our accessible website.

How we built it

We brainstormed in many online meetings and, after choosing our idea, broke into teams to divide up the major parts of the project.

AI Dog collects geolocation data on campus using the Google Streetview API, then passes the returned images to the Azure Computer Vision API to determine whether each location is accessible.
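The collection step can be sketched as building a Street View Static API request for each campus coordinate. This is a minimal illustration, not our exact collection script; the coordinates and `API_KEY` are placeholders.

```python
from urllib.parse import urlencode

STREETVIEW_ENDPOINT = "https://maps.googleapis.com/maps/api/streetview"

def streetview_url(lat: float, lng: float, api_key: str, size: str = "640x640") -> str:
    """Build a Street View Static API request URL for one campus coordinate."""
    params = urlencode({"location": f"{lat},{lng}", "size": size, "key": api_key})
    return f"{STREETVIEW_ENDPOINT}?{params}"

# Fetching the image is then a single GET, e.g.:
#   image_bytes = requests.get(streetview_url(33.7756, -84.3963, API_KEY)).content
```

Each fetched image is what we hand off to Azure Computer Vision in the next step.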

Some things we look for in the analyzed images are sidewalks, crosswalks, ramps, and the absence of stairs.

We store the results of our campus evaluation in an Azure Cosmos database for use in our web app.
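Each evaluated location becomes one JSON document in Cosmos DB. Since Cosmos DB stores JSON, the raw image bytes are base64-encoded into the document (this encoding was one of our learning challenges). The field names below are illustrative, not our exact schema.

```python
import base64
import uuid

def make_location_item(lat: float, lng: float, image_bytes: bytes, accessible: bool) -> dict:
    """Build a Cosmos DB item for one evaluated campus location.
    The image is base64-encoded so it can live inside the JSON document."""
    return {
        "id": str(uuid.uuid4()),
        "lat": lat,
        "lng": lng,
        "accessible": accessible,
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    }

# With the azure-cosmos SDK, writing the item looks roughly like:
#   container.upsert_item(make_location_item(lat, lng, image_bytes, accessible))
```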

The AI Dog interface is an accessible website where a user can easily find an accessible route to where they need to go on campus.

For the frontend, we set up a Static Web App in Azure that hosts our HTML, CSS, and JS files from our GitHub repository.

Languages: Python, JavaScript
Frameworks: Bootstrap, Vanilla JavaScript
Platforms: Web
Cloud services: Azure, Google Colab
Databases: Azure Cosmos DB
APIs:
- Azure Computer Vision API - to process and get data from images
- Azure Cosmos DB - storage
- Azure Functions - hosting our server-side code
- Azure Static Web App - hosting our web application
- Google Streetview API

Challenges we ran into

- Deciding which project to choose out of four options under our inspiration topic
- Efficiently processing and storing our unstructured (location and image) data (it was huge!)
- Connecting our backend work with our frontend work
- Collecting enough data for multiple colleges, which took longer to run than expected
- Parsing the JSON response from the Google Streetview API
- Finding the closest latitude and longitude to the user's latitude and longitude
- Making the Cosmos DB calls more efficient (they were initially close to 20 seconds, but we got them down to a few seconds)
- Consuming the Azure Function API in our HTML app; we had issues with the POST request, but eventually solved it using an AJAX request
- Encoding the images to be saved in Cosmos DB was a new learning challenge
- Determining which of the locations we were mapping were relevant to a search
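The closest-coordinate challenge above reduces to a nearest-neighbor search over the stored locations. A brute-force sketch using the haversine distance (enough for a campus-sized dataset) might look like this; the dictionary field names are assumptions:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_m(lat1: float, lng1: float, lat2: float, lng2: float) -> float:
    """Great-circle distance in meters between two (lat, lng) points."""
    lat1, lng1, lat2, lng2 = map(radians, (lat1, lng1, lat2, lng2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lng2 - lng1) / 2) ** 2
    return 2 * 6371000 * asin(sqrt(a))

def closest_location(user_lat: float, user_lng: float, locations: list) -> dict:
    """Return the stored location nearest to the user's coordinates."""
    return min(locations, key=lambda p: haversine_m(user_lat, user_lng, p["lat"], p["lng"]))
```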

Accomplishments that we're proud of

Coming together as a team of randomly matched teammates from very diverse backgrounds and skill levels, and learning from each other while working with Azure technologies.

What we learned

We gained a much better understanding of various Azure services, including Azure Computer Vision, Cosmos DB, Azure Functions, and Static Web Apps.

What's next for AI Dog

We would like to add more college/university campuses to our database, and look to expand beyond higher ed into other locations as well.
