Inspiration
We thought that apps and algorithms that drew conclusions about users based on their music playlists were interesting. However, we realized that photos say much more about people than the music they listen to. As a result, we decided to create an app that analyzes a user's Google Photos library to gain insight into what they may be like as a person, whether for personal reflection and analysis or just for fun.
What it does
Our web application pulls images from a user's Google Photos library and analyzes them using the Google Cloud Vision API and Google Photos' content classification. We then run the results through our own algorithm to draw conclusions about the user, from personality traits to recurring themes in their photos.
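As a rough illustration of the analysis step, the sketch below sends a single photo to the Cloud Vision REST endpoint for label detection. It is a minimal sketch under stated assumptions, not our exact code: `visionApiKey`, the `=w512-h512` size parameter, and the `labelPhoto` helper name are placeholders, and error handling is omitted.

```javascript
// Minimal sketch: label one Google Photos image with the Cloud Vision REST API.
// visionApiKey and photoBaseUrl are placeholders; error handling is omitted.
async function labelPhoto(photoBaseUrl, visionApiKey) {
  // Download the image bytes from the Photos base URL (size parameter appended),
  // then base64-encode them, since Vision accepts inline image content.
  const bytes = await fetch(`${photoBaseUrl}=w512-h512`).then((r) => r.arrayBuffer());
  let binary = "";
  new Uint8Array(bytes).forEach((b) => { binary += String.fromCharCode(b); });
  const base64 = btoa(binary);

  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${visionApiKey}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        requests: [
          {
            image: { content: base64 },
            features: [{ type: "LABEL_DETECTION", maxResults: 10 }],
          },
        ],
      }),
    }
  );
  const data = await res.json();
  // Each annotation has a description (e.g. "Beach") and a confidence score
  // that a downstream scoring step can aggregate across photos.
  return data.responses[0].labelAnnotations ?? [];
}
```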
How we built it
We used React.js and integrated the Google Cloud Vision API as well as the Google Photos Library API. Everything is built primarily as a front-end application that consumes RESTful services from both APIs. We also integrated Google OAuth so users can log in with their Google account and sync their photos to our web application. Once synced, the photos are sent to the Google Cloud Vision API for further analysis. When all the data has been collected, it is run through our algorithm, which analyzes various aspects of the images to reach its conclusions.
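The sign-in and photo-sync flow looks roughly like the sketch below. It assumes the Google Identity Services script (https://accounts.google.com/gsi/client) is loaded on the page; the client ID and function names are placeholders rather than our exact implementation.

```javascript
// Sketch of OAuth sign-in plus a Photos Library API call from the browser.
// CLIENT_ID is a placeholder for the app's own OAuth client ID.
const CLIENT_ID = "YOUR_OAUTH_CLIENT_ID";

function requestPhotosAccess(onToken) {
  // Ask the user for read-only access to their Google Photos library
  // using the Google Identity Services token flow.
  const tokenClient = google.accounts.oauth2.initTokenClient({
    client_id: CLIENT_ID,
    scope: "https://www.googleapis.com/auth/photoslibrary.readonly",
    callback: (response) => onToken(response.access_token),
  });
  tokenClient.requestAccessToken();
}

async function listRecentPhotos(accessToken) {
  // List the user's most recent media items via the Photos Library REST API.
  const res = await fetch(
    "https://photoslibrary.googleapis.com/v1/mediaItems?pageSize=50",
    { headers: { Authorization: `Bearer ${accessToken}` } }
  );
  const data = await res.json();
  // Each media item carries a temporary baseUrl for downloading the image bytes.
  return data.mediaItems ?? [];
}
```

Each returned item's baseUrl can then be fed to the Vision call sketched above.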
Challenges we ran into
- Setting up and making authenticated requests to the Google APIs
- Managing stateful data within React (see the sketch after this list)
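For the state management part, the component below is a minimal sketch of how the fetched photos and analysis results can be held in React state; `fetchPhotos` and `analyzePhotos` are hypothetical placeholders for the API calls sketched above.

```javascript
import { useState } from "react";

// Minimal sketch of keeping photos and analysis results in component state.
// fetchPhotos and analyzePhotos are placeholders for the API calls above.
function PhotoInsights({ accessToken }) {
  const [photos, setPhotos] = useState([]);
  const [insights, setInsights] = useState(null);
  const [loading, setLoading] = useState(false);

  async function runAnalysis() {
    setLoading(true);
    try {
      const items = await fetchPhotos(accessToken); // Photos Library API
      setPhotos(items);
      const result = await analyzePhotos(items);    // Cloud Vision + scoring
      setInsights(result);
    } finally {
      setLoading(false);
    }
  }

  return (
    <div>
      <button onClick={runAnalysis} disabled={loading}>
        Analyze my photos
      </button>
      {loading && <p>Analyzing your photos…</p>}
      {photos.length > 0 && <p>Analyzed {photos.length} photos.</p>}
      {insights && <pre>{JSON.stringify(insights, null, 2)}</pre>}
    </div>
  );
}
```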
Accomplishments that we're proud of
- Building a fully functioning site that integrates multiple Google APIs
What we learned
- How to use Google APIs and authentication
What's next for Pythia
We are looking for more ways to interpret and dissect photos so we can draw more accurate conclusions from them. We are also exploring new ways to analyze our current data and provide better insight into the photos.