Inspiration
We were inspired by our mothers, who are both educators of young children. Many parents want to know what their children do at school, but at younger ages the question "What did you do at school today?" is rarely met with anything beyond shrugs and incoherent answers. The responsibility to communicate then falls to teachers. We looked at the other products on the market and saw a way to use AI and machine learning to automate the process, helping teachers share students' foundational education experiences with their guardians.
What it does
Our app and camera system monitors your kids throughout the school day and notifies you of noteworthy events with a collection of photos of your student, personally curated by our learning system.
How we built it
We built this technology with Android Studio for the mobile app and Python for the data-processing/machine-learning back end. The back end was built with Google Cloud Vision, scikit-learn, Gensim, and Facebook's bAbI dataset, and it communicates with the mobile application via Firebase's Realtime Database.
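To make the curation step concrete, here is a minimal sketch of how a photo could be scored as "noteworthy" from image labels of the kind Google Cloud Vision's label detection returns. The label names, weights, and function names are illustrative assumptions, not our actual model:

```python
# Hypothetical sketch: score photos by the classroom activities their
# labels suggest, then keep only the most noteworthy ones for parents.
# Labels are (description, confidence) pairs, as Cloud Vision's
# label_detection would provide. Weights below are made-up examples.

NOTEWORTHY_LABELS = {
    "painting": 0.9,
    "reading": 0.8,
    "playground": 0.7,
    "music": 0.8,
}

def score_photo(labels):
    """Return a noteworthiness score in [0, 1] for one photo's labels."""
    best = 0.0
    for description, confidence in labels:
        weight = NOTEWORTHY_LABELS.get(description.lower(), 0.0)
        best = max(best, weight * confidence)
    return best

def curate(photos, threshold=0.5, max_photos=5):
    """Keep up to max_photos top-scoring photos above the threshold.

    photos: list of (filename, labels) pairs.
    """
    scored = [(score_photo(labels), name) for name, labels in photos]
    kept = sorted((s, n) for s, n in scored if s >= threshold)
    return [name for _, name in reversed(kept)][:max_photos]
```

In the real pipeline, the curated photo list would then be written to Firebase's Realtime Database, which the Android app listens to for parent notifications.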
Challenges we ran into
We had to make many parts work fluidly together in a short amount of time. We also ran into some technical challenges that took a couple of creative workarounds to get through. Lastly, my computer restarted unexpectedly at least five times, probably because I was trying to do so much on it.
Accomplishments that we're proud of
We are proud that we were able to build a system that will help with parental communication in elementary school classrooms and, hopefully, in the future offset some of the heavy workload carried by elementary and pre-K teachers (who could use the help).
What we learned
A lot. We can't wait to tell you, but here are some hints: NLP, app development, and family.
What's next for xylophone
We hope to finish up the remaining fixes, beta test the project at local elementary schools, and learn about the user experience of the app in the real world.