About this Project:
We were inspired by UCLA Secrets. Dispatch is similar, but each story is tied to a specific location: you have to visit a particular avenue at UCLA to hear the story, so the author's writing is anchored to a place.
What it does
Dispatch enables writers and runners in a community to collaborate on creating and enjoying original stories. Anyone can write a story in a genre of their choosing; once published, it can be listened to by anyone in the community as they run. The writing feature lets anyone contribute to the database of stories that runners listen to, and runners near a given location gain access to the stories written by their community members. Dispatch also creates a running path for the runner and plays the story in the background through a custom UI, and it shows weekly statistics of the user's stories and runs.
How we built it
Most notably, we used Google DeepMind's WaveNet, a machine-learning model that synthesizes speech from text, an important NLP problem. The API we used to access this model, Google Cloud's Text-to-Speech API, was well suited to our application. Using React Native, Google Cloud, and multiple APIs, our team divided the work between frontend and backend. Throughout the week of the hackathon the team met every day to discuss our progress and goals, which helped guide us in implementing all of the features we wanted. Within the frontend team, each member took ownership of one of the main screens; it was each member's responsibility to ensure their screen was functional, and as a team we checked that no merge conflicts would arise. Together we created an app that is cohesive in both design and function.
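To make the text-to-speech step concrete, here is a minimal sketch of the request shape Google Cloud Text-to-Speech expects when selecting a WaveNet voice. The story text and the choice of `en-US-Wavenet-D` are illustrative; the actual client call (shown in comments) requires the `@google-cloud/text-to-speech` package and Google Cloud credentials, so only the pure request builder is included here.

```javascript
// Build the request body sent to Google Cloud Text-to-Speech. WaveNet voices
// are selected by name (e.g. 'en-US-Wavenet-D'); the text is a story chapter.
function buildTtsRequest(storyText) {
  return {
    input: { text: storyText },
    voice: { languageCode: 'en-US', name: 'en-US-Wavenet-D' }, // WaveNet voice
    audioConfig: { audioEncoding: 'MP3' },
  };
}

// With the official client, the request would be used roughly like this:
//   const tts = require('@google-cloud/text-to-speech');
//   const client = new tts.TextToSpeechClient();
//   const [response] = await client.synthesizeSpeech(buildTtsRequest(chapter));
//   fs.writeFileSync('chapter.mp3', response.audioContent, 'binary');
```

The resulting MP3 is what the app streams to the runner through Expo Audio.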
Challenges we ran into
Connecting to the Google APIs, and coordinating between the frontend and backend: our team is spread across different time zones, and several of us also had to work during weekdays.
Accomplishments that we're proud of
We coordinated the work of 5 people into a working app in a short amount of time. The app uses state-of-the-art Google APIs: WaveNet for natural-sounding speech synthesis of story text, and Firebase for database transactions. We also combined the Google Maps API with a third-party library to query the closest locations to a given point of interest. All of this culminates in an app connecting runners to story writers, creating a new way not only to enjoy our daily exercise but also to learn more about the community.
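The closest-location query above can be sketched with a great-circle (haversine) distance and a sort. The field names (`lat`, `lng`, `title`) are illustrative, not our actual schema:

```javascript
// Great-circle distance in kilometers between two { lat, lng } points.
function haversineKm(a, b) {
  const toRad = (deg) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h));
}

// Rank stories by distance from the runner and keep the nearest few.
function closestStories(runnerPos, stories, limit = 5) {
  return stories
    .map((s) => ({ ...s, distanceKm: haversineKm(runnerPos, s) }))
    .sort((x, y) => x.distanceKm - y.distanceKm)
    .slice(0, limit);
}
```

In the app, the result of this query is what populates the list of stories a runner can start from their current position.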
What we learned
We learned the various intricacies of React Native hooks. We mainly learned how to use the Google Cloud DeepMind WaveNet API and how to configure it to our liking. We also learned how to use different npm packages to display UI elements such as bar graphs and line graphs on our frontend.
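As an example of the charting work, here is a sketch of how a week of runs might be shaped into the `{ labels, datasets: [{ data }] }` structure that react-native-chart-kit's LineChart and BarChart components expect. The `runs` array and its field names (`day`, `distanceKm`) are illustrative:

```javascript
// Aggregate runs into per-weekday totals in the data shape used by
// react-native-chart-kit: { labels, datasets: [{ data }] }.
function weeklyChartData(runs) {
  const labels = ['Mon', 'Tue', 'Wed', 'Thu', 'Fri', 'Sat', 'Sun'];
  const totals = new Array(7).fill(0);
  for (const run of runs) {
    // run.day: 0 = Monday ... 6 = Sunday; run.distanceKm: distance of that run
    totals[run.day] += run.distanceKm;
  }
  return { labels, datasets: [{ data: totals }] };
}
```

The returned object can be passed directly as the `data` prop of the chart component.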
What's next for Dispatch
We want to incorporate real rewards for writers to give them greater incentive to write: the more listeners a story attracts, the bigger the reward its author earns. We also want to add interactivity to the stories. For instance, the runner may be given a fork in the path just as the hero in the story is about to fight the dragon: if the runner turns right, the hero swings his sword; if the runner turns left, the hero tries to convince the dragon to let the princess go.
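The interactive-fork idea above could be modeled as a graph of story nodes whose branches are keyed by the turn the runner's GPS reports. Everything here (node ids, text, the `nextNode` helper) is a hypothetical sketch, not implemented yet:

```javascript
// Story nodes with optional branches keyed by the runner's turn direction.
const storyNodes = {
  dragon: {
    text: 'The hero stands before the dragon...',
    branches: { right: 'swordFight', left: 'negotiate' },
  },
  swordFight: { text: 'The hero swings his sword!', branches: null },
  negotiate: {
    text: 'The hero tries to convince the dragon to let the princess go.',
    branches: null,
  },
};

// Advance the story given a detected turn ('left' or 'right');
// stay on the current node if it has no fork for that turn.
function nextNode(nodes, currentId, turn) {
  const branches = nodes[currentId].branches;
  return branches && branches[turn] ? branches[turn] : currentId;
}
```

Each time the app detects a turn at a fork, it would look up the next node and synthesize its text for playback.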
What languages, frameworks, cloud services, databases, API’s, or other technologies did you use?
Google Cloud’s DeepMind WaveNet API
Google Cloud’s Directions API
Google Cloud’s Firebase
React Native / Expo
Node Express server
Expo Audio
Expo FileSystem
Expo GPS API
react-native-chart-kit