With recent advances in NLP, we noticed a gap in how people report problems. When it's impossible to text (e.g. while driving), a report can be missed in the moment. The other medium, calling, lacks immediacy: automated phone lines are severely bottlenecked, and some still require direct interaction with your mobile device.
What it does
Hence our app: NotifyMi. We integrate with Google Assistant for immediate, efficient communication between users and a reporting service. By holding a natural conversation with the user, we gather exactly the information needed to make a proper report - including location support - and submit it in far less time than conventional mediums (texting/calling). People on the road can now drive entirely distraction-free and still alert others or first responders to urgent information (e.g. icy conditions on a road, or an impaired driver and their license plate).
How we built it
We used Android Studio to build an app that aggregates reports and sends push notifications to users who want updates. We then used Google's Dialogflow to create conversation-like paths that let users report issues on the road. Each conversation captures the type of report and its location, which are stored in Firebase.
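The step from a finished Dialogflow conversation to a Firebase record might look like the following. This is a minimal sketch, not the app's actual code: the parameter names (`report-type`, `lat`, `lng`), the `Report` class, and the storage path layout are all our own assumptions for illustration.

```java
import java.util.Map;

public class ReportHandler {

    // Hypothetical report record: a category plus coordinates.
    public static final class Report {
        public final String category;
        public final double lat, lng;
        public Report(String category, double lat, double lng) {
            this.category = category;
            this.lat = lat;
            this.lng = lng;
        }
    }

    // Map a Dialogflow-style parameter map (assumed keys) to a Report.
    public static Report fromParameters(Map<String, Object> params) {
        String category = (String) params.getOrDefault("report-type", "other");
        double lat = ((Number) params.getOrDefault("lat", 0.0)).doubleValue();
        double lng = ((Number) params.getOrDefault("lng", 0.0)).doubleValue();
        return new Report(category, lat, lng);
    }

    // Firebase-style path the record would be written under (assumed layout).
    public static String firebasePath(Report r) {
        return "reports/" + r.category;
    }
}
```

A fulfillment webhook would call `fromParameters` on the intent's parameters and write the resulting record under `firebasePath`, which is what lets the app later query all reports of one category.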
Challenges we ran into
With NLU (natural language understanding) tools and Dialogflow V2 having been released only very recently, we found few resources for building an app around Google Assistant.
Accomplishments that we're proud of
It works! We can use Google Assistant to report several categories of issues and store them in Firebase. The app then displays them to the user and shows them on Google Maps. Once an issue has been reported enough times, the app sends a notification to all users alerting them to the problem.
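The "reported enough times" trigger can be sketched as a simple per-category counter; the class name and the idea of a fixed threshold are assumptions for illustration, not necessarily how the app decides when to notify.

```java
import java.util.HashMap;
import java.util.Map;

public class ReportAggregator {
    private final Map<String, Integer> counts = new HashMap<>();
    private final int threshold;

    public ReportAggregator(int threshold) {
        this.threshold = threshold;
    }

    // Record one report; return true exactly once, at the moment the
    // count reaches the threshold - i.e. when the push notification
    // to all users would be sent.
    public boolean addReport(String category) {
        int n = counts.merge(category, 1, Integer::sum);
        return n == threshold;
    }
}
```

Returning `true` only on the crossing report (not on every report after it) keeps users from being notified repeatedly about the same problem.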
What we learned
Developing with new technology is NEVER easy, no matter how readily available it is.