Inspiration

Every member of our team has, at some point, had to help an elderly person who was struggling with whatever communication app was installed on their phone. Sometimes, no matter how much help we give, it remains difficult for them to take advantage of everything the app can do. Sometimes they give up on new technology altogether because it is simply too much. This is unfortunate, because we don't want them to give up just because they lack the intuition we have for figuring out the basic features of an app. We want them to be able to use the technologies we use, not be excluded from them. On top of that, these apps are sometimes their only way to reach their loved ones, and getting a hold of their healthcare professionals takes far too many steps.

What it does

The main goal of the app is to make communication with loved ones and health professionals as easy as Android allows for elderly users. Gone are UI components like "hamburger" menus and actions hidden behind swipes: every action is immediately visible on every screen of the app. On every screen, the user also has access to context-specific help and a smart bot that understands spoken commands through the Google Cloud API. These spoken commands trigger actions like calling a contact or emergency services (by saying "please help", for example). The app is also a great tool for managing interactions with health professionals: you interact with them the same way you would with your contacts, which makes them feel far more approachable and may encourage users to seek treatment for their ailments.
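For illustration, here is a minimal Kotlin sketch of how a spoken command could be captured on Android and handed off for intent detection. It uses the platform's built-in RecognizerIntent rather than a particular Google Cloud client, and the request code and handleSpokenCommand() helper are hypothetical, not the app's actual code.

```kotlin
import android.app.Activity
import android.content.Intent
import android.speech.RecognizerIntent

// Minimal sketch of capturing a spoken command with Android's built-in
// speech recognizer. The request code and handleSpokenCommand() are
// hypothetical; Papiyon may wire up Google's speech-to-text differently.
class VoiceCommandActivity : Activity() {

    private val speechRequestCode = 1001

    // Launch the system speech recognizer with a friendly prompt.
    fun listenForCommand() {
        val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
            putExtra(
                RecognizerIntent.EXTRA_LANGUAGE_MODEL,
                RecognizerIntent.LANGUAGE_MODEL_FREE_FORM
            )
            putExtra(RecognizerIntent.EXTRA_PROMPT, "How can I help?")
        }
        startActivityForResult(intent, speechRequestCode)
    }

    // Receive the transcription and forward it to the intent-detection step.
    override fun onActivityResult(requestCode: Int, resultCode: Int, data: Intent?) {
        super.onActivityResult(requestCode, resultCode, data)
        if (requestCode == speechRequestCode && resultCode == RESULT_OK) {
            val transcript = data
                ?.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS)
                ?.firstOrNull()
                ?: return
            handleSpokenCommand(transcript)   // e.g. "please help" or "Call Martin"
        }
    }

    // Hypothetical handler; see the Dialogflow sketch under "How we built it".
    private fun handleSpokenCommand(transcript: String) {
        // ... send transcript to Dialogflow and act on the detected intent
    }
}
```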

How we built it

We used Google Cloud's Dialogflow to detect intents from text queries. With Google's speech-to-text recognition, spoken user commands are transcribed and interpreted by Dialogflow. Dialogflow extracts the intent of the request, and with it the app instantly performs the appropriate action. For example, after the user says "Call Martin", the intent and the name parameter are identified and the call activity is launched.
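Below is a minimal Kotlin sketch of this flow, assuming the Google Cloud Dialogflow Java client library. The project ID, session ID, intent names ("call.contact", "emergency.help"), the "name" parameter, and the lookUpNumber() helper are placeholders for however the agent is actually configured, not the app's real setup.

```kotlin
import android.app.Activity
import android.content.Intent
import android.net.Uri
import com.google.cloud.dialogflow.v2.QueryInput
import com.google.cloud.dialogflow.v2.SessionName
import com.google.cloud.dialogflow.v2.SessionsClient
import com.google.cloud.dialogflow.v2.TextInput

// Send a transcribed command to Dialogflow and return the detected intent's
// display name plus its string parameters.
fun detectIntent(
    projectId: String,
    sessionId: String,
    transcript: String
): Pair<String, Map<String, String>> {
    // SessionsClient needs Google Cloud credentials configured for the project.
    val sessions = SessionsClient.create()
    try {
        val session = SessionName.of(projectId, sessionId)
        val textInput = TextInput.newBuilder()
            .setText(transcript)              // e.g. "Call Martin"
            .setLanguageCode("en-US")
            .build()
        val queryInput = QueryInput.newBuilder().setText(textInput).build()

        val result = sessions.detectIntent(session, queryInput).queryResult
        val params = result.parameters.fieldsMap.mapValues { it.value.stringValue }
        return result.intent.displayName to params
    } finally {
        sessions.close()
    }
}

// Hypothetical contact lookup; the real app would query the user's contacts.
fun lookUpNumber(name: String?): String = "5551234567"

// Dispatch the detected intent from an Activity. The intent names and the
// "name" parameter are placeholders for the agent's actual configuration.
// ACTION_CALL requires the CALL_PHONE permission; ACTION_DIAL would work
// without it but adds a confirmation step for the user.
fun dispatch(activity: Activity, transcript: String) {
    val (intentName, params) = detectIntent("papiyon-project", "session-1", transcript)
    when (intentName) {
        "call.contact" ->
            activity.startActivity(
                Intent(Intent.ACTION_CALL, Uri.parse("tel:" + lookUpNumber(params["name"])))
            )
        "emergency.help" ->                   // e.g. "please help"
            activity.startActivity(Intent(Intent.ACTION_CALL, Uri.parse("tel:911")))
    }
}
```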

Challenges we ran into

Even though we all had previous experience with Android development, it was surprisingly difficult to get back into it. A lot had changed, especially the IDE (Android Studio), and it took us a considerable amount of time to get comfortable with it again. The intricacies of Android, like RecyclerViews, Fragments, Activities, and XML layouts, also took a while to fully grasp again.

Accomplishments that we're proud of

We are proud of our implementation of Dialogflow. Knowing that a spoken command like "please help" is understood and immediately triggers an emergency call to 911 is really gratifying, because it could genuinely help elderly people who need a reliable and easy-to-understand communication app. We are also very proud of the general usability of the app: the user experience is free of the complex components that elderly people do not know about or are uncomfortable using. Having speech recognition and help instructions available everywhere is also a great achievement.

What we learned

We learned a lot about Google Cloud, especially Dialogflow. We are now also much more comfortable with Android Studio and Android programming in general.

What's next for Papiyon

It would be awesome if Papiyon could implement more smart actions, like understanding voice commands for automatic appointment booking, automatic driving directions to different health professionals, and so on. We would also like to add more accessibility options for different types of communication, such as live subtitles during voice calls or speech-to-text SMS messages.

By the way, how did you come up with the app's name?

Since we wanted to eliminate social isolation among elderly people, our hope was that previously isolated people would become social butterflies. "Papiyon" means butterfly in Creole.

Built With

Android, Google Cloud Dialogflow, Google Cloud Speech-to-Text