Inspiration

My cousin is 19 years old, on the autism spectrum, and currently applying to college. His mother's greatest fear is letting him go out on his own, because he has difficulty reacting appropriately to stressful situations, such as getting lost or being approached by the police. She fears he could get lost, get hurt, or be taken advantage of. He is high-functioning and eager to enter higher education, but with his difficulty interacting socially, he may not be able to live independently.

I want to provide peace of mind. When in danger, I don't want him to have to worry about telling emergency services exactly what they need to know about the situation. When lost, I don't want him to worry about stumbling into the wrong neighborhood and never finding his way back home. And most of all, I want his mother to feel comfortable letting him leave home and pursue the independent adult life he is meant to live.

So for our project, we decided to create an IoT accessibility app for people who are in dangerous situations and need to communicate discreetly and non-verbally with emergency services.

Our Vision

Our goal was to design a phone-like device that sends a personalized message to emergency services when a trigger sequence is activated. The trigger could range from pressing the volume buttons in a certain order to squeezing the phone's force-touch sensor hard for an extended period. The sequence also gives the user feedback: once it begins, the phone vibrates to signal that it is about to call the police.
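
To make the trigger idea concrete, here is a minimal sketch of the sequence-matching logic. The button names, the three-second window, and the vibrate() stub are placeholders we made up for illustration, not the device's real firmware:

```python
import time

# Hypothetical trigger: VOL_UP, VOL_UP, VOL_DOWN pressed within a 3-second window.
TRIGGER_SEQUENCE = ["VOL_UP", "VOL_UP", "VOL_DOWN"]
WINDOW_SECONDS = 3.0


def vibrate():
    """Stub for the haptic feedback warning the user a 911 call is imminent."""
    print("bzzt - phone is about to contact emergency services")


class TriggerDetector:
    def __init__(self, sequence, window):
        self.sequence = sequence
        self.window = window
        self.presses = []  # (button, timestamp) pairs

    def on_press(self, button):
        now = time.monotonic()
        self.presses.append((button, now))
        # Keep only presses inside the timing window.
        self.presses = [(b, t) for b, t in self.presses if now - t <= self.window]
        recent = [b for b, _ in self.presses]
        if recent[-len(self.sequence):] == self.sequence:
            vibrate()       # warn the user before dialing
            return True     # caller should start the emergency flow
        return False


# Simulated button events; on real hardware these would come from GPIO interrupts.
detector = TriggerDetector(TRIGGER_SEQUENCE, WINDOW_SECONDS)
for press in ["VOL_DOWN", "VOL_UP", "VOL_UP", "VOL_DOWN"]:
    if detector.on_press(press):
        print("Trigger sequence detected - starting emergency call")
```

The timing window matters: it keeps stray, accidental presses from accumulating into a false trigger over the course of a day.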

The message sent to the police is a pre-recorded audio file entered into the device beforehand, populated dynamically with location data and specific health information such as a disability or mental illness. Alongside the message, the person's health data, location, and any other pertinent information would be sent as an information packet to 911 services, which could guide police in how they approach the caller.
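
As a rough sketch of how the outgoing message and information packet could be assembled — the template text, profile fields, and coordinates below are illustrative placeholders, not our actual schema:

```python
import json
from datetime import datetime, timezone

# Illustrative user profile; on the real device this would come from the
# pre-entered settings and the medical-information database.
PROFILE = {
    "name": "John Doe",
    "conditions": ["autism spectrum disorder"],
    "emergency_contact": "+15551234567",
}

MESSAGE_TEMPLATE = (
    "This is an automated call on behalf of {name}. "
    "They are non-verbal in stressful situations and may not respond to questions. "
    "Their current location is latitude {lat}, longitude {lon}."
)


def build_call_payload(profile, lat, lon):
    """Fill the pre-written template with live location data and bundle the
    information packet that accompanies the call."""
    message = MESSAGE_TEMPLATE.format(name=profile["name"], lat=lat, lon=lon)
    packet = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "location": {"lat": lat, "lon": lon},
        "health": profile["conditions"],
        "emergency_contact": profile["emergency_contact"],
    }
    return message, packet


message, packet = build_call_payload(PROFILE, 40.4406, -79.9959)
print(message)
print(json.dumps(packet, indent=2))
```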

What We Used

  • IBM Watson for text-to-speech
  • Twilio for automated calls (the Watson-to-Twilio hand-off is sketched right after this list)
  • Qualcomm Snapdragon 410c board for the hardware human-interaction side of the app
  • StdLib for service configuration
  • MongoDB to create a database that mimics a medical information API
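
Here is roughly how the Watson-to-Twilio hand-off could look in Python. The credentials, phone numbers, and service URL are placeholders, and the audio hand-off is simplified: Twilio's <Play> verb needs the synthesized file hosted at a public URL, so this sketch falls back to Twilio's own <Say> verb for the spoken message.

```python
from ibm_watson import TextToSpeechV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator
from twilio.rest import Client

message = "This is an automated emergency call on behalf of John Doe."

# 1. Synthesize the message with IBM Watson Text to Speech.
tts = TextToSpeechV1(authenticator=IAMAuthenticator("WATSON_API_KEY"))
tts.set_service_url("https://api.us-south.text-to-speech.watson.cloud.ibm.com")
wav = tts.synthesize(message, voice="en-US_AllisonV3Voice",
                     accept="audio/wav").get_result().content
with open("emergency_message.wav", "wb") as f:
    f.write(wav)  # would need public hosting for Twilio to <Play> it

# 2. Place the automated call with Twilio.
client = Client("TWILIO_ACCOUNT_SID", "TWILIO_AUTH_TOKEN")
call = client.calls.create(
    to="+15550000000",     # dispatch/test number, not a real 911 endpoint
    from_="+15551111111",  # Twilio-provisioned number
    twiml=f"<Response><Say>{message}</Say></Response>",
)
print("Call queued:", call.sid)
```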

Challenges

We had difficulty at first getting used to the Snapdragon 410c board. Our attempt to boot from an SD card to change the OS failed and cost us time. We tried switching to a Raspberry Pi, but the hardware lab was out of power cords and the Raspberry Pi could not draw enough power to start. Another difficulty was getting the Snapdragon to talk to our Microsoft Azure Linux virtual machine. Finally, none of the major personal medical APIs were open, so we were forced to create mock medical data in a separate database.
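
Our mock of a medical information API boiled down to something like this; the database and field names are illustrative, not a real medical schema:

```python
from pymongo import MongoClient

# Local MongoDB standing in for the closed personal-medical APIs.
client = MongoClient("mongodb://localhost:27017")
records = client["medical_mock"]["records"]

# Seed a mock record shaped the way a real medical API might expose it.
records.insert_one({
    "user_id": "user-001",
    "name": "John Doe",
    "conditions": ["autism spectrum disorder"],
    "medications": [],
    "emergency_contact": "+15551234567",
})

# At call time, the device looks the caller up and attaches the record
# to the information packet sent alongside the 911 call.
record = records.find_one({"user_id": "user-001"}, {"_id": 0})
print(record)
```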

Accomplishments

We learned StdLib and Twilio, how to work with a VM in the cloud, and how to interface with the Snapdragon. We built the device's button trigger, and we were able to generate the phone call from text to be sent to emergency services.

Moving Forward

Moving forward, we want the device to communicate properly with StdLib so that the hardware and software sides connect end to end. We also learned that we should recruit a team member with software experience, such as working with JSON APIs, since our team was all hardware-heavy.
