Inspiration

We want to enable people with disabilities to access information efficiently, accurately, and conveniently.

We were inspired to create our hackathon project from Gus’s presentation, where we saw how important access to digital information is for blind people when they’re out and about. However, it is inconvenient to have to find and pull out a phone and interact with the mobile screen reader to access that information.

How it works

We built a system where a smartphone (or any other smart device, for that matter) can be controlled through an armband. The user wears the armband and performs gestures to interact with the phone. The gestures include waving your hand to the right and back again, rotating your hand, and so on. They are simple and easy to perform with one hand while the other holds a cane or a guide dog's harness.

The user receives audio feedback from the screen reader in response to their arm gestures. The armband also provides haptic feedback, generating a short vibration whenever a gesture is processed.
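
The haptic channel can be as simple as writing one byte back to the armband. Below is a minimal sketch, assuming the phone side is an Android app and the armband exposes its vibration motor as a writable BLE characteristic; the UUIDs and the one-byte encoding are placeholders for illustration, not the actual GOsmart protocol.

```kotlin
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCharacteristic
import java.util.UUID

// Placeholder UUIDs; the real armband firmware may expose its haptic
// motor under a different GATT layout.
val HAPTIC_SERVICE: UUID = UUID.fromString("0000aa00-0000-1000-8000-00805f9b34fb")
val HAPTIC_CHAR: UUID = UUID.fromString("0000aa01-0000-1000-8000-00805f9b34fb")

// Ask the armband for a short confirmation buzz once a gesture has
// been processed on the phone.
fun sendHapticPulse(gatt: BluetoothGatt) {
    val ch = gatt.getService(HAPTIC_SERVICE)?.getCharacteristic(HAPTIC_CHAR) ?: return
    // One byte encoding the pulse length in tens of milliseconds (5 -> 50 ms),
    // long enough to feel but short enough not to mask the screen reader's audio.
    ch.value = byteArrayOf(5)
    ch.writeType = BluetoothGattCharacteristic.WRITE_TYPE_NO_RESPONSE
    gatt.writeCharacteristic(ch)
}
```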

Challenges I ran into

We ran into many challenges. Detecting the gestures is difficult because each person has their own way of moving their body, and the signals from the device's sensors are noisy.
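
To give a flavor of the problem, here is a minimal sketch of the kind of filtering involved: an exponential moving average to tame sensor noise, plus a threshold that would have to be calibrated per user. The class name and parameters are illustrative, not our production recognizer.

```kotlin
import kotlin.math.sqrt

// Smooth the raw accelerometer magnitude with an exponential moving
// average, then compare it against a per-user threshold, since everyone
// moves differently.
class WaveDetector(
    private val alpha: Double = 0.2,    // smoothing factor; lower = smoother but laggier
    private val threshold: Double = 1.5 // calibrated per user, in g
) {
    private var smoothed = 0.0

    // Feed one accelerometer sample (x, y, z in g). Returns true when
    // the smoothed motion energy crosses the user's threshold.
    fun onSample(x: Double, y: Double, z: Double): Boolean {
        val magnitude = sqrt(x * x + y * y + z * z)
        smoothed = alpha * magnitude + (1 - alpha) * smoothed
        return smoothed > threshold
    }
}
```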

Another challenge was connecting the phone and the armband reliably. Sending information wirelessly between the two devices is possible, but doing it robustly would take more than a weekend of troubleshooting.
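
For a sense of what this involves, here is roughly what a notification-based BLE link looks like with Android's standard BluetoothGatt API: the phone subscribes to a gesture characteristic so the armband can push events. The service and characteristic UUIDs and the one-byte gesture encoding are placeholders, and the reconnection logic that makes this robust is exactly the part that takes more than a weekend.

```kotlin
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCallback
import android.bluetooth.BluetoothGattCharacteristic
import android.bluetooth.BluetoothGattDescriptor
import android.bluetooth.BluetoothProfile
import java.util.UUID

// Placeholder UUIDs; 0x2902 is the standard descriptor that enables notifications.
val GESTURE_SERVICE: UUID = UUID.fromString("0000bb00-0000-1000-8000-00805f9b34fb")
val GESTURE_CHAR: UUID = UUID.fromString("0000bb01-0000-1000-8000-00805f9b34fb")
val CCC_DESCRIPTOR: UUID = UUID.fromString("00002902-0000-1000-8000-00805f9b34fb")

val gattCallback = object : BluetoothGattCallback() {
    override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
        // Reconnection handling would go here; dropped links are the hard part.
        if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
    }

    override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
        val ch = gatt.getService(GESTURE_SERVICE)?.getCharacteristic(GESTURE_CHAR) ?: return
        // Subscribe so the armband can push gesture events to the phone.
        gatt.setCharacteristicNotification(ch, true)
        val ccc = ch.getDescriptor(CCC_DESCRIPTOR) ?: return
        ccc.value = BluetoothGattDescriptor.ENABLE_NOTIFICATION_VALUE
        gatt.writeDescriptor(ccc)
    }

    override fun onCharacteristicChanged(gatt: BluetoothGatt, ch: BluetoothGattCharacteristic) {
        // Assume one byte per gesture event, e.g. 1 = wave, 2 = rotate, 3 = fist.
        val code = ch.value?.firstOrNull() ?: return
        println("gesture event: $code")
    }
}
```

A connection would then be opened with `device.connectGatt(context, false, gattCallback)`.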

Accomplishments that I'm proud of

We managed to create a prototype where a user can control a smartphone using 3 gestures! The gestures are:

  1. Move hand left and right – Move selection cursor to next item on the phone screen.

  2. Rotate hand – Move selection cursor to previous item on the phone screen.

  3. Make a fist – activate selected item.

These 3 gestures correspond to the 3 basic on-screen gestures people use to interact with a mobile device screen reader.
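
In code, the mapping is a straightforward dispatch. The ScreenReaderControl interface below is a hypothetical stand-in for whatever accessibility hook the phone app uses, not an actual platform API.

```kotlin
enum class Gesture { WAVE, ROTATE, FIST }

// Hypothetical interface standing in for the accessibility hook;
// a real implementation might be an accessibility service.
interface ScreenReaderControl {
    fun focusNext()       // like a one-finger swipe right
    fun focusPrevious()   // like a one-finger swipe left
    fun activateFocused() // like a one-finger double tap
}

// Translate an armband gesture into the equivalent screen reader action.
fun handleGesture(gesture: Gesture, reader: ScreenReaderControl) {
    when (gesture) {
        Gesture.WAVE -> reader.focusNext()
        Gesture.ROTATE -> reader.focusPrevious()
        Gesture.FIST -> reader.activateFocused()
    }
}
```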

What I learned

By talking to Xian, we learned that our system can potentially benefit people with other disabilities as well, although we would have to redesign the arm gestures. For people who grip poles as they walk, for example, we would need finer gestures, such as finger movements, that can be performed while the user is still gripping the pole.

What's next for GOsmart

We plan to improve GOsmart by adding more gestures and making the system more robust. We will also interview people with a range of disabilities and design custom sets of gestures that users can select, depending on their preferences, to interact with the phone.
