We wanted to make it easier for people who are blind to communicate privately.

What it does

It lets a user record a series of vibrations and send them to another person. The meaning of each vibration pattern is agreed on between users, so they can communicate in a Morse-code-like way. The red circle button records, the green triangle button plays back, and the blue copy button copies a link for sending the message to someone.
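Under the hood, a recorded message can be represented as a vibration pattern: alternating vibrate/pause durations in milliseconds, which the browser's Vibration API (`navigator.vibrate`) can play back. Here is a minimal sketch of that idea; the function names are illustrative and not the project's actual code:

```javascript
// Convert recorded tap events (press/release timestamps in ms) into a
// Vibration API pattern: [vibrate, pause, vibrate, pause, ...].
function tapsToPattern(taps) {
  const pattern = [];
  for (let i = 0; i < taps.length; i++) {
    const { down, up } = taps[i];
    pattern.push(up - down);               // how long this tap vibrates
    if (i + 1 < taps.length) {
      pattern.push(taps[i + 1].down - up); // pause until the next tap
    }
  }
  return pattern;
}

// Play the pattern back where the Vibration API is available.
function playPattern(pattern) {
  if (typeof navigator !== "undefined" &&
      typeof navigator.vibrate === "function") {
    navigator.vibrate(pattern);
  }
}

// Example: two 100 ms taps separated by a 300 ms pause.
// tapsToPattern([{down: 0, up: 100}, {down: 400, up: 500}])
//   → [100, 300, 100]
```

The receiver can then replay the same pattern array, so both sides feel an identical rhythm.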

How we built it

We hosted the app on Amazon Web Services. Our technology stack included JavaScript and HTML/CSS, and we used Web APIs to develop it.

Challenges we ran into

We ran into issues while testing. Different phone operating systems ship different browsers and browser versions, and the Web API we used was not supported on all of them. We also had difficulty finding a device that supported it.

Accomplishments that we're proud of

We are satisfied that we finished the MVP. We identified tools that helped us collaborate efficiently and build apps together quickly (such as the Cloud9 IDE). We are also proud of how well we worked as a team, having all come together without knowing one another.

What we learned

Members of our team had never used Web APIs before; we learned how to work with them while building this project.

What's next for Haptic Messager

We would like to build a full version with all the components needed to scale to a large number of users.
