Smart home technology has been largely popularized by Amazon, Google, Microsoft, IBM, and others. However, we saw that there was little infrastructure in place for users who are deaf or hard of hearing. To increase the accessibility of smart home technology, we set out to create Eddie, named after Thomas Edison, the famous inventor who was also hard of hearing.

What it does

To use Eddie, the user performs one of the built-in hand gestures in front of the motion sensor; each gesture corresponds to a command. For example, the screen-tap gesture (pointing horizontally forwards) corresponds to the command to get the weather. Eddie then displays the output as large text on a computer screen.
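The gesture-to-command flow above can be sketched as a simple lookup. This is an illustrative sketch, not the actual project code: the command names and mappings (other than screen tap → weather) are assumptions, though the gesture type names ("screenTap", "keyTap", "circle") are the ones the Leap Motion JavaScript API reports.

```javascript
// Illustrative mapping from Leap Motion gesture types to Eddie's commands.
// Only screenTap -> weather is confirmed by the write-up; the rest are
// hypothetical placeholders.
const COMMANDS = {
  screenTap: "weather", // pointing horizontally forwards -> get the weather
  keyTap: "dogGif",     // hypothetical mapping
  circle: "news"        // hypothetical mapping
};

// Pure helper: translate a recognized gesture into a command name,
// or null if the gesture has no binding.
function commandForGesture(gestureType) {
  return COMMANDS[gestureType] || null;
}

// With the real sensor, the leapjs frame loop would drive this, e.g.:
// Leap.loop({ enableGestures: true }, frame => {
//   frame.gestures.forEach(g => {
//     const cmd = commandForGesture(g.type);
//     if (cmd) runCommand(cmd); // runCommand would show large text on screen
//   });
// });
```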

How we built it

Eddie was built with JavaScript, HTML, and CSS, and uses the Leap Motion sensor. To complete the tasks we programmed Eddie to perform (returning the weather, a random dog GIF, and a recent news headline), we used the OpenWeatherMap, Giphy, and News APIs. With the Leap Motion sensor, Eddie can distinguish between hand gestures and, using these APIs, complete the tasks the user gives it.
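As one example of the API work, the weather task could look like the sketch below. The function and variable names are illustrative, not Eddie's actual code; the request URL and response fields (`name`, `main.temp`, `weather[0].description`) follow OpenWeatherMap's current-weather endpoint.

```javascript
// A real request to OpenWeatherMap would look something like:
//   fetch(`https://api.openweathermap.org/data/2.5/weather?q=Boston&units=imperial&appid=${API_KEY}`)
//     .then(res => res.json())
//     .then(data => display(formatWeather(data)));
// where display() would render the text in large type on the screen.

// Pure formatter over the parts of the response Eddie needs.
function formatWeather(data) {
  const temp = Math.round(data.main.temp);
  const description = data.weather[0].description;
  return `${data.name}: ${temp}°F, ${description}`;
}

// Example with a stubbed response:
const sample = {
  name: "Boston",
  main: { temp: 52.3 },
  weather: [{ description: "light rain" }]
};
console.log(formatWeather(sample)); // "Boston: 52°F, light rain"
```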

Challenges we ran into

When we set out to build Eddie, we planned to write it mainly in Python. However, we found that the Leap Motion sensor's Python code did not run on our computers. Luckily, we were able to work with the Leap Motion sensor in JavaScript and HTML. One of our original ideas, though, was for Eddie to send its output to the user as a text message that could be read on a Fitbit or other smart wearable. Using JavaScript proved to be a problem here because we could not get the Twilio API working in the time we had, so we decided to display Eddie's output on a computer screen instead.

Accomplishments that we're proud of

We are proud of our group's ability to start with a plan and adapt when parts of the original idea ran into problems, like switching from a text message output to a display on a computer screen. Overall, we are very proud of what we accomplished at BostonHacks (for all of us, this was our first hackathon!).

What we learned

We learned how to create a project outside of our comfort zone, working with APIs we had never used before and with the Leap Motion sensor, and how to help make technology more accessible.

What's next for Eddie

There is a lot of room for Eddie to grow. In its ideal form, it could sense the user's hand movements from anywhere in the room, rather than being limited to the relatively small radius around the Leap Motion sensor. In the future, Eddie would also have text message output capabilities.
