There are so many APIs we could combine to create a helpful hack. It would be silly not to use them to help others.

What it does

It helps blind people interact with their environment. It describes everything they cannot see as they walk into a smart room fitted with an Estimote beacon. It also welcomes the user as they enter the room and says goodbye as they leave. A lot of work went into the backend, and over bad wifi it takes roughly a minute for a call to make its way around the many APIs we use.
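The welcome/goodbye behaviour above boils down to tracking one piece of state: is the user inside the room or not? A minimal Python sketch of that logic, assuming the Estimote SDK delivers enter/exit region events (the `RoomGreeter` class and its method names are our own illustration, not part of the Estimote API):

```python
from enum import Enum


class Presence(Enum):
    OUTSIDE = "outside"
    INSIDE = "inside"


class RoomGreeter:
    """Tracks whether the user is inside the smart room and produces
    a welcome or goodbye message on each transition."""

    def __init__(self, room_name):
        self.room_name = room_name
        self.state = Presence.OUTSIDE

    def on_beacon_event(self, entered):
        """Called when the beacon SDK reports a region enter/exit.
        Returns a message to speak aloud, or None if nothing changed."""
        if entered and self.state is Presence.OUTSIDE:
            self.state = Presence.INSIDE
            return f"Welcome to the {self.room_name}."
        if not entered and self.state is Presence.INSIDE:
            self.state = Presence.OUTSIDE
            return f"Goodbye, leaving the {self.room_name}."
        return None  # duplicate event, ignore it
```

Keeping the state machine explicit also filters out the duplicate enter/exit events beacons tend to fire, so the user only hears each greeting once.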

How we built it

We have two clients, a mobile app and an OS X application, with a big server running a Python script in the middle that returns the image-recognition results.
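The server's job in that middle layer can be sketched in a few lines of Python: decode the image a client sends, run it through image recognition, and return a sentence for the app to read aloud. This is a hypothetical illustration of the flow, not our actual code; the JSON payload shape and the injected `recognise` function stand in for whatever vision API the server wraps:

```python
import base64
import json


def handle_image_request(raw_body, recognise):
    """Server-side handler: decodes a JSON payload from a client,
    runs image recognition, and returns a JSON response containing
    a spoken-style description of the scene."""
    payload = json.loads(raw_body)
    # Clients send the camera frame as a base64 string
    image_bytes = base64.b64decode(payload["image"])
    # `recognise` wraps the image-recognition API and returns labels
    labels = recognise(image_bytes)
    description = "I can see " + ", ".join(labels) + "."
    return json.dumps({"description": description})
```

Passing the recogniser in as a function keeps the handler testable without network access, which matters when each round trip across the APIs already takes up to a minute on bad wifi.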

Challenges we ran into

So many problems, such as an overloaded stack that forced Remi to completely rewrite our server side at 5am. Rough times, but somehow we pulled through.

Accomplishments that we're proud of

The many APIs we managed to implement. I am especially proud of the image recognition. Spotify was integrated in the last 5 minutes, which was impressive.

What we learned

So much. So, so much. How easy Python is... How hard Swift is...

What's next for smart room
