Inspiration

Find Your Seat Challenge: Kate helped us personally so we would understand all the different ways someone who’s blind navigates, whether it be by compass (N, S, E, W), a clock face, or degrees. We wanted to make sure all aspects of getting into a dining hall, finding a seat, and knowing who’s there are covered. She needs to know what’s to eat and get a description of each item, find an empty seat, and not only get a seat but know who’s sitting with or around her.

What it does

Our prototype pairs an Arduino UNO board with an HC-SR04 ultrasonic ranging sensor. The sensor emits an ultrasound pulse at 40 kHz; if an object or obstacle is in its path, the pulse bounces back to the module, and from the travel time and the speed of sound we can calculate the distance. The sensor provides 2 cm to 400 cm (up to 13 ft) of non-contact measurement with a ranging accuracy of up to 3 mm. The device picks up any object in range and notifies the user with a tone so they are aware of an approaching obstacle.

We also implemented an SH_BT_Board Bluetooth transceiver on the prototype. The Bluetooth capability lets the device relay the information it collects to an application on the phone, making it accessible to the user. The app works like Yelp for college students: dining hall menus are uploaded to the app and are accessible via a screen reader, students can review the food, and users can connect with friends. It can also open a camera that runs facial recognition and assists in finding a seat.
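
To make the sensing loop concrete, here is a minimal Arduino sketch of the idea. The pin assignments, the 100 cm alert threshold, and the tone mapping are illustrative assumptions rather than our exact prototype wiring, and the Bluetooth board is assumed to behave as a plain serial device:

```cpp
// Minimal sketch (illustrative wiring): HC-SR04 ranging + tone alert + Bluetooth relay.
#include <SoftwareSerial.h>

const int TRIG_PIN = 9;    // assumed pin for HC-SR04 TRIG
const int ECHO_PIN = 10;   // assumed pin for HC-SR04 ECHO
const int BUZZER_PIN = 8;  // assumed pin for a piezo buzzer
SoftwareSerial bt(2, 3);   // RX, TX to the Bluetooth board (assumed wiring)

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  bt.begin(9600);
}

void loop() {
  // Trigger a 10 µs pulse; the module then emits its 40 kHz ultrasound burst.
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);

  // ECHO stays HIGH for the round-trip time of the pulse (timeout ~5 m range).
  long duration = pulseIn(ECHO_PIN, HIGH, 30000);

  // Distance = travel time * speed of sound / 2 (sound ~0.0343 cm/µs; halved
  // because the pulse travels out and back).
  float distanceCm = duration * 0.0343 / 2.0;

  if (duration > 0 && distanceCm < 100) {
    // Closer objects produce a higher-pitched alert tone.
    tone(BUZZER_PIN, 2000 - (int)(distanceCm * 10), 100);
  }

  bt.println(distanceCm);  // relay the reading to the phone app
  delay(100);
}
```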

How we built it

We built our idea off of Microsoft’s Seeing AI. Seeing AI tells you what’s in front of you while you are walking, helps with facial recognition, and uses the smartphone’s camera to detect and describe nearby people, text, and objects. We used sonar technology to determine the distance and direction of an object: the device tells the user how many steps or feet away the object is and what direction it’s in, using the user’s preferred format (clock face, degrees, or compass). Sketch enabled us to build a rough outline of our app so we could traverse through the menus, and it made it possible to test the app on a phone with a screen reader.
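
As a rough sketch of how the callouts could be formatted, the helpers below convert a bearing in degrees into a clock-face or compass announcement, and a distance into steps. The 75 cm step length and the function names are illustrative assumptions, not our final code:

```cpp
// Illustrative direction formatting for announcements like
// "Chair, 3 steps away at 2 o'clock." Assumes bearings in 0-359 degrees.
#include <Arduino.h>

String toClockFace(int degrees) {
  // 0 degrees = 12 o'clock straight ahead; each hour spans 30 degrees.
  int hour = ((degrees % 360) + 15) / 30 % 12;
  if (hour == 0) hour = 12;
  return String(hour) + " o'clock";
}

String toCompass(int degrees) {
  const char* points[] = {"N", "NE", "E", "SE", "S", "SW", "W", "NW"};
  // Each compass point covers 45 degrees; offset by 22.5 to center the sectors.
  int index = (int)(((degrees % 360) + 22.5) / 45.0) % 8;
  return String(points[index]);
}

int cmToSteps(float distanceCm) {
  // Assumes an average step length of ~75 cm (illustrative value).
  return (int)(distanceCm / 75.0 + 0.5);
}
```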

Our Ideal Situations

An all-inclusive app that everyone would want to use. All, or most, of a user’s friends would use it and have a profile picture to assist with facial recognition. The app would have the vibe of Yelp for college students: school menus would be available through the app and students could review the food. Facial recognition would be advanced enough to work as a user moves through their daily life, via a wearable camera in a necklace, bracelet, or glasses form factor.

Challenges we ran into

We were shocked by how descriptive TapTapSee is; however, it was incredibly slow, while Microsoft’s Seeing AI was significantly faster. The major downfall of both apps is that they had issues describing large, involved scenes: both can distinguish between an empty and an occupied chair, but neither could pick out an empty chair in a large scene. We are also aware that our app requires all of our ideal assumptions to hold, and that’s not always going to be the case: schools may not always update their menus, and we don’t know how attractive this app will be to everyone. During testing of the sonar distance sensor we also had a bit of a casualty with the portable battery and the sonar sensor.

Accomplishments that we're proud of

We are proud that we got a rough prototype of our future application working; we had never been able to pull together any form of our own application before, so this was a huge step for us. Many of us were also working with hardware for the first time and got it to work…before we broke it…using Bluetooth and sonar technology. Our biggest accomplishment has to be that we were able to address all the major points Kate brought to our attention.

What we learned

Each of us learned something new this weekend. We collectively learned how to work with hardware. Emily taught us how to create a prototype of an application and how to get it onto a phone for testing. We grew as a team and stepped it up from our last hackathon, where we had nothing working.

What's next for iSee

We want to take the prototype of our app and turn it into a real, working app. We would also like to take the concepts from the Microsoft Seeing AI application and incorporate them into our own, using our sonar technology to strengthen distance perception and enhance the facial recognition.
