Our entire team is from New Jersey, so we came to the TechCrunch hackathon via Penn Station, which meant walking to subway stations in the rain with our hands full of bags. This made navigating the city streets especially dangerous. After doing more research, we found that 17,000 pedestrians were injured in the past year, so we created a system that allows pedestrians, the visually impaired, and cyclists to navigate the streets without ever pulling out their phones and being distracted from the road.

What it does

BackMap is a system consisting of a backpack with a vibration motor in each strap, an app for entering navigation data, and a set of beacons. This lets us support two navigation scenarios: one indoors and one outdoors. In the outdoor scenario, the user enters a destination, and the backpack guides them there by vibrating the right strap whenever they need to turn right and the left strap whenever they need to turn left. This helps the visually impaired navigate busy streets, and it also lets general pedestrians and cyclists keep their eyes on the road. The navigation and routing are performed using the Esri directions PubNub block in the app: we request directions from this SDK and use the received data to vibrate the left and right straps as needed. The backpack itself was built with a Raspberry Pi connected to two motors, one in each strap, which are controlled through relays.
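As a rough illustration, the outdoor loop reduces to mapping each routing maneuver to a strap. This is a minimal sketch under our own assumptions; the function names and the maneuver strings are illustrative, not the actual BackMap code or the exact Esri SDK output:

```python
LEFT, RIGHT = "left", "right"

def strap_for_maneuver(maneuver: str):
    """Return which strap to vibrate for a routing maneuver, or None."""
    m = maneuver.lower()
    if "left" in m:
        return LEFT
    if "right" in m:
        return RIGHT
    return None  # continue straight, arrive at destination, etc.

def vibrate(strap: str, duration_s: float = 0.5):
    # On the real backpack this would switch a relay through the
    # Raspberry Pi's GPIO pins; here we just report the action.
    print(f"vibrating {strap} strap for {duration_s}s")

# Hypothetical maneuver string fed in from the directions SDK:
turn = strap_for_maneuver("Turn left onto 7th Avenue")
if turn:
    vibrate(turn)
```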

In the indoor scenario, we set out to solve the extremely difficult problem of indoor navigation. Currently, most beacon-based systems tell you when you approach a specific beacon, but not how to actually get to it. Using the same backpack system as before, the indoor version of BackMap lets you navigate to specific locations inside a building. This has accessibility applications such as highlighting bathrooms, specific sections of a mall, or the different sponsor tables at an event like this hackathon. Using several simulated beacons running on our phones, we can help the visually impaired locate specific areas of a venue by triangulating data points combined with compass data. The indoor scenario has enormous potential, since it lets the visually impaired navigate indoors with the same haptic backpack system. To accomplish this, we use the unique IDs and keys broadcast by the BLE beacons, together with compass data and the signal strength readings from those beacons.
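To give a sense of how signal strength turns into distance, a standard log-distance path-loss model can be applied to the RSSI readings. The calibration constants below are typical textbook values, not measurements from our beacons:

```python
def rssi_to_distance(rssi_dbm, tx_power_dbm=-59, path_loss_exponent=2.0):
    """Estimate the distance in metres to a BLE beacon from its RSSI.

    tx_power_dbm is the calibrated RSSI measured at 1 m from the beacon;
    the path-loss exponent is ~2 in free space and higher in cluttered
    indoor environments.
    """
    return 10 ** ((tx_power_dbm - rssi_dbm) / (10 * path_loss_exponent))

def approaching(previous_rssi, current_rssi):
    """A rising RSSI suggests the wearer is moving toward the beacon."""
    return current_rssi > previous_rssi
```

Combined with compass heading, successive distance estimates like these let the app decide whether to keep the wearer on course or buzz a strap to correct them.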

How we built it

We created an Android app that supports both indoor and outdoor navigation. For the backpack, we used a Raspberry Pi hosting a Node.js server that controls a motor in each strap. Outdoors, the app uses the Esri PubNub block to fetch a map and turn-by-turn directions from one location to another, and vibrates a strap when you reach a corner or receive a new direction. Indoors, the app detects simulated Bluetooth beacons and uses the signal strength from them, along with compass data from the phone, to locate the desired beacon.

Challenges we ran into

The most difficult challenge we faced was actually using the beacons for navigation. Soon after we started working with them, we realized there was no way to get any directional data out of a beacon, even mathematically, without creating a separate app for every beacon device. Instead, we designed a grid-based beacon layout that simplifies setting up beacons for accessibility, whether by event hosts or owners of public spaces. This grid-based layout is also cheaper and more accurate than placing beacons only at corners. Best of all, our app uses the beacons for nothing but location, so they remain free for advertising or any other normal beacon purpose the business or event sees fit. The grid-based mesh system also scales indefinitely: with beacon devices costing around $50 each, a venue the size of this TechCrunch event would need only about $500 worth of beacons to become accessible to blind people.
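One simple way to position a user on such a grid is a weighted centroid of the nearest beacons, where closer beacons count for more. This is a sketch of the general technique under our own assumptions, not the code we shipped:

```python
def weighted_centroid(beacon_readings):
    """Estimate the user's (x, y) position on the beacon grid.

    beacon_readings is a list of ((x, y), distance_m) pairs, one per
    detected beacon. Each beacon's known grid coordinate is weighted by
    the inverse of its estimated distance, so nearby beacons dominate.
    """
    wx = wy = total = 0.0
    for (x, y), dist in beacon_readings:
        w = 1.0 / max(dist, 0.1)  # clamp to avoid division by zero
        wx += w * x
        wy += w * y
        total += w
    return wx / total, wy / total

# Two beacons at equal distance: the user is estimated midway between them.
print(weighted_centroid([((0.0, 0.0), 2.0), ((4.0, 0.0), 2.0)]))
```

Because the beacons sit on a regular grid, adding coverage is just a matter of adding rows and columns of readings to this list, which is what makes the mesh scale to larger venues.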

Accomplishments that we're proud of

Coming up with the grid-based beacon system for indoor navigation, along with creating a $40 haptic feedback system for backpacks that anyone can implement.

What's next for BackMap

We would love to scale up the system and try the backpack-based navigation indoors at real events, testing and improving the product in order to help more people.
