Inspiration

The inspiration for "Feel the Way" stemmed from a desire to harness cutting-edge technology to address real-world challenges faced by the visually impaired community.

What it does

"Feel the Way" is a wearable device that translates the visual world into tactile sensations. Using the LiDAR sensor on an iPhone, it scans the environment and converts the detected obstacles and pathways into a series of vibrations. This tactile map empowers visually impaired users to navigate indoor and outdoor spaces more confidently and independently.

How we built it

We knew there would be three main systems: the app running on the iPhone that extracts the LiDAR data, the microcontroller/computer that drives the hardware, and the physical components themselves. The iPhone and the microcontroller would have to pair over BLE: iOS has strict requirements for full-on data transfer over a classic Bluetooth connection, and BLE avoids them by creating a server-client (GATT) system between the phone and the microcontroller. After hours of reading documentation and experimenting with an Arduino Nano and a Raspberry Pi 4 as controller candidates, we settled on the Arduino Nano due to issues with the RPi 4.
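To make the BLE link workable we had to agree on a compact payload. The sketch below is one plausible format, not our exact protocol: each zone's distance is clamped and scaled into a single byte, so a whole five-zone frame fits comfortably in one characteristic write under the default 23-byte ATT MTU. The function name, range, and scaling are assumptions for illustration.

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Hypothetical payload format: each zone distance, clamped to
// [0, maxRange] meters, is scaled to one byte (0 = touching,
// 255 = at or beyond maxRange) so five zones fit in a single
// BLE characteristic write.
std::vector<uint8_t> packZones(const std::vector<float>& zoneMeters,
                               float maxRange = 5.0f) {
    std::vector<uint8_t> payload;
    payload.reserve(zoneMeters.size());
    for (float d : zoneMeters) {
        float clamped = std::clamp(d, 0.0f, maxRange);
        payload.push_back(static_cast<uint8_t>(clamped / maxRange * 255.0f));
    }
    return payload;
}
```

On the Arduino side the same bytes can be read straight out of the characteristic value, which keeps the firmware parsing trivial.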

The LiDAR data was taken from the phone and downsampled to a resolution we could fit into a tactile interface, which in this case was 5x1 due to component and time limitations. We did this by averaging the distance data from the LiDAR camera into five zones and sending that data to the Arduino, which turns each motor a specific amount to apply more or less pressure on the fingers based on how close an object is. (We have an image with a description of how the tactile interface would be laid out with ten motors.)
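The downsampling step above can be sketched as two pure functions (the names and the 0-180 servo range are our assumptions, not the exact firmware): split a row of depth samples into five vertical zones, average each zone, then map each average so that a closer obstacle commands a larger motor angle, i.e. more pressure on that finger.

```cpp
#include <algorithm>
#include <array>
#include <cstddef>
#include <vector>

// Average a row of depth samples (meters) into five equal zones,
// one per finger. Assumes the row length is a multiple of five.
std::array<float, 5> averageIntoZones(const std::vector<float>& depthRow) {
    std::array<float, 5> zones{};
    std::size_t perZone = depthRow.size() / 5;
    for (std::size_t z = 0; z < 5; ++z) {
        float sum = 0.0f;
        for (std::size_t i = z * perZone; i < (z + 1) * perZone; ++i)
            sum += depthRow[i];
        zones[z] = sum / perZone;
    }
    return zones;
}

// Closer object -> larger angle -> more pressure on the finger;
// anything at or beyond maxRange relaxes the motor to 0 degrees.
int distanceToMotorAngle(float meters, float maxRange = 5.0f) {
    float clamped = std::clamp(meters, 0.0f, maxRange);
    return static_cast<int>((1.0f - clamped / maxRange) * 180.0f);
}
```

Averaging (rather than taking the minimum) smooths sensor noise, at the cost of washing out a thin obstacle inside an otherwise open zone; either choice is defensible at a 5x1 resolution.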

Challenges we ran into

The BLE protocol was extraordinarily difficult to work with from an iOS device; we encountered many issues, from forming the Bluetooth connection to sending LiDAR data from the phone. This included spending hours attempting to get a BLE GATT server running on the Raspberry Pi, only to find an issue so deep in the stack that it would take a full course to understand.

We opted for the Arduino and were able to send individual data points to the motor interface and control each of the motors. However, due to some unknown Swift limitation, no data would come through when we attempted to stream the LiDAR information byte by byte.
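One workaround we considered for the streaming stall is shown below as a sketch: rather than writing the buffer byte by byte, split each frame into chunks sized to fit one BLE write and prefix each chunk with a sequence number so the receiver can detect dropped or reordered writes. The 19-byte default (19 payload + 1 sequence byte = a 20-byte write under the default 23-byte ATT MTU) and all names here are assumptions, not code we shipped.

```cpp
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <vector>

// Split a frame into BLE-write-sized chunks, each prefixed with a
// one-byte sequence number. chunkSize is the payload per write;
// real MTUs are negotiated per connection.
std::vector<std::vector<uint8_t>> chunkFrame(const std::vector<uint8_t>& frame,
                                             std::size_t chunkSize = 19) {
    std::vector<std::vector<uint8_t>> chunks;
    uint8_t seq = 0;
    for (std::size_t off = 0; off < frame.size(); off += chunkSize) {
        std::size_t end = std::min(frame.size(), off + chunkSize);
        std::vector<uint8_t> chunk;
        chunk.push_back(seq++);
        chunk.insert(chunk.end(), frame.begin() + off, frame.begin() + end);
        chunks.push_back(std::move(chunk));
    }
    return chunks;
}
```

Batching writes this way also cuts per-write BLE overhead dramatically compared to single-byte transfers.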

Accomplishments that we're proud of

All parts of the project were completed: our software (at least what we planned for), firmware, and hardware. We were able to record and analyze the LiDAR data and send it in a format the firmware could interpret and pass on to the hardware; we were limited only by BLE issues.

What we learned

We learned a lot about designing beta apps for iOS, as well as working with the Bluetooth protocol on the Arduino and with BlueZ, the RPi 4's Bluetooth stack. We had little to no experience in any of these fields, so what we learned and accomplished was incredibly valuable experience.

What's next?

Built With
