Inspiration

We wanted to find a new, innovative way to help blind users "see" and navigate their surroundings.

What it does

This Android app uses Google ARCore to detect feature points in 3D space, calculate the phone's distance to those points, and convey that distance to the user as sound: the closer an object is, the higher the pitch.
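The core idea, closer means higher pitch, can be sketched as a simple mapping from distance to tone frequency. This is a minimal illustration; the class, method names, and the specific distance and frequency ranges are our assumptions, not the exact values in the app.

```java
// Sketch of the distance-to-pitch mapping. The ranges below are
// illustrative assumptions, not the app's exact tuning.
public class DistanceToPitch {
    static final double MIN_DIST_M = 0.2;  // nearest distance we expect
    static final double MAX_DIST_M = 5.0;  // farthest distance we report
    static final double HIGH_HZ = 1200.0;  // pitch for very close objects
    static final double LOW_HZ = 220.0;    // pitch for far-away objects

    // Closer objects get a higher pitch, farther objects a lower one.
    static double frequencyFor(double distanceMeters) {
        double d = Math.max(MIN_DIST_M, Math.min(MAX_DIST_M, distanceMeters));
        double t = (d - MIN_DIST_M) / (MAX_DIST_M - MIN_DIST_M); // 0..1
        return HIGH_HZ - t * (HIGH_HZ - LOW_HZ);
    }

    public static void main(String[] args) {
        System.out.println(frequencyFor(0.2)); // close -> high pitch
        System.out.println(frequencyFor(5.0)); // far -> low pitch
    }
}
```

In the app itself, the resulting frequency would drive a generated tone (for example via Android's `AudioTrack`); the mapping above is only the distance-to-pitch step.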

How we built it

We built it using Android Studio and ARCore. ARCore gave us 3D coordinates for detected points, which let us calculate the phone's distance to each point using the three-dimensional distance formula.

Challenges we ran into

ARCore had surprisingly little documentation.

Accomplishments that we're proud of

We are proud that we learned Android app development in such a short time and that we were able to represent spatial information with sound.

What I learned

We learned how to develop Android apps using Android Studio, and we picked up ARCore from scratch, having had no prior experience with it.

What's next for CloudsEye

We can improve the accuracy of the distance measurements, and we can use surround sound to convey where objects are, rather than collapsing the average distance of all points into a single tone.
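The surround-sound idea could start with simple stereo panning: derive a left/right balance from an object's horizontal offset relative to the camera. This is a hypothetical sketch of that future direction, not existing app code; all names here are our assumptions.

```java
// Hypothetical sketch of the planned stereo cue: pan a tone left/right
// based on the object's horizontal angle relative to the camera.
public class StereoPan {
    // Returns {leftGain, rightGain} for an object at horizontal offset x
    // and forward depth z, in the camera's coordinate frame.
    static double[] panFor(double x, double z) {
        double angle = Math.atan2(x, z);       // 0 = straight ahead
        double pan = angle / (Math.PI / 2.0);  // -1 (full left) .. 1 (full right)
        // Equal-power panning keeps perceived loudness roughly constant.
        double left = Math.cos((pan + 1.0) * Math.PI / 4.0);
        double right = Math.sin((pan + 1.0) * Math.PI / 4.0);
        return new double[] { left, right };
    }

    public static void main(String[] args) {
        double[] center = panFor(0.0, 2.0); // straight ahead -> equal gains
        System.out.println(center[0] + " " + center[1]);
    }
}
```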

Built With

Android Studio, ARCore