Inspiration

Color deficiency occurs when some of the photoreceptor cones in the retina respond abnormally to certain wavelengths of light. People with color deficiency struggle to distinguish either red-green or blue-yellow hues. While color deficiency does not typically impact lives severely, it deprives those who have it of seeing the world in the same light that others do. Solutions already exist in the form of special 'sunglasses' that filter incoming light so that colors become easier to tell apart; however, these glasses can cost hundreds of dollars.

What it does

Our app applies a simple filter to the live camera feed, modifying the image in real time so that a user with color deficiency can perceive colors closer to what a person with typical color vision sees. For instance, if you climbed to the peak of a mountain and wanted to see exactly what the view looks like, you could simply pull out your phone.

How we built it

We built our application in C# using the Unity engine and its AR frameworks. We used canvas elements to create a filter overlaid on the main camera feed, increasing the contrast between problematic colors and making them easier to differentiate.
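As a rough illustration of the overlay idea, the sketch below tints a full-screen UI Image sitting on a screen-space canvas in front of the camera feed. The component name, field names, and tint values are illustrative assumptions, not the exact setup used in the app.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch of the canvas-overlay approach: a full-screen Image on a
// screen-space canvas is given a semi-transparent tint, which is composited
// over every camera frame without touching individual pixels.
public class ColorOverlayFilter : MonoBehaviour
{
    [SerializeField] private Image overlayImage;  // full-screen Image stretched across the canvas
    [SerializeField] private Color tint = new Color(1f, 0.85f, 0.9f, 0.25f);  // example tint values

    private void Start()
    {
        overlayImage.color = tint;
    }

    // Lets the tint strength be adjusted at runtime, e.g. from a UI slider.
    public void SetIntensity(float alpha)
    {
        Color c = overlayImage.color;
        c.a = Mathf.Clamp01(alpha);
        overlayImage.color = c;
    }
}
```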

Challenges we ran into

Capturing the input from the camera to produce a filtered image was a challenge, especially when trying to apply a filter through render textures. We then pivoted to modifying specific pixels, which proved too time-consuming to deliver a functional product in time. Finally, we switched to using canvas elements to overlay a filter on the camera feed shown on screen. A rough sketch of the abandoned per-pixel approach follows below.
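For context, this is roughly what the per-pixel route looks like and why it is slow: every frame is copied to the CPU, each pixel is adjusted, and the result is written back into a texture. All names and the particular color adjustment here are illustrative assumptions.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the per-pixel approach we moved away from. The per-frame
// GetPixels32/SetPixels32 round trip on the CPU is the bottleneck that made
// it too slow for real-time use on a phone.
public class PerPixelFilter : MonoBehaviour
{
    [SerializeField] private RawImage output;  // UI element that displays the filtered feed

    private WebCamTexture camTexture;
    private Texture2D filtered;

    private void Start()
    {
        camTexture = new WebCamTexture();
        camTexture.Play();
    }

    private void Update()
    {
        if (!camTexture.didUpdateThisFrame) return;

        // (Re)create the destination texture once the camera reports its real size.
        if (filtered == null || filtered.width != camTexture.width || filtered.height != camTexture.height)
        {
            filtered = new Texture2D(camTexture.width, camTexture.height, TextureFormat.RGBA32, false);
            output.texture = filtered;
        }

        // Copying and rewriting every pixel on the CPU each frame is what made this too slow.
        Color32[] pixels = camTexture.GetPixels32();
        for (int i = 0; i < pixels.Length; i++)
        {
            // Example adjustment: push red and green slightly apart.
            int r = pixels[i].r;
            int g = pixels[i].g;
            pixels[i].r = (byte)Mathf.Clamp(r + (r - g) / 4, 0, 255);
            pixels[i].g = (byte)Mathf.Clamp(g - (r - g) / 8, 0, 255);
        }
        filtered.SetPixels32(pixels);
        filtered.Apply();
    }
}
```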

Accomplishments that we're proud of

Delivering a functional product, especially after running into issues with our initial development plan.

What we learned

We learned that render textures are very difficult to work with.

What's next for ColorDeficiencyAR

We plan to add specific modes for a variety of color deficiencies.

Built With

c#, unity