I decided to write this app when I found out two weeks ago that a friend of mine was born with red-green colorblindness. He said one of the worst things about this color deficiency—even worse than not being able to distinguish the colors of traffic lights—was getting pranked by his "friends" as a child.

The challenge of this app was to develop a way to represent unseen colors without relying on false-color images or color remapping, since these methods work by replacing a color the person cannot see with one they can.

My goal was to find a way to represent unseen colors without sacrificing any of the natural visual range.

With the help of my friends Eric Gross, Joshua Kirstein, and Aleksey Klintsevich, I was able to represent the desired colors by superimposing patterns onto the image. For example, I chose to represent the color green with horizontal lines and red with vertical lines.
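To illustrate the idea, here is a minimal sketch of pattern superimposition using NumPy. The function name, the HSV-free RGB thresholds, and the line stride are all illustrative assumptions, not the app's actual tuned values or implementation:

```python
import numpy as np

def overlay_patterns(img, stride=6):
    """Mark green regions with horizontal lines and red regions with
    vertical lines by darkening pixels along each line.
    `img` is an H x W x 3 uint8 RGB array.
    The thresholds below are illustrative guesses, not tuned values."""
    r = img[..., 0].astype(int)
    g = img[..., 1].astype(int)
    b = img[..., 2].astype(int)

    # Crude "strongly red" / "strongly green" masks (assumed thresholds).
    red_mask = (r > 120) & (r > g + 40) & (r > b + 40)
    green_mask = (g > 120) & (g > r + 40) & (g > b + 40)

    out = img.copy()
    h, w = img.shape[:2]
    rows = np.arange(h) % stride == 0   # horizontal lines -> green
    cols = np.arange(w) % stride == 0   # vertical lines -> red

    # Darken pixels where a pattern line crosses a matching color region.
    out[rows[:, None] & green_mask] //= 2
    out[cols[None, :] & red_mask] //= 2
    return out
```

Because the masks are computed per pixel, the rest of the image passes through untouched, which is what preserves the natural visual range described above.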

The following screenshots show how my algorithm works on a live image stream from the webcam on my computer.

In future revisions of this program, I plan to represent the entire human color spectrum through a larger set of patterns for fully colorblind people, and ideally port this program to either smartphones or a web app.

Table of images:

1) Basic program test using colored paper

2) Unmodified image of an orange-and-red shirt

3) The same shirt overlaid with red and green patterns

4) An example of skin tone rejection (achieved through color sensitivity tuning)

5) Simultaneous red and green detection of complex patterns
