Inspiration
Roughly five percent of the world's population has some form of colorblindness, including one of our close friends. Colorblindness introduces real inconveniences into modern life, turning some of the simplest tasks into frustrating ones. We realized that these inconveniences could be addressed with augmented reality.
What it does
Chroma is a camera and augmented reality app that enables the colorblind to easily distinguish the most commonly confused color pairs: red-green and blue-yellow. Chroma applies distinct striped patterns to each color in a pair so that the two can easily be told apart. Chroma also features an AR mode that can be viewed through any virtual reality goggles.
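The core idea, classifying each pixel into one of the four target colors by hue, can be sketched as follows. This is a minimal illustration, not Chroma's actual shader: the hue bands and saturation/value cutoffs here are assumed values chosen for the example.

```python
import colorsys

# Illustrative hue bands (in degrees) for the two confusable pairs.
# These boundaries are assumptions, not Chroma's actual thresholds.
HUE_BANDS = {
    "red":    [(0, 20), (340, 360)],
    "yellow": [(40, 80)],
    "green":  [(80, 160)],
    "blue":   [(200, 260)],
}

def classify_hue(r, g, b):
    """Map an RGB pixel (0-255 channels) to one of the four target
    colors, or None if the pixel is too dull or dark to classify."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.3 or v < 0.2:          # ignore washed-out or dark pixels
        return None
    deg = h * 360.0
    for name, ranges in HUE_BANDS.items():
        if any(lo <= deg < hi for lo, hi in ranges):
            return name
    return None
```

A strong red such as `(220, 30, 30)` falls in the red band, while `(30, 180, 60)` lands in the green band; each would then receive its own overlay pattern.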
How we built it
Chroma is built on top of nekocode's "CameraFilter" application, which provides an API for applying an OpenGL filter over an Android camera preview. We then developed our own filters that isolate colors based on hue and overlay the corresponding pattern on each. We combined these elements within Android Studio to create a coherent experience. Finally, we developed a secondary filter that is applied in conjunction with the color filters to create an AR environment compatible with Google Cardboard and Samsung Gear VR.
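The pattern-overlay step runs per pixel in a fragment shader; the sketch below shows the same idea in Python under illustrative assumptions (the stripe orientations, the `period` width, and the darkening blend are all example choices, not Chroma's actual parameters). Each color of a confusable pair gets a different stripe orientation computed from the pixel's screen coordinates, so the pair stays distinguishable even when the hues look identical to the viewer.

```python
def stripe_overlay(x, y, color_class, period=8):
    """Return True where a stripe should mark the pixel at (x, y).

    Each color in a confusable pair gets a distinct orientation:
    red/green use opposite diagonals, blue/yellow use horizontal
    vs. vertical stripes. Orientations here are illustrative.
    """
    if color_class == "red":       # diagonal stripes, one direction
        return (x + y) % period < period // 2
    if color_class == "green":     # diagonal stripes, other direction
        return (x - y) % period < period // 2
    if color_class == "blue":      # horizontal stripes
        return y % period < period // 2
    if color_class == "yellow":    # vertical stripes
        return x % period < period // 2
    return False                   # pixel not in a target hue band

def apply_filter(pixel, x, y, color_class):
    """Darken the pixel inside stripe regions, leave it alone elsewhere."""
    if stripe_overlay(x, y, color_class):
        r, g, b = pixel
        return (r // 2, g // 2, b // 2)
    return pixel
```

In the real app this arithmetic lives in GLSL, where the screen coordinate comes from the fragment position rather than being passed in explicitly.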
Challenges we ran into
We had initially decided to learn and use Flutter/Dart, so we spent most of the first 10 hours learning the basics of Flutter. However, we realized that Flutter still has significant limitations and sparse documentation, as it remains under active development. At that point, we decided to switch to Android Studio and restart development, which sent us almost back to square one. The other major challenge was developing for OpenGL: we had no way to debug shaders directly, and our build would crash whenever we used integer types, which severely limited our efficiency. Nevertheless, we were able to get the filters functioning as we desired.
Accomplishments that we're proud of
We managed to create our own OpenGL filters. Neither of us knew OpenGL, C++, or Dart coming into the hackathon, and getting them working was one of our greatest challenges, but we learned them all. We also discovered that our filters do more than isolate color: they exhibit surprisingly good edge detection, producing clean pattern mappings.
What we learned
We both came in with the intent of learning new software, which we accomplished. We learned app development in Android Studio both with and without Flutter. As mentioned above, we also learned the OpenGL API, C++, and Dart.
What's next for Chroma
Now that the base filter functionality is developed, there are several next steps. First, the AR mode can be improved with support for external cameras and further work on reducing distortion. Next, the filters can be improved by reducing noise and increasing edge fidelity. Development of Chroma will continue on Google Play.