Inspiration

This project was inspired by reality and science fiction. We took the original idea from an article about the F-35 Helmet Mounted Display. The graphical overlay was such a cool idea, and we realized it could have many civilian applications as well. We also drew inspiration from the living portraits in Harry Potter.

How it works

The app uses the BoofCV framework to do a lot of the heavy lifting involved in image processing. We used a pattern detector calibrated to look for a flat chessboard pattern in the camera input. If the pattern was found, we calculated its four corners and overlaid that region of the image with a beautiful Nicolas Cage picture. We made sure to split the screen in two to keep it compatible with Google Cardboard.
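
In rough terms, the detection step looks something like the sketch below. The exact BoofCV factory, config, and package names depend heavily on the library release (the calibration API has been reorganized several times), so treat them as assumptions on our part; ChessboardFinder is just an illustrative wrapper, not our actual class.

    // NOTE: BoofCV package paths and class names vary between releases.
    import android.graphics.Bitmap;

    import boofcv.abst.fiducial.calib.ConfigChessboard;
    import boofcv.abst.geo.calibration.DetectorFiducialCalibration;
    import boofcv.alg.geo.calibration.CalibrationObservation;
    import boofcv.android.ConvertBitmap;
    import boofcv.factory.fiducial.FactoryFiducialCalibration;
    import boofcv.struct.image.GrayF32;

    public class ChessboardFinder {
        // A 7x5-square board has 6x4 = 24 interior corners -- the "24 dots" we track.
        private final DetectorFiducialCalibration detector =
                FactoryFiducialCalibration.chessboard(new ConfigChessboard(7, 5, 30));

        private final GrayF32 gray = new GrayF32(1, 1);

        /** Returns the detected interior corners in pixel coordinates, or null if no pattern was found. */
        public CalibrationObservation detect(Bitmap frame) {
            gray.reshape(frame.getWidth(), frame.getHeight());
            ConvertBitmap.bitmapToGray(frame, gray, null);   // camera frame -> grayscale

            return detector.process(gray) ? detector.getDetectedPoints() : null;
        }
    }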

Challenges I ran into

Setting up every single framework was like pulling teeth. Once we got them installed correctly, they worked like a charm. We initially tried to use OpenCV for Android to do image processing, but it used a lot of deprecated system calls. Ultimately it was finicky and difficult to set up, so we moved on to BoofCV.

Initially we were simply drawing Bitmaps to a Canvas object for our output, but performance suffered. We made some optimizations and looked into using OpenGL. We started with a framework called Rajawali, but found that it was still too new and didn't yet support the hardware camera. We ended up writing everything in raw OpenGL ES; we got the camera preview drawn on the screen, but we didn't get far enough in the time we had to reimplement the functionality we already had with the canvas.
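
For the OpenGL path, the key piece is streaming the camera preview into a SurfaceTexture bound to an external OES texture, which the render loop can then sample like any other texture. A minimal sketch of that plumbing, using the old android.hardware.Camera API we were targeting (the class name CameraGlFeed and the structure are ours for illustration, not the actual project code):

    import android.graphics.SurfaceTexture;
    import android.hardware.Camera;
    import android.opengl.GLES11Ext;
    import android.opengl.GLES20;

    import java.io.IOException;

    public class CameraGlFeed {
        private int textureId;
        private SurfaceTexture surfaceTexture;
        private Camera camera;

        /** Call on the GL thread once the EGL context exists. */
        public void start() throws IOException {
            int[] tex = new int[1];
            GLES20.glGenTextures(1, tex, 0);
            textureId = tex[0];
            GLES20.glBindTexture(GLES11Ext.GL_TEXTURE_EXTERNAL_OES, textureId);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_LINEAR);
            GLES20.glTexParameteri(GLES11Ext.GL_TEXTURE_EXTERNAL_OES,
                    GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_LINEAR);

            surfaceTexture = new SurfaceTexture(textureId);

            camera = Camera.open();                    // deprecated API, but what we targeted
            camera.setPreviewTexture(surfaceTexture);  // camera frames now land in the texture
            camera.startPreview();
        }

        /** Call at the top of each onDrawFrame() to pull in the latest camera frame. */
        public void updateFrame() {
            surfaceTexture.updateTexImage();
        }

        public int getTextureId() {
            return textureId;
        }
    }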

Accomplishments that I'm proud of

This was the first real Android app that any of us had made, which is an accomplishment in and of itself!

This is the first Google Cardboard + Android Wear app that we know of.

Pattern tracking using BoofCV was a huge step. Once we had 24 dots marking the intersection points of the chessboard pattern, we knew we had something to build on. Finding the four corner vertices of the pattern was also a great accomplishment, since we could send them straight to OpenGL to render the virtual screen.
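
As an illustration, picking the four outer vertices out of the detected grid and converting them to normalized device coordinates for the OpenGL quad can look like this. It assumes the interior corners arrive as a list of pixel-coordinate points in row-major order (how you pull that list out of the detector's result depends on the BoofCV version); OverlayQuad is a hypothetical helper, not our actual class.

    import georegression.struct.point.Point2D_F64;
    import java.util.List;

    public final class OverlayQuad {

        /** Returns {topLeft, topRight, bottomLeft, bottomRight} as a flat array of NDC x,y pairs. */
        public static float[] cornersToNdc(List<Point2D_F64> gridPoints, int cols,
                                           int previewWidth, int previewHeight) {
            int rows = gridPoints.size() / cols;       // 4 x 6 = 24 for our board
            Point2D_F64[] corners = {
                    gridPoints.get(0),                 // first point of first row
                    gridPoints.get(cols - 1),          // last point of first row
                    gridPoints.get((rows - 1) * cols), // first point of last row
                    gridPoints.get(rows * cols - 1)    // last point of last row
            };

            float[] ndc = new float[corners.length * 2];
            for (int i = 0; i < corners.length; i++) {
                // Pixel coordinates put the origin at the top-left; NDC runs -1..1
                // with +y pointing up, so flip the y axis.
                ndc[2 * i]     = (float) (2.0 * corners[i].x / previewWidth - 1.0);
                ndc[2 * i + 1] = (float) (1.0 - 2.0 * corners[i].y / previewHeight);
            }
            return ndc;
        }
    }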

Finally getting the camera preview drawn in OpenGL was quite an accomplishment: we added functionality that wasn't present in a popular framework, and there are very few resources online for rendering the camera preview in OpenGL.
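
The part with the fewest resources online is the shader side: a SurfaceTexture-backed camera frame has to be sampled as samplerExternalOES rather than a plain sampler2D, which needs its own GLSL extension. A sketch of the shader pair (attribute and uniform names like uCameraTexture are ours for illustration):

    public final class PreviewShaders {

        public static final String VERTEX =
                "attribute vec4 aPosition;\n" +
                "attribute vec2 aTexCoord;\n" +
                "varying vec2 vTexCoord;\n" +
                "void main() {\n" +
                "    gl_Position = aPosition;\n" +
                "    vTexCoord = aTexCoord;\n" +
                "}\n";

        public static final String FRAGMENT =
                "#extension GL_OES_EGL_image_external : require\n" +
                "precision mediump float;\n" +
                "uniform samplerExternalOES uCameraTexture;\n" +  // external OES sampler, not sampler2D
                "varying vec2 vTexCoord;\n" +
                "void main() {\n" +
                "    gl_FragColor = texture2D(uCameraTexture, vTexCoord);\n" +
                "}\n";
    }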

What I learned

  • Murphy's Law is a real thing.
  • Computer vision is difficult but rewarding.
  • The Android development process.
  • The Android Wear development process.
  • This was the first time any of us had worked with OpenGL ES 2.0 for Android.
  • This was the first time any of us had worked with the SDKs that were part of the project.

What's next for Google Cardboard Augmented Reality

  • Smart virtual sticky notes.
  • Android Wear remote control integration.
  • Augmented reality 3D watch faces for smartwatches.
