When I worked as a tutor at my community college, I learned about the differences between learning and teaching styles as part of my training. After tutoring, I would go home to my roommates, who love to talk about politics, religion, or the latest economic news. I could rarely follow their stories with the same enthusiasm, because their teaching style (talking at each other) did not match my learning style (and I suspect they were not really listening to each other either). To learn, I have to be completely immersed in the information; a VR (virtual reality) app was therefore the best fit for my learning style.

The initial idea was to create an algorithm that converts articles (text) into equivalent images and animations, and eventually to apply it to books. That is, assuming most people can write but not everyone can teach (merely talking is very different from real teaching), Aha!News would convert text, like a scanner, into a visual representation that matches a visual learner's needs (live information, graphics, animations, videos, renderings, etc.). However, given the complexity and time needed to develop such an application, I focused solely on removing all visual distractions and showing only text to the user.

According to my research with over 90 students from Michigan and New York, students tend to seek academic help from sources (friends, family, smartphones) they can trust and that will not judge them or expose what they do not know. With VR and the internet, they get an unparalleled level of privacy, no judgment, and a virtually unlimited fount of live knowledge. Moreover, we are about to cross the chasm of a revolution in information technology. We, hackers and educators, cannot miss this opportunity.

What it does

As one of the first news apps available for the Gear VR, the Aha!News beta (and proof of concept) shows the abstracts of the top technology stories from The New York Times. It also lets the user instantly look up the meaning of any word in a top story's abstract.
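The data flow behind this can be sketched in Python (the app itself is written in C# for Unity). The URL, query parameters, and JSON field names below are assumptions based on the public NYT Top Stories API, so treat this as an illustration rather than the app's actual code:

```python
# Illustrative sketch of Aha!News's data flow: fetch the Technology
# top-stories feed, pull out each story's abstract, and split an
# abstract into lookup-ready words. The endpoint URL and JSON field
# names ("results", "abstract") are assumptions based on the public API.
import json
import urllib.request

NYT_TOP_STORIES_URL = (
    "https://api.nytimes.com/svc/topstories/v2/technology.json?api-key=YOUR_KEY"
)

def fetch_top_stories(url=NYT_TOP_STORIES_URL):
    """Download the Technology top-stories feed (requires a real API key)."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def extract_abstracts(payload):
    """Return the abstract text of each story in a Top Stories response."""
    return [story.get("abstract", "") for story in payload.get("results", [])]

def tokenize(abstract):
    """Split an abstract into lowercase words with punctuation stripped."""
    words = (w.strip(".,;:!?\"'()").lower() for w in abstract.split())
    return [w for w in words if w]

# Offline demo with a sample payload shaped like the real response:
sample = {"results": [{"abstract": "Virtual reality changes how we read the news."}]}
abstracts = extract_abstracts(sample)
words = tokenize(abstracts[0])
print(words)  # ['virtual', 'reality', 'changes', 'how', 'we', 'read', 'the', 'news']
```

Each word in that list is then a candidate for a dictionary lookup when the user selects it.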

How I built it

Aha!News was built with the Unity3D game engine, C#, the Pearson Dictionaries API, The New York Times API, and a bunch of chocolate chip cookies. It is designed for the Gear VR (a head-mounted display), which, as of this writing (January 2016), is compatible with the Samsung Galaxy S6, S6 Edge, S6 Edge Plus, and Note5. But that is just the tip of the iceberg: I plan to expand to other smartphones if there is enough demand.

Challenges I ran into

To explain the challenges I ran into, I will use some Unity3D-specific vocabulary. As a new programmer, programmatically turning each word of the top story's abstract into a collectible (selectable) item made me think harder than usual. I knew I could use an array of strings, but I wasn't sure how to create a UIText object programmatically for each word of the abstract. The temporary solution was to show all words and their meanings (from the Pearson Dictionaries API) at once, in the northeast view. I also struggled to implement a functional UI, so for this proof-of-concept app I decided to show only the basic information, focused on a single topic: Technology. That will also change in future iterations.
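The "show everything at once" fallback boils down to pairing each distinct word of the abstract with a dictionary entry. Here is a minimal Python sketch of that mapping (the real app does this in C#); the stub lookup function and its placeholder definitions stand in for the actual Pearson Dictionaries call:

```python
# Sketch of the temporary fallback: pair every distinct word of the
# abstract with a definition. In the real Unity app each pair would be
# rendered as its own UIText element; here a stub with made-up entries
# stands in for the Pearson Dictionaries API.

def lookup_stub(word):
    """Stand-in for the Pearson API call; returns a placeholder definition."""
    fake_dict = {
        "virtual": "existing in software rather than physically",
        "reality": "the state of things as they actually exist",
    }
    return fake_dict.get(word, "(definition not found)")

def build_glossary(abstract, lookup):
    """Return (word, definition) pairs for every distinct word, in order."""
    seen, pairs = set(), []
    for raw in abstract.split():
        word = raw.strip(".,;:!?\"'()").lower()
        if word and word not in seen:
            seen.add(word)
            pairs.append((word, lookup(word)))
    return pairs

glossary = build_glossary("Virtual reality. Virtual worlds!", lookup_stub)
print(glossary)
# [('virtual', 'existing in software rather than physically'),
#  ('reality', 'the state of things as they actually exist'),
#  ('worlds', '(definition not found)')]
```

Deduplicating with a `seen` set keeps the northeast view from repeating a definition every time a word recurs in the abstract.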

Accomplishments that I'm proud of

The fact that I got this far in this competition makes me proud. I did not think I could ever understand code (my past experiences learning to code were not very inspiring) until I decided to buy an Arduino a few months ago, then moved to Android, and now to Unity3D with the Gear VR. The latter is proving to be very powerful for making educational apps :D

What I learned

The future is here. It is up to each of us to let our creative side join forces with our technical side and make awesome things with the groundbreaking technology available to us. It takes time, often years, to make something great, but when it is good enough, oh boy, it is worth it!

What's next for Aha!News

If the scientific knowledge in sci-fi movies is correct, Aha!News will one day generate background environments (the colors and images behind the text) that help visual learners stay focused and learn. VR is a new medium and there is definitely a lot of ground to be explored, especially in education. In terms of platforms, the app will eventually be available for Google Cardboard (which is compatible with most Android smartphones) in addition to the Gear VR.

I am very excited about the possibilities ahead. VR is going to help us see a much more meaningful and caring world.
