Gallery: 3D scanning targets; Neel 3D scanning our targets; Matthew hard at work; Matthew and Neel collaborating in Unity; 3D scanning in Vuforia; liquid pouring chemical reaction demo. Team member: Shyam Sai, Comp Sci.
In high school, we weren't fortunate enough to attend a school with the equipment required for science labs. This left us under-prepared for college laboratory classes. Many students in the US, and around the world, face a similar situation: the lack of laboratory equipment can seriously deter a budding Einstein. We decided to apply augmented reality technology to solving this problem. By creating an artificial environment in which to complete these labs, we aim to provide a lab-based science education to students worldwide.
What it does
Lab.me is an augmented reality system that allows users to convert ordinary objects into tools for scientific inquiry. These tools can be rotated and manipulated in 3D space to create an interactive experience for the user. These experiences range from a simple model of the brain to a full science-class experiment. We use voice-enabled commands to perform different actions within the Lab.me environment, and Lab.me can recognize what reaction should occur when two objects collide, powered by a database of chemical compounds and their reactions.
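The collision-to-reaction step can be sketched roughly as follows. This is an illustrative Python sketch, not Lab.me's actual C# code, and the compound names and function names are assumptions; in the real system the lookup is backed by the chemical reaction database rather than an in-memory table.

```python
# Illustrative sketch: determine the products of a chemical reaction
# when two tagged AR objects collide. A plain dictionary stands in
# for Lab.me's reaction database here.

# Reactions keyed by an order-independent pair of reactants.
REACTIONS = {
    frozenset({"HCl", "NaOH"}): "NaCl + H2O",
    frozenset({"Na", "H2O"}): "NaOH + H2",
}

def on_collision(compound_a, compound_b):
    """Return the reaction products for two colliding objects, or None."""
    return REACTIONS.get(frozenset({compound_a, compound_b}))
```

Keying the table on a `frozenset` makes the lookup order-independent, so it does not matter which object the collision event reports first.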
How we built it
We started by doing extensive research into the kinds of software used in commercial applications of augmented reality. We quickly learned that the Vuforia engine was the de facto industry standard and, better yet, could be easily integrated with the Unity engine, which we had experience in. Through this engine, we were able to map virtual objects to both 2D images and 3D scans. We did this by first scanning the physical models using Vuforia Object Scanner. We uploaded these scans and cross-checked the compatibility of the models using Vuforia Developer Cloud. We then imported the scanned models into Unity and mapped them to their virtual counterparts through a parent-child relationship. Once we had the models loaded, we wrote C# scripts to create interactions between objects within labs. We stored chemical reaction data in an Azure SQL Database, and used Azure Language Understanding (LUIS) and Azure Speech Recognition to control different parts of the lab with vocal commands. Combining all of these technologies, we created a series of powerful interactive labs for use anywhere.
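The voice-control path described above boils down to mapping a recognized intent to a lab action. A minimal sketch of that dispatch, with intent names and actions as assumptions (in Lab.me the intent string comes from Azure Language Understanding, not from a local table):

```python
# Illustrative sketch: dispatch a recognized voice intent (as returned
# by a service like Azure LUIS) to an action in the lab environment.
# Intent names and actions here are hypothetical.

ACTIONS = {
    "PourLiquid": lambda: "pouring liquid",
    "RotateObject": lambda: "rotating object",
    "ResetLab": lambda: "resetting lab",
}

def handle_intent(intent, fallback="command not recognized"):
    """Run the action registered for this intent, or report a fallback."""
    action = ACTIONS.get(intent)
    return action() if action else fallback
```

Keeping the intent-to-action table separate from the recognition service makes it easy to add new voice commands without touching the speech pipeline.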
Challenges we ran into
We initially struggled to create the AR targets for our models. We first attempted to create VuMark targets using Adobe Illustrator, but this proved unfeasible given the time we had and our lack of expertise in Illustrator. Next, we attempted to use single images as targets for our AR objects. This worked well for static objects but was sub-optimal for movement in a 3D plane, so we moved to 3D object targets. We also struggled to simulate certain experiments, as our hardware lacked the processing power. For example, liquids are rendered as many particles, which are computationally expensive to display. In particular, getting the source of a liquid to follow a dynamic AR object was a difficult task, though we ended up solving this problem.
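The fix for the moving liquid source can be sketched as re-anchoring the emitter to the tracked object's pose each frame, with a cap on the particle count to stay within the hardware budget. This is an illustrative Python sketch; the offset, cap, and function names are assumptions, not Lab.me's actual Unity implementation.

```python
# Illustrative sketch: keep a liquid emitter attached to a moving
# tracked AR object by recomputing the spout position every frame,
# and cap total particles to limit rendering cost.

PARTICLE_CAP = 500  # hypothetical hard limit for low-end hardware

def emitter_position(tracked_pos, spout_offset):
    """Spout position = tracked object's position plus a local offset
    (e.g. the mouth of a beaker relative to its center)."""
    return tuple(p + o for p, o in zip(tracked_pos, spout_offset))

def spawn_particles(particles, tracked_pos, spout_offset, rate=10):
    """Spawn up to `rate` new particles at the spout, respecting the cap."""
    source = emitter_position(tracked_pos, spout_offset)
    for _ in range(min(rate, PARTICLE_CAP - len(particles))):
        particles.append(source)
    return particles
```

Because the spout position is derived from the tracked pose on every call, the liquid keeps pouring from the right spot even as the user moves the physical object.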
Accomplishments that we're proud of
We're proud that, after 13 hours, we finally implemented liquids, probably our biggest accomplishment. In addition, we're proud that we were able to adapt to the limitations of our software while sticking to and fulfilling our original goal. We're proud that we had fun, no matter the result.
What we learned
We learned how to use Vuforia, having had no prior experience in AR. We also successfully integrated the Vuforia Engine into Unity, and quickly learned the ins and outs of Unity to suit our tasks.
What's next for Lab.me
We want to build an Android/iOS mobile app that makes Lab.me easier to use while reducing its hardware requirements. We also want to develop a more diverse and user-friendly set of labs with more features and wider accessibility. We want to push for better physics and graphics while keeping the overhead of Lab.me low. In the long run, we hope to bring Lab.me to underprivileged and impoverished communities around the world.
List of prizes we're eligible for
Microsoft Azure Champ Prize
Facebook Social Good Prize
Bloomberg Best Educational Hack
Google Best Accessibility Hack