We noticed that people often struggled to learn from flat, two-dimensional diagrams, so we were inspired to build an alternative that is easier and more intuitive to use.
What it does
The app displays a variety of 3D models, from chemical elements to spacecraft, through three main functions. First, it can render models in augmented reality, so a person can walk around them and examine them with a phone. Second, it focuses on chemistry, showing models of common elements alongside a 3D periodic table. Lastly, it uses a machine learning model to detect common objects and displays models of the components that make up each detected object.
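The AR function above boils down to loading a 3D asset and anchoring it in the scene so the user can walk around it. A minimal sketch of that idea, assuming an `ARSCNView` named `sceneView` and a bundled model file (the function and file names here are placeholders, not the app's actual code):

```swift
import ARKit
import SceneKit

// Hypothetical helper: load a bundled SceneKit model and place it at a
// fixed position in front of the camera. The model filename and the
// position are illustrative assumptions.
func placeModel(named filename: String, in sceneView: ARSCNView) {
    guard let modelScene = SCNScene(named: filename) else { return }

    // Wrap the model's nodes so we can position them as one unit.
    let container = SCNNode()
    for child in modelScene.rootNode.childNodes {
        container.addChildNode(child)
    }

    // Half a meter in front of the world origin; ARKit tracks the
    // device, so the user can then walk around the node.
    container.position = SCNVector3(0, 0, -0.5)
    sceneView.scene.rootNode.addChildNode(container)
}
```

With world tracking enabled on the session, the node stays fixed in space while the phone moves, which is what makes the walk-around inspection work.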
How we built it
We built the app with Swift, using Apple frameworks including ARKit and CoreML; for object recognition we used a pre-trained model called MobileNet. We laid out most of the interface with storyboards. Lastly, we used OnShape and SolidWorks to create and modify the 3D models included in our project.
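The CoreML piece is typically driven through the Vision framework: wrap the MobileNet model in a `VNCoreMLModel`, run a classification request on a camera frame, and use the top label to pick which component models to show. A hedged sketch of that pattern (here `MobileNet` is the class Xcode generates from the .mlmodel file, and the callback is a placeholder, not the app's real logic):

```swift
import Vision
import CoreML

// Hypothetical sketch: classify a single frame with MobileNet via Vision.
// Assumes the MobileNet .mlmodel has been added to the Xcode project,
// which generates the `MobileNet` class used below.
func classify(frame: CGImage) {
    guard let visionModel = try? VNCoreMLModel(for: MobileNet().model) else { return }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Vision returns observations sorted by confidence.
        guard let top = (request.results as? [VNClassificationObservation])?.first
        else { return }
        print("Detected \(top.identifier) with confidence \(top.confidence)")
        // The app would then look up 3D models for the detected object
        // (placeholder step, not shown here).
    }

    let handler = VNImageRequestHandler(cgImage: frame)
    try? handler.perform([request])
}
```

Depending on the Xcode version, the generated model class may require a throwing initializer (`MobileNet(configuration:)`) instead of the plain `MobileNet()` shown here.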
Challenges we ran into
Several compatibility issues between files, as well as general bugginess, ate up a lot of time. ARKit and CoreML are very new relative to most other tools, and as a result, products built on them tend to be much buggier. We had also never used augmented reality, SceneKit, or machine learning before, and had to learn them at the hackathon.
Accomplishments that we're proud of
Getting a product with all the features we set out to build, and more, was extremely satisfying. We focused first on the augmented reality aspect of the app; because we accomplished it so quickly and effectively, we were able to continue on and incorporate several other functions, such as the machine learning features.
What we learned
We learned a lot about how to use ARKit, CoreML, and SceneKit.
What's next for leARn
In the future, we hope to flesh out our library of models: finishing the database of elements and expanding into more objects and materials recognized through our CoreML functionality.