Inspiration
Our main inspiration came from the opening OwlHacks presentations on inclusive education and learning, which got us imagining how VR/AR hardware could be applied across different fields of education, such as medicine, physics, and architecture.
What it does
Because our development split into two branches, we have two versions of the app: one lets the user spawn 3D-modeled objects whenever ARKit detects a plane, and the other renders a large purple angular fish in 3D space.
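For reference, here is a minimal sketch of the plane-spawning flow, assuming the standard ARKit/SceneKit setup; it drops a placeholder box on each detected plane, where our actual prototype spawns its own 3D-modeled objects:

```swift
import UIKit
import ARKit
import SceneKit

// Sketch: spawn a small box whenever ARKit detects a horizontal plane.
// The box is a stand-in for the project's real 3D-modeled objects.
class PlaneSpawnerViewController: UIViewController, ARSCNViewDelegate {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Ask ARKit to track the world and report horizontal planes.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds an anchor; plane anchors get an object.
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARPlaneAnchor else { return }
        let box = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1,
                                           length: 0.1, chamferRadius: 0))
        box.position = SCNVector3(0, 0.05, 0) // rest on top of the plane
        node.addChildNode(box)
    }
}
```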
How we built it
We started from the AR app template that ships with Xcode and used some open-source libraries for the 3D models.
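As an illustration, loading one of the downloaded models into the scene looks roughly like this; the file name "fish.usdz" is a hypothetical placeholder for whichever open-source asset the app actually bundles:

```swift
import SceneKit

// Sketch: load a bundled 3D model and wrap it in a single node
// so it can be positioned and scaled as one unit.
func loadFishNode() -> SCNNode? {
    guard let url = Bundle.main.url(forResource: "fish", withExtension: "usdz"),
          let scene = try? SCNScene(url: url, options: nil) else {
        return nil
    }
    let node = SCNNode()
    for child in scene.rootNode.childNodes {
        node.addChildNode(child)
    }
    node.scale = SCNVector3(0.01, 0.01, 0.01) // shrink to tabletop size
    return node
}
```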
Challenges we ran into
Several team members had out-of-date versions of Xcode, and some of our phones were not compatible with it, so a lot of our time went into downloading and updating the IDE and iOS. Some members did not have a cable to connect their phone to their laptop, and two team members' cables would not hold a stable connection. Part of the team was also unfamiliar with Swift.
Accomplishments that we're proud of
We are proud that, although nearly all of us were completely new to Xcode and AR, we still managed to produce two working prototypes: spawning objects on a plane and rendering custom models.
What we learned
We learned how to use the fundamental building blocks of ARKit and Xcode, and how an Xcode developer tests applications on a physical iPhone rather than in the simulator.
What's next for AR Jawn
The next step is to bring in art pieces (e.g., the Mona Lisa) as 3D models and add their descriptions to the text boxes displayed on our "plane" branch, as sketched below. Other planned features include 3D models of physics problems and live closed captioning.
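As a rough sketch of the planned description feature, a floating label could be built with SCNText and attached next to an art piece; the font size, scale, and string below are placeholder assumptions, not final values:

```swift
import UIKit
import SceneKit

// Sketch: build a 3D text node that could float beside an art piece.
func makeDescriptionNode(text: String) -> SCNNode {
    let geometry = SCNText(string: text, extrusionDepth: 0.5)
    geometry.font = UIFont.systemFont(ofSize: 4)
    geometry.firstMaterial?.diffuse.contents = UIColor.white

    let node = SCNNode(geometry: geometry)
    // SCNText is authored in large units; scale it down to AR scale.
    node.scale = SCNVector3(0.005, 0.005, 0.005)
    return node
}

// Usage: attach next to a model's node.
// artNode.addChildNode(makeDescriptionNode(text: "Mona Lisa, c. 1503"))
```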