Inspiration
I love exploring different artifacts in museums. It's a great way to learn how people did things in the past and how that has changed over time. I thought it would be interesting to use AR to bring these artifacts to life and place them beside everyday objects. This way, people can see how everyday items have evolved over time and learn about culture and history in a fun, intuitive way.
What it does
By scanning an object in their home, users can explore what that object was like in the past and learn the story behind it. The Lens first shows a hint telling users to look for a mug, bottle, phone, or shoe. When they find one of these objects and tap the screen, the related historical artifact and its description appear. Users can drag the artifact onto any surface, as well as rotate and scale it. When they finish exploring the current item and point the phone away, the content disappears, ready for the next scan.
How I built it
I used Lens Studio's Scan template as a base and gradually built the Lens on top of it. The basic idea is to show an interactive 3D model and text when a certain item is recognized by the machine learning model. In a script, I send custom trigger messages to enable different scene objects based on the scan results. The 3D assets come from Sketchfab's public domain collections contributed by various cultural organizations. I used surface tracking together with the manipulation component to make the objects interactive.
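Below is a minimal sketch of how scan results can be mapped to scene objects and custom triggers. The `handleScanResult` hook, the input names, and the trigger names are placeholders I've assumed for illustration; the actual wiring to the Scan template's ML output depends on the template version. `sendCustomTrigger` requires a Behavior script to be present in the scene.

```javascript
// ScanResultHandler.js — illustrative sketch, not the exact project code.
// Assumes the Scan template forwards the recognized label to
// script.api.handleScanResult (hook name is hypothetical).

//@input SceneObject mugArtifact
//@input SceneObject bottleArtifact
//@input SceneObject phoneArtifact
//@input SceneObject shoeArtifact

// Map each recognized label to the scene object holding its 3D model and text.
var artifactsByLabel = {
    "mug": script.mugArtifact,
    "bottle": script.bottleArtifact,
    "phone": script.phoneArtifact,
    "shoe": script.shoeArtifact
};

// Hide everything so only the artifact for the current scan is visible.
function hideAllArtifacts() {
    for (var label in artifactsByLabel) {
        if (artifactsByLabel[label]) {
            artifactsByLabel[label].enabled = false;
        }
    }
}

// Called with the label returned by the scan (e.g. "mug").
function handleScanResult(label) {
    hideAllArtifacts();
    var artifact = artifactsByLabel[label];
    if (artifact) {
        artifact.enabled = true;
        // Notify Behavior scripts (e.g. the description UI) that an artifact is shown.
        global.behaviorSystem.sendCustomTrigger("show_" + label);
    }
}

hideAllArtifacts();
script.api.handleScanResult = handleScanResult;
```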
Challenges I ran into
This was my first time scripting in Lens Studio. I went through a lot of documentation to learn how to create custom interactions by sending triggers, calling APIs, and so on. It was also my first time building my own user interface, and it took me a while to figure out how to make the interactive text slider.
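The text slider could be approached roughly as below: a script that pages through description strings on a Text component when "next" and "previous" triggers fire. The trigger names and inputs are assumptions for this sketch, not the project's actual implementation.

```javascript
// DescriptionSlider.js — rough sketch of a paging text slider.
// Assumes "next_page" / "prev_page" custom triggers are sent by Behavior
// scripts attached to the arrow buttons (names are hypothetical).

//@input Component.Text descriptionText
//@input string[] pages

var currentPage = 0;

// Show the page at the given index, wrapping so the slider loops.
function showPage(index) {
    if (script.pages.length === 0) {
        return;
    }
    currentPage = (index + script.pages.length) % script.pages.length;
    script.descriptionText.text = script.pages[currentPage];
}

global.behaviorSystem.addCustomTriggerResponse("next_page", function () {
    showPage(currentPage + 1);
});

global.behaviorSystem.addCustomTriggerResponse("prev_page", function () {
    showPage(currentPage - 1);
});

showPage(0);
```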
Accomplishments that I'm proud of
I'm proud that I made this Lens by myself! I'm super happy with the result.
What I learned
From using the Scan module to building user interfaces, I learned a lot from this project, and I'm now familiar with the entire workflow of making an AR Lens.
What's next for Objects in Time
The next step is to add more objects for users to scan and explore.
Built With
- javascript
- lensstudio