Our creator community's growing creativity and desire for self-expression drive us. We wanted to empower the community to create their own unique style and identity through the most fashionable and creative means possible.
What it does
The Lens offers two custom fashion options: a puffer jacket and a cropped sweater. Users can personalize their selection by uploading an image or video and adjusting its size and rotation. The Lens also has a "draw" option, which lets users paint over the selected media or use the garment as a blank canvas. Drawings can be saved and loaded in future sessions, even after closing the app. Users can also try on their creation virtually by switching cameras.
How we built it
First, we set out to project media onto the fashion pieces. We wanted users to be able to move while wearing the projected clothing, which pushed us to use tri-planar projection.

Second, we wanted users to be able to draw on the clothing. The drawing had to be seamless (no visible seams around UV edges), wearable (stored at UV positions instead of world positions), and savable. To prevent visible UV seams, we project the entire drawing from the camera onto the mesh, instead of only projecting the position the user is touching. This gives a more seamless, premium drawing experience: the user can draw on the object as if drawing on paper. Projection mapping also let us draw in UV space instead of world-position space. Even though this made the Lens far more complex to develop, it preserves freedom of movement while trying on the fashion piece.

Last but not least, we wanted users to be able to save their creative expressions. Because we cannot save a texture through the Lens Cloud API, we came up with a solution that pushes the user's brush strokes to the cloud as JSON data. When users want to retrieve their creation, they simply tap a button that downloads the JSON data and replays the steps they took previously, letting them watch their creation being redrawn in real time.
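The core of tri-planar projection can be sketched in a few lines. This is an illustrative helper, not our actual shader code: the surface is textured from three axis-aligned projections, blended by how closely the surface normal faces each axis, so the mapping stays stable as the wearer moves.

```typescript
// Tri-planar projection: blend three planar projections (along X, Y, Z),
// weighted by how strongly the surface normal aligns with each axis.
type Vec3 = [number, number, number];

// Raise |n| to a sharpness power so faces snap toward their dominant
// axis, then normalize so the three weights sum to 1.
function triplanarWeights(normal: Vec3, sharpness = 4): Vec3 {
  const w: Vec3 = [
    Math.pow(Math.abs(normal[0]), sharpness),
    Math.pow(Math.abs(normal[1]), sharpness),
    Math.pow(Math.abs(normal[2]), sharpness),
  ];
  const sum = w[0] + w[1] + w[2];
  return [w[0] / sum, w[1] / sum, w[2] / sum];
}

// Each axis projection samples the texture with the other two world
// coordinates as UVs; here we just return those three UV pairs.
function triplanarUVs(worldPos: Vec3): [number, number][] {
  const [x, y, z] = worldPos;
  return [
    [y, z], // projection along X
    [x, z], // projection along Y
    [x, y], // projection along Z
  ];
}
```

In a real material this blend runs per-fragment on the GPU; the sketch only shows the weighting math.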
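The save/replay idea above can be sketched as follows. Instead of persisting the painted texture, each brush stroke is persisted as data and re-run on load. The names here (`Stroke`, `replayStrokes`) are illustrative, not the actual Lens Cloud Storage API.

```typescript
// Each stroke records its brush settings plus the UV positions it
// touched. UV space (not world space) keeps drawings valid while the
// garment moves.
interface Stroke {
  color: string;               // e.g. "#ff0066"
  size: number;                // brush radius in UV space
  points: [number, number][];  // UV positions touched by the brush
}

// Upload side: serialize the stroke list as a JSON payload.
function serializeStrokes(strokes: Stroke[]): string {
  return JSON.stringify(strokes);
}

// Download side: parse the payload and redraw every point through a
// drawing callback, which is where the Lens would stamp the brush
// into the clothing texture. Returns how many points were replayed.
function replayStrokes(
  json: string,
  draw: (uv: [number, number], color: string, size: number) => void
): number {
  const strokes: Stroke[] = JSON.parse(json);
  let drawn = 0;
  for (const stroke of strokes) {
    for (const uv of stroke.points) {
      draw(uv, stroke.color, stroke.size);
      drawn++;
    }
  }
  return drawn;
}
```

Replaying point by point is also what makes the "watch it redraw in real time" effect possible: the draw callback can be throttled across frames instead of run all at once.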
Challenges we ran into
The biggest challenge was mapping screen UV coordinates to the mesh in order to draw on it. This required translating touch-screen input onto the 3D mesh using screen positions and matrix multiplication. To achieve this, we had to recreate much of the Snapchat engine's functionality ourselves, learning a lot in the process. The jacket was animated and could move, making it even harder to draw on. To store the user's drawings, we save the drawing steps as JSON data in the cloud and retrieve them, instead of saving a texture. This decision gives the user more control and increases the potential impact and future possibilities of integration.
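A minimal sketch of the screen-to-mesh step, assuming a simplified pinhole camera at the origin looking down -Z (the real Lens works with the camera's full view/projection matrices): the touch position becomes a ray in camera space, and the ray is intersected with geometry, here a flat plane standing in for the mesh hit test.

```typescript
type Vec3 = [number, number, number];

// screenX/screenY are normalized to [0, 1] with (0, 0) at the top-left.
// Returns a unit ray direction in camera space.
function screenPointToRay(
  screenX: number,
  screenY: number,
  fovY: number,   // vertical field of view in radians
  aspect: number  // width / height
): Vec3 {
  const tanHalf = Math.tan(fovY / 2);
  const x = (2 * screenX - 1) * tanHalf * aspect;
  const y = (1 - 2 * screenY) * tanHalf;
  const len = Math.hypot(x, y, 1);
  return [x / len, y / len, -1 / len]; // camera looks down -Z
}

// Intersect the ray with the plane z = planeZ (stand-in for the mesh).
// Returns null when the ray points away from the plane.
function intersectZPlane(dir: Vec3, planeZ: number): Vec3 | null {
  if (dir[2] >= 0) return null;
  const t = planeZ / dir[2]; // ray origin is (0, 0, 0)
  return [dir[0] * t, dir[1] * t, planeZ];
}
```

With a real mesh, the plane intersection is replaced by a ray/triangle test, and the hit triangle's interpolated UVs give the position to paint.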
Accomplishments that we're proud of
We are super proud that the experience is user-friendly and accessible, even though the technology behind the Lens is highly complex.
What we learned
From matrix multiplications to logarithmic depth calculations, our creative approach forced us to gain a deeper understanding of the inner workings of 3D engines. We also learned how to translate these highly technical feats into simple, user-friendly AR experiences.
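For a flavor of the logarithmic depth idea mentioned above: instead of the standard 1/z depth buffer, distance is remapped through a logarithm so precision is spread more evenly across the view range. This is the generic textbook formulation with illustrative constants, not the engine's internal one.

```typescript
// Map a view-space distance z in [0, far] to a [0, 1] depth value.
// The constant c trades precision near the camera against the far range.
function logDepth(z: number, far: number, c = 1): number {
  return Math.log(c * z + 1) / Math.log(c * far + 1);
}
```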
What's next for Virtual Stitch
The next step would be adding more fashion items to the Lens using Lens Cloud Storage, giving users more choice in what fits their style and letting them combine several items to create a look. We would also like to add more customization options to the Lens, such as:
- Textile customization: Users can select different fabric textures or patterns for their designs.
- Embellishment customization: Users can add different embellishments such as buttons, studs, or patches to their designs.
- Text customization: Users can add text or monograms to their designs.
A potential future feature is to let users bring their digital creations to life by producing a physical version of their item that they can wear in real life. This would require connecting the Lens to an external database, which may not be feasible in the short term.