Sustainability doesn't only apply to physical production. As we all know, AR can do a lot for the fashion industry through its capabilities, but what if we could reduce well-to-wheels (WtW) emissions by shopping more wisely, trying items on before buying them online instead of having to exchange or return them? We can cut transport and its greenhouse-gas impact on the climate by viewing, trying, and matching items before adding them to our online basket.
What it does
This digital wardrobe contains different garment pieces and accessories, and Snapchatters can navigate between items by showing the palm of the right or left hand. I've tried to fit in as many try-on experiences as possible to complete a full look, including two styles of bags that appear on the left or right wrist.
How we built it
All garments were initially made in the Clo3d design app, then optimized and prepared for Lens Studio in Blender (e.g. adding vertex paint for flow simulation and reducing vertex counts). The try-on template is used because it makes choosing between items super easy for the Snapchat user, and I added some VFX to show which items are suitable for rainy or snowy weather.
Challenges we ran into
Optimizing the size of this Lens to below 5 MB for better performance, since I unfortunately ran into problems using Remote Assets.
Accomplishments that we're proud of
This project is a good example of using variant trackers and many Snap AR capabilities along with cloth simulation, all in a single project, without losing performance quality or speed on different devices, by keeping the size under 5 MB.
What we learned
How to use custom components, and the basics of Remote Assets.
What's next for Digital OOTD
The sharing API could be a good way for users to instantly communicate with friends and family to choose items to shop for and try on themselves, or to help others pick a nice outfit through AR without needing to be physically present.