Dress Runway addresses the problem of online shoppers who are not convinced by the photos and videos shown on a shop's website. We aim to let shoppers get a better feel for how a dress would look in their everyday environment.
What it does
This AR project is a marketing solution for fashion brands built with Spark AR on Instagram. It lets users choose a dress from the AR filter gallery and see a three-dimensional render of it, showing how the dress would look in their everyday environment.
First, the user taps to choose where to place the model. The dressed model then appears at the tapped spot.
Next, the user can change the color of the dress through the UI picker. They can also change the pose of the model by tapping and holding on it, letting them see how the dress looks in different poses.
The rendered dress has a realistic-looking texture thanks to the normal map applied to the dress material. The user can adjust the lighting cast on the model through the UI slider, matching the model's brightness to the actual room brightness so the dress looks more naturally embedded in the scene.
How we built it
We used Marvelous Designer to create the dressed avatar and lowered its polygon count in Blender. The 3D model comes to around 1.2 megabytes, less than half of the 4-megabyte limit. We then uploaded the model to Mixamo for automatic rigging and downloaded several animations from there to play in Spark AR.
In Spark AR Studio, we used both scripting and the Patch Editor to implement the user interface, controls, and animations.
- Color selection was implemented in script.
- The Patch Editor handles animation selection on long press, object movement, brightness control, and the particle effects and animation used for 3D transitions.
- We created an animated floor of white dots using shader control in the Patch Editor.
Challenges we ran into
- Making a dressed 3D character compact enough to fit the 4 MB size limit. Beyond the learning curve of the tools, it was important to keep the level of detail and aesthetics high while shrinking the asset.
- Strategizing the best split between the Patch Editor and script. Each new feature raised the question of where and how to implement it. Experimenting with how every feature could be done in both paradigms, and thinking about how they could be seamlessly integrated, were unfamiliar challenges.
Accomplishments we're proud of
- As the leader, I am most proud of being able to break down one big abstract goal into small, concrete tasks. This made it easier to split work among my teammates, and striving for several short-term goals instead of one big goal kept us motivated.
What we learned
- We learned that creating and sharing AR effects in Spark AR is quick and user friendly, and that the platform lets AR artists reach hundreds of millions of people.
What's next
The current Dress Runway addresses the user's initial problem of having only a 2D photo as a reference. The following can be done to improve the user experience.
- Better lighting for the dress. We can use the camera texture to extract its dominant colors and adjust the ambient and point lights, so that the dress's rendered color is more physically accurate to reality.
- Refining the 3D model. With a more experienced 3D artist, we could get a more realistic 3D dress that is still size efficient. The current dress is weight-painted to a single human rig, but loose cloth should be rigged separately to move more realistically.
Another problem for online shoppers is size measurement. While we are not sure of the technical capabilities of the current Spark AR, we believe AR could enable the following features.
- Automatically taking a person's main measurements to check whether a dress fits, including bust, waist, and hip sizes, as well as height.
- Augmenting a person with a 3D dress. This involves matching a rigged dress to a detected human pose and body size.