Our team member Hamza identified a problem faced by 3D environment artists: downloading a full texture package from a website is often locked behind a paywall or simply unavailable. Usually only a diffuse map is freely available, which falls short of 2023 standards for 3D design. From that came the idea of an ML model that generates the missing normal, displacement, and roughness maps.

What it does

Our tool lets developers convert a texture into the required maps using an ESRGAN model, then displays the texture with its respective shaders applied, allowing quick prototyping through three.js and our own GUI for manipulating the scene.

How we built it

We used joeyballentine's Material-Map-Generator as the heart of the server, converting a color image into three types of maps: displacement, normal, and roughness. A Flask server hands the generated map files over to the client. The client uses three.js to render the texture with the depth shader applied, and allows fine-tuning of lighting and other shader parameters.

Challenges we ran into

We had to learn a lot of new technologies: a few of us had to brush up on our Python, learn Flask, learn three.js, and learn how to use the pretrained model to generate maps. Most of the challenges came from having the server send the converted images to the client, and from integrating the client and server in general.
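One concrete piece of that integration is unpacking the server's response on the client side before handing the maps to the renderer. The helper below is a hypothetical sketch, assuming the server bundles the maps into a single zip (an assumption for illustration, not necessarily our actual wire format):

```python
# Hypothetical client-side helper: split a zipped server response into
# per-map byte blobs keyed by map name (e.g. "normal", "roughness").
import io
import zipfile

def unpack_maps(zip_bytes):
    """Return {map_name: image_bytes} from a zipped response body."""
    out = {}
    with zipfile.ZipFile(io.BytesIO(zip_bytes)) as zf:
        for name in zf.namelist():
            # Strip the file extension so callers can look up maps by type.
            out[name.rsplit(".", 1)[0]] = zf.read(name)
    return out
```

Each blob can then be turned into a texture object and assigned to the corresponding shader slot in the renderer.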

Accomplishments that we're proud of

We are proud of how quickly we gained an understanding of three.js and integrated it into our tool. We are also proud of our server and how easy it is for any client to use.

What we learned

Among the many technologies we learned, we all gained a deeper understanding of machine learning and what it takes to test a model, scour for effective datasets, and generate the desired output. We also learned how to set up servers with Flask and do simple API testing with Postman.

What's next for Depth^2

We look forward to continuing work on Depth^2: implementing new features such as specular maps, refining the front-end user experience, and, most importantly, fine-tuning the model on more datasets for more predictable, accurate, and desirable results.
