Inspiration

A collaborative drawing app that turns your drawing into a 3D model.

What it does

Converts 2D drawings into a 3D model representation, with no model training required.

How we built it

Built with React and Flask. The user draws on a canvas (using fabric.js); the drawing is forwarded to ControlNet or the MiDaS depth estimator, which generates a depth map. That depth map is then passed to a three.js renderer, which generates the 3D view on the user's side.
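The core server-side step is turning that depth map into 3D geometry. A minimal sketch of the idea, assuming the depth map arrives as an HxW numpy array (the function name and `fov_scale` parameter are illustrative, not the project's actual code):

```python
import numpy as np

def depth_to_point_cloud(depth, fov_scale=1.0):
    """Unproject an HxW depth map into an Nx3 point cloud.

    Each pixel (u, v) becomes one 3D point: x and y come from the
    pixel grid normalized to [-1, 1], z comes from the depth value.
    """
    h, w = depth.shape
    # Pixel-grid coordinates normalized to [-1, 1]
    xs = np.linspace(-1.0, 1.0, w) * fov_scale
    ys = np.linspace(-1.0, 1.0, h) * fov_scale
    xv, yv = np.meshgrid(xs, ys)
    # Flip y so the image's top row maps to positive y in 3D
    points = np.stack([xv.ravel(), -yv.ravel(), depth.ravel()], axis=1)
    return points
```

The resulting Nx3 array can be shipped to the client as JSON and fed into a three.js `BufferGeometry` for rendering.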

Challenges we ran into

ControlNet is very computationally heavy, and our jobs on the Kamiak cluster were not running.

Accomplishments that we're proud of

Found MiDaS, a much less computationally heavy option. Transformed a PNG into a .ply 3D model.
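The PNG-to-.ply step boils down to writing an ASCII PLY header followed by one vertex line per point. A minimal sketch under that assumption (the function name is ours, not the project's actual code):

```python
import numpy as np

def write_ply(points, path):
    """Write an Nx3 point array to an ASCII .ply point-cloud file."""
    header = "\n".join([
        "ply",
        "format ascii 1.0",
        f"element vertex {len(points)}",
        "property float x",
        "property float y",
        "property float z",
        "end_header",
    ])
    with open(path, "w") as f:
        f.write(header + "\n")
        # One "x y z" line per vertex, as required by the ASCII format
        for x, y, z in points:
            f.write(f"{float(x)} {float(y)} {float(z)}\n")
```

Any standard 3D viewer (MeshLab, Blender) can open the resulting file directly, which makes it easy to sanity-check the depth estimation output.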

What we learned

What's next for Crimson Cowboys

Use Meta's SAM (Segment Anything Model) to detect objects so that the 3D render is better and can handle different objects in the same picture (i.e., being able to manipulate a single object on the canvas).

Built With

react, flask, fabric.js, three.js
