ARtistry - From Sketch to Space
Inspiration
We were inspired by the gap between creative ideas and immersive technology. Many artists and designers have 2D sketches but no easy way to convert them into interactive AR experiences. Our goal was to empower creators to quickly bring their concepts to life in 3D and visualize them in real-world space.
What It Does
- Draw a 2D image
- Convert it into a 3D GLB model via our backend
- View and interact with the model in real-time AR

In short: Image (2D) → API → GLB (3D) → AR
How We Built It
- Frontend: React + Tailwind CSS for a clean UI
- Backend: FastAPI for file handling and model inference
- Model: Hugging Face stabilityai/stable-fast-3d for 3D generation
- Output: GLB models rendered in-browser using WebXR
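The backend flow can be sketched in plain Python: an uploaded sketch goes to the 3D-generation step, and the bytes coming back are sanity-checked as a valid GLB container before being handed to the viewer. This is a minimal illustration, not our actual code: the model call is a stub (the real stabilityai/stable-fast-3d inference is not shown), while the header check follows the published glTF 2.0 binary layout (magic `glTF`, version 2, total length).

```python
import struct

GLB_MAGIC = 0x46546C67  # ASCII "glTF", little-endian

def is_valid_glb(data: bytes) -> bool:
    """Check the 12-byte GLB header: magic, version, total length."""
    if len(data) < 12:
        return False
    magic, version, length = struct.unpack_from("<III", data, 0)
    return magic == GLB_MAGIC and version == 2 and length == len(data)

def generate_3d_model(image_bytes: bytes) -> bytes:
    """Stub for the stable-fast-3d inference call (not shown here)."""
    # Real code would run the Hugging Face model and return its GLB output;
    # this stand-in returns a minimal header-only GLB so the flow runs end to end.
    return struct.pack("<III", GLB_MAGIC, 2, 12)

def handle_upload(image_bytes: bytes) -> bytes:
    """Backend flow: image in, validated GLB out."""
    glb = generate_3d_model(image_bytes)
    if not is_valid_glb(glb):
        raise ValueError("model did not return a valid GLB file")
    return glb
```

Validating the header before returning the file means a failed or truncated generation surfaces as a clear server error instead of a silently broken model in the AR viewer.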
Challenges
- Integrating a 2D → 3D model pipeline with acceptable latency
- Debugging API endpoints and CORS issues
- Achieving realistic AR rendering and proper model scaling
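One way to tackle the model-scaling challenge is to normalize each generated model so its largest bounding-box dimension maps to a chosen real-world size before placing it in AR. A minimal sketch of that idea (the 1-metre default and the function name are illustrative, not our actual code):

```python
def uniform_ar_scale(bbox_min, bbox_max, target_size_m=1.0):
    """Return one scale factor that fits the model's largest
    bounding-box dimension to target_size_m metres in AR space."""
    extents = [hi - lo for lo, hi in zip(bbox_min, bbox_max)]
    largest = max(extents)
    if largest <= 0:
        raise ValueError("degenerate bounding box")
    return target_size_m / largest
```

Applying a single uniform factor keeps the model's proportions intact, which matters because generated meshes arrive in arbitrary units.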
Accomplishments
- Functional end-to-end image → 3D → AR pipeline
- Responsive and intuitive user interface
- Real-time AR visualization within the hackathon timeframe
What We Learned
- Practical integration of AI 3D generation models
- The importance of seamless UX in AR products
- Team collaboration under time pressure
What's Next
- Reduce inference latency for faster 3D generation
- Improve mobile AR responsiveness
- Enable user collaboration and sharing
- Build e-commerce plugins for AR product previews