Inspiration
I wanted to make immersive, tactile exploration of artifacts accessible to anyone with a web browser and a webcam.
What it does
It features a scroll-driven interactive page that educates the user about the artifact. In exploration mode, it uses computer vision to track the user's hands via their webcam, translating real-life hand movements into 3D interactions. This allows users to grab, rotate, and examine detailed 3D models of artifacts using their bare hands.
How we built it
The application was built with SvelteKit. For the 3D rendering pipeline, I used Threlte (a Svelte wrapper for Three.js) to compose scenes and manage materials, lighting, and camera paths. Real-time hand tracking was implemented with Google's MediaPipe Tasks Vision API (`@mediapipe/tasks-vision`).
Challenges we ran into
One of the hardest parts was mathematically mapping the 2D image coordinates and rough depth estimates from MediaPipe's hand tracking into the scene's 3D world space.
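One way that mapping can look (a simplified sketch, not the exact production code): MediaPipe reports each landmark in normalized image coordinates, with x and y in [0, 1] from the top-left corner and z as a rough wrist-relative depth, so the conversion amounts to mirroring for the selfie view, recentering, and scaling onto a virtual plane in front of the camera. The plane dimensions and depth gain below are illustrative assumptions.

```typescript
interface Landmark { x: number; y: number; z: number; }
interface Vec3 { x: number; y: number; z: number; }

/**
 * Map a normalized MediaPipe landmark onto a virtual plane in world space.
 * - Mirrors x so the on-screen hand matches the user's real hand (selfie view).
 * - Remaps [0, 1] image coordinates to [-1, 1] NDC-style coordinates.
 * - Scales by half the plane's world-space extent and applies a depth gain to z.
 */
function landmarkToWorld(
  lm: Landmark,
  planeWidth = 4,   // assumed world-space width covered by the camera view
  planeHeight = 3,  // assumed world-space height
  depthScale = 2,   // assumed gain on MediaPipe's relative depth estimate
): Vec3 {
  const ndcX = 1 - 2 * lm.x; // mirror horizontally for the selfie view
  const ndcY = 1 - 2 * lm.y; // flip: image y grows downward, world y grows upward
  return {
    x: ndcX * (planeWidth / 2),
    y: ndcY * (planeHeight / 2),
    z: -lm.z * depthScale,   // MediaPipe z is negative toward the camera
  };
}
```

In the real app the resulting vector would be fed to the Three.js object's position each frame; the calibration constants would need tuning per scene.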
Accomplishments that we're proud of
I built a highly polished, cinematic onboarding experience that feels intentional and premium.
What we learned
I gained experience with the MediaPipe vision APIs, specifically interpreting hand landmarks and translating them into usable interaction states.
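As an illustration of turning landmarks into a usable interaction state, a pinch "grab" can be derived from the thumb-tip to index-tip distance with a little hysteresis so the state doesn't flicker near the threshold. The landmark indices follow MediaPipe's standard 21-point hand numbering; the thresholds are illustrative assumptions, not the project's actual values.

```typescript
interface Landmark { x: number; y: number; z: number; }

const THUMB_TIP = 4; // MediaPipe hand landmark indices
const INDEX_TIP = 8;

/** Euclidean distance between two normalized landmarks. */
function dist(a: Landmark, b: Landmark): number {
  return Math.hypot(a.x - b.x, a.y - b.y, a.z - b.z);
}

/**
 * Turn raw landmarks into a stable "grabbing" state.
 * Hysteresis: the pinch must close below `closeAt` to start a grab and
 * open above `openAt` to release it, which prevents rapid on/off
 * flicker when the distance hovers near a single threshold.
 */
class PinchTracker {
  private grabbing = false;
  constructor(private closeAt = 0.05, private openAt = 0.09) {}

  update(landmarks: Landmark[]): boolean {
    const d = dist(landmarks[THUMB_TIP], landmarks[INDEX_TIP]);
    if (!this.grabbing && d < this.closeAt) this.grabbing = true;
    else if (this.grabbing && d > this.openAt) this.grabbing = false;
    return this.grabbing;
  }
}
```

Per frame, the tracker's boolean decides whether hand motion should rotate the model or be ignored.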
What's next for ZYX
More complex hand gestures could be introduced, such as a two-handed pinch-to-zoom or a pull-apart gesture that reveals an artifact's exploded view, along with interactions tailored to each artifact.
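As a rough sketch of how the two-handed pinch-to-zoom might work (hypothetical future work, not implemented): scale the model by the ratio of the current distance between the two hands to the distance captured when both hands first pinched.

```typescript
interface Point2 { x: number; y: number; }

/**
 * Sketch of a possible two-handed pinch-to-zoom gesture.
 * While both hands are pinching, the model's scale tracks the ratio of the
 * current inter-hand distance to the distance at gesture start; releasing
 * either hand ends the gesture and keeps the current scale.
 */
class TwoHandZoom {
  private startDist: number | null = null;
  private startScale = 1;
  scale = 1;

  /** Call each frame with both hands' pinch points (null when a hand releases). */
  update(left: Point2 | null, right: Point2 | null): number {
    if (!left || !right) {
      this.startDist = null;         // gesture ended; keep the current scale
      return this.scale;
    }
    const d = Math.hypot(left.x - right.x, left.y - right.y);
    if (this.startDist === null) {   // gesture just began
      this.startDist = d;
      this.startScale = this.scale;
    } else {
      this.scale = this.startScale * (d / this.startDist);
    }
    return this.scale;
  }
}
```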
Built With
- cloudflare
- mediapipe
- svelte
- tailwind
- three.js