Inspiration
We were inspired by a mix of existing betting applications, including online live dealer blackjack games and PokerStars VR, as well as by the launch of the Apple Vision Pro with its 3D video support.
What it does
A demo app showing simulated play for a 1:1 blackjack game with a "live" dealer. Users can play one round of blackjack: placing a bet, deciding how to play the hand (hit, stand, or double down), and experiencing a win or loss.
How we built it
UI design in Figma, video captured with a Kandao stereoscopic 180-degree VR camera, and development in Unreal Engine 5.
Challenges we ran into
Getting the camera angles right took a lot of adjustment. We spent significant time making the experience comfortable, for example tuning how far users needed to reach to tap actions or look down at their cards. Lighting was also a challenge; there's still room for improvement, but we've come a long way since we started.
Additionally, we faced the intricate task of imitating depth using stereoscopic 180-degree video. This involved aligning virtual hands to cast realistic shadows onto the real table in the stereo video, enhancing the sense of immersion. Another major challenge was lining up 3D objects and 3D UI to appear as though they were naturally sitting on the table in the stereo video. We engaged in extensive back-and-forth to get the button depth and color just right, ensuring they provided clear affordance for interactivity against the video background.
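The depth cue behind this is stereo disparity: the horizontal offset of a point between the left- and right-eye images, which the standard pinhole stereo model converts to distance. A minimal sketch of that conversion (function name and the example numbers are ours for illustration, not the actual UE5 implementation):

```python
def disparity_to_depth(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Recover depth from stereo disparity using the pinhole model:
    depth = focal_length * baseline / disparity.

    disparity_px: horizontal offset of a feature between the two eyes, in pixels
    focal_px:     camera focal length, in pixels
    baseline_m:   distance between the two camera lenses, in meters
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive (zero means infinitely far)")
    return focal_px * baseline_m / disparity_px

# e.g. a 100 px disparity with a 1000 px focal length and a 65 mm baseline
# (roughly human interpupillary distance) implies the point is 0.65 m away
print(disparity_to_depth(100.0, 1000.0, 0.065))
```

Once a depth estimate like this exists per region of the video, virtual geometry (such as a shadow-casting proxy of the table) can be placed at matching distances so hand shadows land where the eye expects them.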
We ran several streaming tests, but some network and lag issues persist. We need to keep tweaking this to pull off the complete vision of the app.
Accomplishments that we're proud of
We were able to complete a closed-loop demo that lets us test the concept and gather user feedback. The betting mechanism seems to be fun for players and makes people want to keep tapping! Simulating a live dealer with 3D video, interactable 3D objects, and integrated UI is also pretty cool. Best of all, we were able to take the two flat images a stereoscopic camera captures and create simulated depth at runtime, so virtual objects such as your hand-tracked hands cast shadows on objects in the stereo video; this was something we never thought would be possible.
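Lining up virtual objects with the stereo video works in the opposite direction of depth recovery: given a target depth for, say, a button on the table, each eye's image of that object is shifted horizontally so its disparity matches what the video shows at that distance. A hedged sketch of that per-eye offset (illustrative names and numbers, not the actual engine code):

```python
def per_eye_offset_px(depth_m: float, focal_px: float, baseline_m: float) -> float:
    """Horizontal shift (in pixels) to apply to each eye's rendering of a
    virtual object so it appears at depth_m, consistent with the stereo
    video's disparity at that distance.

    Total disparity at depth_m is focal * baseline / depth; each eye
    carries half of it (shifted in opposite directions).
    """
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    total_disparity_px = focal_px * baseline_m / depth_m
    return total_disparity_px / 2.0

# e.g. placing a UI button 0.65 m away with a 1000 px focal length and
# 65 mm baseline means shifting each eye's image by 50 px
print(per_eye_offset_px(0.65, 1000.0, 0.065))
```

Matching this disparity (and keeping it matched as the head moves) is what makes 3D buttons read as "sitting on" the real table rather than floating in front of it.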
What we learned
Live streaming applications are HARD! A lot of traditional UX methods still apply when planning an interface for XR, but getting the feel right takes close collaboration with a developer and a lot of testing and iterating. We learned a lot about stereoscopic technology and the science behind how the brain perceives depth. We also learned a lot about video compression and runtime optimization to let 7.5K-resolution video run without a hitch on Quest 3 headset hardware.
What's next for Luxino
We're sharing the demo with as many people as possible to gather feedback that will guide where we go next. Our next big milestone is to find a casino partner to run a pilot program using a live dealer setup. We also believe this technology could have many uses outside of gaming (e.g., live 1:1 interactions such as VIP celebrity meetups or private therapy sessions).
Built With
- figma
- kandao-vr
- unreal-engine