Inspiration

Droppod was inspired by the challenge of this virtual hackathon itself. We are a group of four: two of us live close enough to work together in person, but the other two do not. We thought it would be very cool to find a way to bring them into our space in real time.

What it does

Droppod allows a person to teleport themselves from one space to another. It is built as a proof of concept for AR video chat. Imagine a friend who lives 200 miles away: Droppod lets you see that friend in your living room in real time!

How we built it

Droppod has two main components: the traveler and the receiver. The traveler sends a live cutout of their body, and the receiver renders that body in their own environment in real time. We used the machine learning libraries TensorFlow.js and BodyPix to segment the traveler's body from their surroundings, then continuously streamed that segmented body into the receiver's AR scene using A-Frame and AR.js.
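As a rough sketch of what that segmentation step can look like (the element ids and render loop here are our own illustration, not necessarily the project's exact code):

```javascript
import '@tensorflow/tfjs';
import * as bodyPix from '@tensorflow-models/body-pix';

const video = document.getElementById('traveler-cam');  // hypothetical id
const canvas = document.getElementById('mask-canvas');  // hypothetical id
const ctx = canvas.getContext('2d');

async function run() {
  const net = await bodyPix.load();     // fetch the model weights
  canvas.width = video.videoWidth;      // match the video frame size
  canvas.height = video.videoHeight;

  async function frame() {
    // Per-pixel person/background segmentation of the current frame.
    const segmentation = await net.segmentPerson(video);

    // Opaque where BodyPix sees a person, transparent elsewhere.
    const mask = bodyPix.toMask(
      segmentation,
      { r: 0, g: 0, b: 0, a: 255 },  // foreground (person)
      { r: 0, g: 0, b: 0, a: 0 }     // background
    );

    // Stamp the mask, then keep only the video pixels under it.
    ctx.putImageData(mask, 0, 0);
    ctx.globalCompositeOperation = 'source-in';
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    ctx.globalCompositeOperation = 'source-over';

    requestAnimationFrame(frame);
  }
  frame();
}
run();
```

On the AR side, A-Frame accepts a canvas as a material source, so something like `<a-image material="src: #mask-canvas; transparent: true">` inside an AR.js `<a-marker>` can place the cutout into the receiver's scene.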

Challenges we ran into

The biggest challenge we ran into came at the beginning. At first we tried to reconstruct a 3D model of the person to show in the AR space, but that proved impractical in the short time we had, so we settled for showing a curved image instead. Another challenge was the segmentation itself: we had to make sure the algorithm captured enough of the body without grabbing too much of the background, and finding that balance was tricky.
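For reference, this trade-off maps onto BodyPix's `segmentationThreshold` option. A minimal sketch of the tuning knob, continuing the `net` and `video` from the sketch above (the 0.7 value is illustrative, not the value we shipped):

```javascript
// segmentationThreshold is the confidence a pixel needs before it counts
// as "person": too high and limbs get clipped, too low and background
// leaks in.
const segmentation = await net.segmentPerson(video, {
  internalResolution: 'medium',
  segmentationThreshold: 0.7,
});
```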

Accomplishments that we're proud of

We are proud that we stuck with our idea until the end. Although we had to scale back the requirements a bit due to time constraints, we have a working demo!

What we learned

We learned a lot about AR, A-Frame, body segmentation, and UI!

What's next for Droppod

We want to work on reconstructing a live 3D model of the traveler so that they look more real in the projected space!

Built With

TensorFlow.js, BodyPix, A-Frame, AR.js