Call for Code Challenge organized by IBM Watson. It encourages participants to create solutions that significantly improve preparedness for natural disasters and relief efforts when they strike, in order to safeguard the health and well-being of communities.

What it does

Magic Aid builds an augmented-reality 3D model of the affected area from video captured by the user, then uses AI to estimate the damage cost and determine whether the area is safe to enter.
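The description does not specify how the damage cost or the safety decision is computed, so the following is only a minimal sketch of how such an estimate could be aggregated. The damage labels, per-unit repair costs, severity scale, and safety threshold are all illustrative assumptions, not values from the project.

```python
from dataclasses import dataclass

@dataclass
class DamageRegion:
    label: str        # damage type a classifier might output (assumed names)
    area_m2: float    # estimated surface area of the damaged region
    severity: float   # 0.0 (cosmetic) .. 1.0 (structural failure)

# Assumed per-square-metre repair costs by damage type (made-up numbers).
UNIT_COST = {"cracked_wall": 120.0, "collapsed_roof": 450.0, "broken_window": 80.0}

def estimate_damage(regions, safety_threshold=0.7):
    """Aggregate a total repair cost and a go/no-go safety flag."""
    total_cost = sum(
        UNIT_COST.get(r.label, 100.0) * r.area_m2 * (0.5 + r.severity)
        for r in regions
    )
    # Deem the area unsafe if any single region is severely damaged.
    safe = all(r.severity < safety_threshold for r in regions)
    return total_cost, safe

regions = [DamageRegion("cracked_wall", 4.0, 0.3),
           DamageRegion("collapsed_roof", 10.0, 0.9)]
cost, safe = estimate_damage(regions)
# cost = 6684.0, safe = False (the collapsed roof exceeds the threshold)
```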

How we built it

We use the Magic Leap headset as an end-to-end solution. The 3D model is built in Unity from video captured by the Magic Leap's onboard cameras. Processing happens on-device, and the resulting 3D model is re-rendered in Unity.

Challenges we ran into

Re-rendering the 3D model in Unity, and communicating damage-cost information between IBM Watson's API and Unity.
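One way to bridge the two sides is a small JSON contract that the Unity client decodes. This is a sketch under assumed field names ("damage_cost", "safe_to_enter", "regions"); it is not IBM Watson's actual response schema.

```python
import json

# Hypothetical JSON payload a Watson-backed service might return; the field
# names below are our own assumptions, not part of any official IBM Watson
# response format.
raw = '{"damage_cost": 6684.0, "safe_to_enter": false, "regions": 2}'

def parse_assessment(payload: str):
    """Decode the service response into the values the client displays."""
    data = json.loads(payload)
    return float(data["damage_cost"]), bool(data["safe_to_enter"])

cost, safe = parse_assessment(raw)
# cost = 6684.0, safe = False
```

On the Unity side the equivalent decoding would be done in C# (e.g. with a JSON utility), but the contract itself stays the same.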

Accomplishments that we're proud of


What we learned

Textures can be tricky to work with.

What's next for Magic Aid

We would like to create a more fully fledged app that integrates eye tracking.
