Tired of 2D photos for visualization and decision making?

What it does

RealAR stitches 2D photos of an object into a 3D model, which is then placed in an interactive augmented-reality scene for critical visualization and decision making by users in remote areas.

The client (seller) iOS application sends a group of photos of an object to the server.

On the server, the photos are reconstructed into a 3D object.

The 3D object is delivered to the client (buyer) and shown in an interactive AR environment.

3D objects from different sellers can be viewed simultaneously in a single simulation.

3D objects can be scaled, moved around, and analyzed in the 3D view.

The app gives buyers and sellers a shared, interactive 3D platform.
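The end-to-end flow above can be sketched as a minimal mock pipeline. Everything here (the `Model3D` class, `reconstruct`, `place_in_scene`) is an illustrative stand-in, not our actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Model3D:
    """Toy stand-in for a reconstructed 3D asset (e.g. a USDZ file)."""
    source_photo_count: int
    scale: float = 1.0

def reconstruct(photos: list) -> Model3D:
    # Stand-in for the server-side photogrammetry step.
    return Model3D(source_photo_count=len(photos))

def place_in_scene(models: list) -> dict:
    # Stand-in for the buyer's AR scene: models from several
    # sellers can sit in the same simulation at once.
    return {"objects": models}

# Seller uploads photos; server reconstructs; buyer views the result.
seller_a_photos = [b"photo1", b"photo2", b"photo3"]
seller_b_photos = [b"photo4", b"photo5"]
scene = place_in_scene([reconstruct(seller_a_photos),
                        reconstruct(seller_b_photos)])

# The buyer can rescale an object inside the shared simulation.
scene["objects"][0].scale = 2.0
```

The point of the sketch is the shape of the pipeline: photos in, one model per seller out, and all models manipulable in one scene.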

How we built it

We built the client iOS app in Swift, and the interactive augmented models were built using RealityKit and ARKit.

The 2D-to-3D photogrammetry modelling was done through Apple's Metal graphics API.

Google Cloud Platform and a Python server handle the interaction between the mobile clients and the server for image and file transfer.

These GCP APIs store incoming images in bucket storage and serve results back from the photogrammetry engine.
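One way the image transfer can be organized is to group each seller's capture session under a single object-key prefix, so the photogrammetry engine can fetch all input images for a session at once. The sketch below mocks bucket storage with a dict; the key scheme and the `InMemoryBucket` class are assumptions for illustration, not our production GCP code:

```python
class InMemoryBucket:
    """Dict-backed stand-in for a cloud storage bucket."""

    def __init__(self):
        self._blobs = {}

    def upload(self, key: str, data: bytes) -> None:
        # Mirrors "store images to bucket storage".
        self._blobs[key] = data

    def list_prefix(self, prefix: str) -> list:
        # Mirrors listing all photos belonging to one capture session.
        return sorted(k for k in self._blobs if k.startswith(prefix))

def photo_key(seller_id: str, session_id: str, filename: str) -> str:
    # Hypothetical key layout: one prefix per seller upload session.
    return f"uploads/{seller_id}/{session_id}/{filename}"

bucket = InMemoryBucket()
for name in ("img_001.jpg", "img_002.jpg"):
    bucket.upload(photo_key("seller42", "sess1", name), b"...")

session_keys = bucket.list_prefix("uploads/seller42/sess1/")
```

With a real bucket, `InMemoryBucket` would be replaced by the storage client, but the session-prefix layout stays the same.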

Challenges we ran into

Running the graphics-intensive computation that turns 2D images into 3D models.

Creating interactive placement of augmented 3D objects.

Deploying a server capable of the graphics computation.

Accomplishments that we're proud of

Creating a lifelike 3D model from a group of 2D photos.

Placing several 3D models into one simulation.

Getting server and client interaction working in a short time.

What we learned

3D modelling, AR creation and manipulation, time management, and teamwork.

What's next for RealAR

Real-time interaction between two end users at remote locations in a shared augmented environment.

Built With

arkit, google-cloud, metal, python, realitykit, swift
