Inspiration

This coming September marks the 20th anniversary of the tragic events of 9/11 at the World Trade Center. At the memorial site of the former Twin Towers, two of the largest man-made waterfalls in North America mark the footprints of the lost buildings, and along the rectangular perimeter of each waterfall are 76 panels listing the victims' names. Recent advances in augmented reality, live OCR, lidar capture, digital twinning, and XR cloud streaming present an opportunity to enhance the experience for visitors to the memorial site.

What it does

The proposed WTC Memorial App will allow visitors to scan any of the memorial's lists of names and determine the exact panel (labeled N1-N76 for the North Tower and S1-S76 for the South Tower) using a cloud OCR service such as AWS Rekognition; the demo app uses Tesseract to extract the panel's text. Once the panel number is determined, a Unity asset bundle and AR image-target database for that specific panel can be loaded to create interactive AR hotspots on the panel. Touching a name will display the individual's photograph and, where available, a short bio; this information is currently available on the official 9/11 WTC Memorial website. Determining the exact panel also establishes the user's position relative to the site, so the 152 panels become potential AR location anchors from which to position and view AR replicas of the original Twin Towers floating over the waterfalls. XR streaming services such as ISAR, combined with 5G edge computing, will allow high-resolution digital twins of the towers to be rendered in the cloud and streamed to the mobile device.
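A minimal sketch of that first step, assuming the OCR text has already been returned as a string. The regex, the per-panel bundle naming scheme (panel_N42), the CDN URL, and the prefab name are illustrative assumptions, not the app's actual implementation; in the full app the string passed in would come from the Tesseract or Rekognition response.

```csharp
using System.Collections;
using System.Text.RegularExpressions;
using UnityEngine;
using UnityEngine.Networking;

public class PanelLocator : MonoBehaviour
{
    // Hypothetical CDN location for the per-panel asset bundles.
    const string BundleBaseUrl = "https://example.com/wtc-memorial/bundles/";

    // Matches panel IDs such as "N-42", "N42", or "S 7" in the OCR output.
    static readonly Regex PanelId =
        new Regex(@"\b([NS])[-\s]?([1-9][0-9]?)\b", RegexOptions.IgnoreCase);

    // Entry point: called with the raw text returned by Tesseract (demo) or Rekognition (proposed).
    public void OnOcrText(string ocrText)
    {
        Match m = PanelId.Match(ocrText);
        if (!m.Success)
        {
            Debug.LogWarning("No panel number found in OCR text.");
            return;
        }

        string tower = m.Groups[1].Value.ToUpperInvariant(); // "N" or "S"
        int number = int.Parse(m.Groups[2].Value);           // expected range 1..76
        if (number < 1 || number > 76) return;

        StartCoroutine(LoadPanelBundle($"panel_{tower}{number}"));
    }

    // Downloads the asset bundle holding that panel's image targets and hotspot prefab.
    IEnumerator LoadPanelBundle(string bundleName)
    {
        using (UnityWebRequest req = UnityWebRequestAssetBundle.GetAssetBundle(BundleBaseUrl + bundleName))
        {
            yield return req.SendWebRequest();
            if (req.isNetworkError || req.isHttpError)
            {
                Debug.LogError($"Bundle download failed: {req.error}");
                yield break;
            }

            AssetBundle bundle = DownloadHandlerAssetBundle.GetContent(req);
            GameObject hotspots = bundle.LoadAsset<GameObject>($"{bundleName}_hotspots"); // hypothetical asset name
            Instantiate(hotspots);
        }
    }
}
```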

How we built it

The demo app was developed on the Unity platform (v2019.4.f12), leveraging SDK plugins for OCR, AR Foundation, Vuforia, and ISAR. The demo tests three specific features on location at the WTC site.
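For the AR Foundation path, the image-target piece can be wired up roughly as sketched below; the prefab and panel-naming details are placeholders rather than the demo's actual code, and the Vuforia path would use Vuforia's own target-event callbacks instead.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Lives on the AR Session Origin next to ARTrackedImageManager.
// When a panel image target is acquired, the hotspot prefab is parented to the
// tracked image so the name hotspots stay registered to the physical panel.
[RequireComponent(typeof(ARTrackedImageManager))]
public class PanelHotspotSpawner : MonoBehaviour
{
    public GameObject hotspotPrefab; // assumption: assigned after the per-panel bundle loads

    ARTrackedImageManager manager;

    void Awake()     { manager = GetComponent<ARTrackedImageManager>(); }
    void OnEnable()  { manager.trackedImagesChanged += OnChanged; }
    void OnDisable() { manager.trackedImagesChanged -= OnChanged; }

    void OnChanged(ARTrackedImagesChangedEventArgs args)
    {
        foreach (ARTrackedImage image in args.added)
        {
            // referenceImage.name is assumed to carry the panel ID (e.g. "N42").
            Debug.Log($"Panel image target acquired: {image.referenceImage.name}");
            Instantiate(hotspotPrefab, image.transform); // child of the tracked image
        }
    }
}
```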

Challenges we ran into

The primary challenge was not having access to a 5G device, which we tried to mitigate by using NOVA services to stream an ISAR XR instance hosted on AWS EC2. Since the proposed app is a location-based experience, better access to 5G equipment and services will be necessary. Access to the site for testing and asset creation was limited, but the visits were valuable for understanding the amount of effort needed to photograph and create the assets for each memorial panel.

Accomplishments that we're proud of

Creating a demo app that let us validate the features and functions of the proposed WTC Memorial App. Getting the OCR to recognize the names. Seeing the AR image targets recognized on the memorial panels of names. Seeing the AR replica of the Twin Towers positioned over the waterfalls where they once stood, which gave a sense of the immense scale of the original buildings at the current site.

What we learned

Creating an AR experience outdoors requires capturing image-target assets under different lighting conditions to ensure reliable AR recognition. We tested both the Vuforia and Wikitude plugins at the memorial site; Vuforia delivered by far the more reliable target locks, even under varying weather and lighting. AR drift and obtaining a correct location bearing will be the most difficult challenges; addressing them will require higher-quality scans of the site and low-latency 5G edge computing to create more visually stable representations.
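As a rough illustration of the drift mitigation we have in mind (not something the demo already implements), the tower replica can be pinned to an AR anchor once a recognized panel has fixed the bearing. This sketch assumes AR Foundation 4.x, where ARAnchor is a component, and pre-surveyed per-panel offsets that we do not yet have.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Once a recognized panel gives us a reliable bearing, place the Twin Towers replica
// at its surveyed offset from that panel and pin it with an ARAnchor so the AR session
// keeps correcting its pose instead of letting it drift.
public class TowerPlacer : MonoBehaviour
{
    public GameObject towerReplicaPrefab; // digital-twin model of the original towers
    public Vector3 offsetFromPanel;       // assumption: pre-surveyed offset for this panel ID

    public void PlaceRelativeTo(Transform panelImageTarget)
    {
        Vector3 position = panelImageTarget.TransformPoint(offsetFromPanel);
        Quaternion rotation = panelImageTarget.rotation;

        GameObject tower = Instantiate(towerReplicaPrefab, position, rotation);
        tower.AddComponent<ARAnchor>(); // AR Foundation 4.x: adding ARAnchor pins the object to world tracking
    }
}
```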

What's next for WTC Memorial App

Definitely more on-site testing, asset creation, and app tweaking. Getting access to 5G hardware will be necessary for future development. Securing funding and sponsorship for this WTC memorial initiative will also be a priority.

Built With

Unity, AR Foundation, Vuforia, Wikitude, Tesseract, AWS Rekognition, ISAR, AWS EC2