We're inspired by the resilience of hurricane survivors and the courage of search and rescue teams.
Specifically, we were drawn to the story of "X-codes" - also called "Katrina crosses," though officially known as "search codes" - that were spray-painted on home after home in New Orleans by FEMA's Search and Rescue teams to communicate important information in the wake of Hurricane Katrina. Each quadrant of the X contained letters and numbers recording the date the home was searched, which team conducted the search, the hazards present, and the number of people found inside - living and dead.
For many in New Orleans, the X's have become controversial and political symbols that tell the tragic story of an entire city. Some people view them as reminders of the government's failure - and of the people's will to survive. Others see them as reminders of their trauma and loss. But for all New Orleanians who returned home, the X-codes remain a visceral reminder of the unimaginable toll Hurricane Katrina took on block after block of communities.
Reading their stories, we wanted to create an AR tool that could help first responders - like FEMA's Search and Rescue teams - quickly capture geometric data of the damage done to the city's streets, neighborhoods, and homes and share it with displaced survivors. This way, we might improve the quality of the data being captured while also reducing survivors' anxiety by letting them see and anticipate what to expect when they return home.
We also imagined other use cases for the technology, including journalism, education, real estate, and tourism.
Here's our video! https://youtu.be/aOhv9TxXIv8
What it does
Homeward uses two Ricoh Theta cameras (or a single Vuze) to rapidly capture the geometry of a space and view it as a 3D model in AR. We developed a novel processing pipeline to generate low-poly 3D meshes from stereo panoramas.
The user takes two panoramas separated by a fixed baseline distance; we built a slider rig to make this process easier. In the future, this could be replaced by two relatively cheap panoramic cameras (each costing $80-$200). The user then uploads the photos (or video frames!) to the algorithm.
We start by finding the stereo disparity between the images using a block matching algorithm (implemented in MATLAB). From the disparity, we perform filtering to produce a better depth map. Note that the depth map at this point is in a flat 360° equirectangular projection.
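Our actual implementation is in MATLAB and isn't shown here, but the core block-matching idea can be sketched in Python/NumPy. This is a naive sum-of-absolute-differences (SAD) matcher with illustrative parameter values, not our tuned pipeline:

```python
import numpy as np

def block_match_disparity(left, right, block=5, max_disp=16):
    """Naive SAD block matching on a rectified grayscale pair.

    left, right: 2D float arrays where corresponding points share
    the same row. Returns an integer disparity map (same shape).
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.int32)
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = left[y - half:y + half + 1, x - half:x + half + 1]
            best_cost, best_d = np.inf, 0
            # Slide the candidate window leftward in the right image.
            for d in range(min(max_disp, x - half) + 1):
                cand = right[y - half:y + half + 1,
                             x - d - half:x - d + half + 1]
                cost = np.abs(patch - cand).sum()  # SAD cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

Real implementations vectorize the cost volume and add sub-pixel refinement; this loop version just makes the matching logic explicit.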
We fixed erroneous points in the depth map by running a standard deviation filter on one of the color images. The system requires texture to work, so we down-weight areas with low texture.
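A local standard-deviation filter of this kind can be sketched with NumPy alone. The window size and threshold below are illustrative assumptions, not our actual parameters:

```python
import numpy as np

def local_std(img, win=5):
    """Standard deviation of each win x win neighborhood,
    via the identity var = E[x^2] - E[x]^2 with box filtering."""
    pad = win // 2
    padded = np.pad(img.astype(np.float64), pad, mode='edge')

    def box_mean(a):
        # Integral image makes each box sum O(1) per pixel.
        c = np.cumsum(np.cumsum(a, axis=0), axis=1)
        c = np.pad(c, ((1, 0), (1, 0)))
        h, w = img.shape
        s = (c[win:win + h, win:win + w] - c[:h, win:win + w]
             - c[win:win + h, :w] + c[:h, :w])
        return s / (win * win)

    mean = box_mean(padded)
    mean_sq = box_mean(padded ** 2)
    var = np.maximum(mean_sq - mean ** 2, 0.0)  # clamp float noise
    return np.sqrt(var)

def texture_confidence(img, win=5, sigma_min=0.05):
    """Weight in [0, 1]: near zero where the image is flat,
    i.e. where the stereo depth is unreliable."""
    s = local_std(img, win)
    return np.clip(s / sigma_min, 0.0, 1.0)
```

Multiplying the depth map's confidence by `texture_confidence` lets later stages ignore or interpolate over textureless regions.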
From there, we unwrap the depth map to generate a dense point cloud. Generating a satisfying mesh through standard Poisson surface reconstruction proved really challenging. Instead, we sample vertex indices directly from the flat equirectangular depth image, build a standard grid of mesh indices, and then transform the vertex positions. This led to a much more satisfying mesh.
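The two ideas here - unwrapping equirectangular pixels into 3D points, and connecting grid neighbors into triangles - can be sketched as follows. Axis conventions and the exact latitude/longitude mapping are assumptions for illustration:

```python
import numpy as np

def equirect_to_points(depth):
    """Map an H x W equirectangular depth map to 3D points.

    Columns span longitude [-pi, pi); rows span latitude
    [pi/2, -pi/2]. Depth is radial distance from the camera.
    """
    h, w = depth.shape
    lon = (np.arange(w) / w) * 2 * np.pi - np.pi
    lat = np.pi / 2 - (np.arange(h) / (h - 1)) * np.pi
    lon, lat = np.meshgrid(lon, lat)
    x = depth * np.cos(lat) * np.cos(lon)
    y = depth * np.sin(lat)
    z = depth * np.cos(lat) * np.sin(lon)
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)

def grid_faces(h, w):
    """Triangle indices for a regular H x W vertex grid:
    two triangles per grid cell, vertices in row-major order."""
    idx = np.arange(h * w).reshape(h, w)
    a = idx[:-1, :-1].ravel()   # top-left of each cell
    b = idx[:-1, 1:].ravel()    # top-right
    c = idx[1:, :-1].ravel()    # bottom-left
    d = idx[1:, 1:].ravel()     # bottom-right
    return np.concatenate([np.stack([a, b, c], 1),
                           np.stack([b, d, c], 1)])
```

Because the vertex connectivity comes straight from the pixel grid, no surface reconstruction is needed - the mesh topology is fixed and only the vertex positions depend on the depth values.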
We then display it all using Vuforia in Unity on a phone. We implemented pinch-to-zoom, as well as left and right buttons for rotation.
In addition, we implemented a pointer animation: if you tap the screen, a pin drops from the sky. This was fun!
The system struggles when there isn't enough texture. We have ideas for solving this - segmenting regions of the image first, or using deep learning. Stay tuned.
Many of the members of our team were completely new to Unity and coding. At first we tried to use the Mapbox API, but found it very complicated and scrapped the terrain view. However, team members did learn the basics and how to implement simple yet awesome animations through scripting.
We made it!
Challenges we ran into
We discussed who our end user was - first responders, or families wanting to capture the condition of their homes.
We debated how to handle offline situations.
We debated questions of scale.
We tried to create elevation maps in Mapbox but had to ditch the program (which we loved!).
We tried to incorporate the use of additional tools.
We tried to set accurate locations in our code.
We tried to generate patterns.
Accomplishments that we're proud of
We were a group that began walking from table to table on Saturday assembling a team and looking for design help, and we got to meet two awesome designers in the process!
We're really proud that it even worked at all. We really weren't sure if we could do it in 48 hours. In fact, we weren't sure this morning we could do it. But this was super dope, and we had a great time meeting, learning, and teaching.
We're proud that we brought our idea to life in 48 hours.
We're super proud of the code that was created.
We're proud that our team learned Unity.
We're proud that we built something for AR Good!
What we learned
We learned how to use multiple programs.
We learned the potential of the code we created.
We learned a lot from other teams.
We learned that everyone in the team is important.
What's next for Homeward
Next up, we're fine-tuning our algorithm and doing more research into the needs of hurricane victims and search-and-rescue first responders.