Inspiration
Many people across the world are affected by floods; one of our own team members experienced a major house flood. Being able to better predict a flood's effects will help architects design safer buildings and governments plan better evacuation efforts. Our approach also works without expensive 3D scanning equipment, which helps democratize the technology.
What it does
Our project uses machine learning and physics simulations to demonstrate what would happen if an area were flooded. It feeds a 2D video, along with IMU data, into the NeuralRecon machine learning model, which produces a 3D mesh as a .ply file. The model was trained on the ScanNet dataset.
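The mesh NeuralRecon produces is an ordinary .ply file. As a rough illustration of the format (not the project's actual code, and real scan meshes are far larger and usually binary), a tiny ASCII .ply can be written and sanity-checked with nothing but the Python standard library:

```python
# Minimal sketch of the .ply format: write a one-triangle ASCII mesh,
# then read the declared vertex count back out of the header.

def write_ply(path, vertices, faces):
    """Write (x, y, z) vertices and integer-index faces as ASCII PLY."""
    with open(path, "w") as f:
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(vertices)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write(f"element face {len(faces)}\n")
        f.write("property list uchar int vertex_indices\n")
        f.write("end_header\n")
        for x, y, z in vertices:
            f.write(f"{x} {y} {z}\n")
        for face in faces:
            f.write(f"{len(face)} {' '.join(map(str, face))}\n")

def count_vertices(path):
    """Parse the PLY header for the declared vertex count."""
    with open(path) as f:
        for line in f:
            if line.startswith("element vertex"):
                return int(line.split()[-1])
    return 0

verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
write_ply("tri.ply", verts, [[0, 1, 2]])
print(count_vertices("tri.ply"))  # → 3
```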
How we built it
We used NeuralRecon to generate a .ply model of the area. We also created a script to normalize the model and import it into Blender, where the fluid dynamics are simulated.
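One plausible meaning of "normalize" here (our assumption for illustration) is rescaling each scan so its bounding box fits a unit cube centered at the origin, so every mesh lands at a predictable size for the fluid domain. The math alone can be sketched in plain Python; in practice the same transform would be applied through Blender's bpy API after importing the .ply:

```python
# Sketch of mesh normalization: uniformly scale and translate vertices
# so the bounding box fits a unit cube centered at the origin.
# This illustrates the math only; the real script runs inside Blender.

def normalize(vertices):
    """Return vertices rescaled into a unit cube centered at the origin."""
    mins = [min(v[i] for v in vertices) for i in range(3)]
    maxs = [max(v[i] for v in vertices) for i in range(3)]
    # Uniform scale: the largest bounding-box extent becomes length 1.
    scale = 1.0 / max(maxs[i] - mins[i] for i in range(3))
    center = [(maxs[i] + mins[i]) / 2 for i in range(3)]
    return [tuple((v[i] - center[i]) * scale for i in range(3))
            for v in vertices]

verts = [(0, 0, 0), (4, 2, 2)]
print(normalize(verts))  # → [(-0.5, -0.25, -0.25), (0.5, 0.25, 0.25)]
```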
Challenges we ran into
The components we used had a lot of dependency problems. In fact, we could not get the code to run on the newest hardware at all, so we use an older machine to produce the mesh, since that is what NeuralRecon supports.
Accomplishments that we're proud of
We were able to build a high degree of automation into the simulation and post-processing of the 3D data. Only a couple of user commands are needed to run the whole pipeline.
What we learned
We learned how to modify more advanced application configuration settings in Xcode, and how to write scripts that automate Blender's fluid simulation functions.
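Blender can run such automation scripts headlessly via its `--background` and `--python` command-line flags. A small wrapper that builds and launches that kind of invocation might look like the following sketch (the .blend and script file names are hypothetical, not our actual files):

```python
# Sketch: drive a Blender fluid-sim script headlessly from Python.
import shutil
import subprocess

def build_blender_cmd(blend_file, script, blender="blender"):
    """Build a headless Blender invocation: open blend_file, run script."""
    return [blender, "--background", blend_file, "--python", script]

cmd = build_blender_cmd("flood_scene.blend", "run_fluid_sim.py")
print(cmd)

# Only launch if Blender is actually installed on this machine.
if shutil.which("blender"):
    subprocess.run(cmd, check=True)
```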
What's next for TsunamAI
The model could be adapted to use Android AR frameworks for better cross-platform support. The scan resolution could also be increased to make the results more accurate. Finally, the same pipeline could be used to simulate smoke.