Both of us study math, and we went into HTN intending to build a hack involving computer graphics and numerical solutions to PDEs. As hacking began, we became interested in working with VR and in making a non-trivial application of Firebase. The networked nature of the hack was cemented when neither of our laptops could support an Oculus headset: we worked with Google Cardboard instead, which limits how much headset wearers can interact. This led naturally to a key technical problem we solved: having multiple sessions present a synchronized simulation to many users of the application.

What it does

The demo numerically solves the wave equation on a square cell in real time, using a discrete Euler-Lagrange method. The wave can be accessed online by anybody with a phone or laptop, and is synchronized between all users via Firebase. Users can tap or click on any area of the water to generate new wavelets, which other users see with minimal latency.

Additionally, users on mobile can experience the wave in virtual reality with Google Cardboard! With a $25 headset, they can watch waves generated by their fellow hackers pop out at them.

How we built it

The wave is governed by the partial differential equation u_tt = c^2 (u_xx + u_yy). We also include a damping term proportional to u_t, which makes the simulation more realistic.

The graphics are built on top of three.js, a JavaScript library for WebGL. The wave is made of three.js primitives: a rectangular prism and a deformable plane. Given an initial surface, we generate each frame of the animation by approximating the surface by its tangent plane (the Euler-Lagrange step). When a user taps on the simulation, the 2D screen coordinates are converted into a 3D position in the scene, and a new wave (a Gaussian bump) forms on top of the surface.
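A minimal sketch of the kind of per-frame update this amounts to; the grid size, constants, and function names here are illustrative, not our exact code:

```javascript
// One explicit time step of the damped wave equation
//   u_tt = c^2 (u_xx + u_yy) - k u_t
// on an N x N grid. u holds surface heights, v holds vertical velocities.
const N = 64, c = 0.5, k = 0.02, dt = 1 / 60, dx = 1;

function step(u, v) {
  const a = (c * c) / (dx * dx);
  for (let i = 1; i < N - 1; i++) {
    for (let j = 1; j < N - 1; j++) {
      const idx = i * N + j;
      // Discrete Laplacian of the height field.
      const lap = u[idx - 1] + u[idx + 1] + u[idx - N] + u[idx + N] - 4 * u[idx];
      v[idx] += dt * (a * lap - k * v[idx]); // accelerate, then damp
    }
  }
  for (let idx = 0; idx < N * N; idx++) u[idx] += dt * v[idx]; // move the surface
}

// A tap adds a Gaussian bump centered at grid cell (ci, cj).
function addWavelet(u, ci, cj, amp = 1.0, sigma = 3.0) {
  for (let i = 0; i < N; i++)
    for (let j = 0; j < N; j++)
      u[i * N + j] += amp * Math.exp(-((i - ci) ** 2 + (j - cj) ** 2) / (2 * sigma * sigma));
}
```

Each frame, the updated heights are copied into the vertices of the deformable plane and three.js re-renders it.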

Numerical simulation is inherently unstable: small floating-point differences accumulate, so independent clients drift apart over time, and we needed to periodically sync all users on the network.

When a new user comes online, we periodically transmit the water's position and velocity data to all other users across the network via Firebase. The wave equation is linear, meaning contributions superpose: we can simply add the received position and velocity information into the running simulation.
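A minimal sketch of that sync step, assuming the Firebase Realtime Database JavaScript API; the path name, payload shape, and superposition-on-receipt below are illustrative stand-ins for our actual code:

```javascript
// u and v are the simulation's height and velocity arrays
// (declared globally by the simulation code above).
// Assumes firebase.initializeApp(config) has already run.
const clientId = Math.random().toString(36).slice(2);
const stateRef = firebase.database().ref('wave/state');

// Periodically publish this client's field data.
function broadcastState(u, v) {
  stateRef.set({ u: Array.from(u), v: Array.from(v), sender: clientId });
}

// On receipt, exploit linearity: superpose the remote field onto ours.
stateRef.on('value', snapshot => {
  const remote = snapshot.val();
  if (!remote || remote.sender === clientId) return; // skip our own writes
  for (let i = 0; i < remote.u.length; i++) {
    u[i] += remote.u[i];
    v[i] += remote.v[i];
  }
});
```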

For the VR, we used Google Cardboard, a cheap and accessible device. To create the 3D effect, we split the screen into two slightly offset images, one per eye, and we use the phone's gyroscope to track head orientation. Existing Cardboard demos available online helped us greatly here.
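The wiring looks roughly like the following, using the StereoEffect and DeviceOrientationControls helpers that ship with the three.js examples (a sketch based on those demos, not our exact code):

```javascript
// Assumes three.js plus the StereoEffect and DeviceOrientationControls
// example scripts are loaded on the page.
const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 1000);

const renderer = new THREE.WebGLRenderer();
document.body.appendChild(renderer.domElement);

const effect = new THREE.StereoEffect(renderer); // side-by-side eye images
effect.setSize(window.innerWidth, window.innerHeight);

const controls = new THREE.DeviceOrientationControls(camera); // gyroscope

function animate() {
  requestAnimationFrame(animate);
  controls.update();            // apply the latest device orientation
  effect.render(scene, camera); // render left and right views
}
animate();
```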

Lighting and other effects were handled by three.js. We based some of our code (in particular the numerical methods) on the wave simulation here.

Challenges we ran into

  • Syncing between users is hard, and gets harder as the number of users scales up.
  • Performance! Graphics is by nature very demanding. We kept our simulation to a small grid with relatively low resolution to keep it working across all devices and across the network.
  • API issues: Google Cardboard APIs have very sparse documentation.
  • Compatibility issues: Many of the bugs we ran into applied to certain versions of three.js / Cardboard API.

Accomplishments that we're proud of

  • Drops in latency from 120ms to 4.8ms for data sync (method: send the data as a single string over the network instead of as a data tree; see the sketch after this list).
  • PDE simulation from a relatively low level -- no game engines here!
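One plausible way to pack the fields as a flat string rather than a nested data tree; the exact format we used may have differed, but the idea is the same:

```javascript
// Packing thousands of floats into one string avoids Firebase treating
// every number as a separate node in its data tree.
function packField(u, v) {
  return Array.from(u, x => x.toFixed(4)).join(',') + ';' +
         Array.from(v, x => x.toFixed(4)).join(',');
}

function unpackField(s) {
  const [us, vs] = s.split(';');
  return {
    u: Float32Array.from(us.split(','), Number),
    v: Float32Array.from(vs.split(','), Number),
  };
}
```

A single string write is one node in Firebase, so it serializes and replicates far faster than a tree of thousands of numeric children.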

What we learned

  • In graphics, you can get great results with simple methods. The bottleneck is in implementation.
  • Check to make sure you've pushed when you don't see any changes in the simulation!
  • Commit early, commit (much more) often.
  • git reflog and git reset --hard are lifesavers.
  • The gyroscopes on mobile devices and tablets generate browser events every time the device moves (see the snippet after this list).
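For reference, the standard browser listener for those events (not our exact handler):

```javascript
// deviceorientation fires whenever the device's orientation changes.
window.addEventListener('deviceorientation', event => {
  // alpha: rotation about z (compass); beta and gamma: front/back and
  // left/right tilt, all in degrees.
  console.log(event.alpha, event.beta, event.gamma);
});
```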

What's next for Resonant Virtual Wave

Bigger, faster, and more powerful simulations (next time in parallel, on OpenGL/CUDA). The techniques could also be applied to deep learning, and possibly to visualizing those methods.
