Our background is in neuroscience research. As part of our work, we generate three-dimensional reconstructions of patients' brain geometry, computed from spatial images. One essential—but extremely tedious—step in this pipeline involves manually removing the patient's cerebellum from the volumetric data one slice at a time, a process that takes upwards of 6 hours to complete.
After playing around with Tilt Brush—a VR "volumetric painting" app—on a collaborator's HTC Vive, we realized that, if we could intuitively "erase" MRI geometry in VR, we could cut that time drastically enough to justify buying a Vive for our own (totally science-related) purposes.
But, more than that, we realized we were excited just to see what spatial images actually look like. To break away from coronal, sagittal, and axial. To step inside a brain the size of a Volkswagen.
What it does
radroom turns medical imaging data into room-sized virtual holograms you can interact with.
How we built it
We use Python libraries (NumPy, PIL, NiBabel) to convert imaging data stored in standard formats (DICOM, NIfTI) into texture data suitable for Unreal Engine 4. This texture data is then fed into a material shader for a system of semi-transparent particles scattered throughout the workspace; the custom material "paints" each particle with the appropriate color and opacity by interpolating the underlying spatial image, producing a volumetric hologram effect.
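A minimal sketch of the volume-to-texture step (function and file names here are hypothetical, not the project's actual code; in the real pipeline the volume would come from NiBabel, e.g. `nib.load(path).get_fdata()`, rather than the synthetic array used below):

```python
import numpy as np
from PIL import Image

def volume_to_atlas(volume, cols=8):
    """Normalize a 3D volume to 8-bit and tile its slices into a 2D
    texture atlas that an engine-side material can sample per-slice."""
    vol = volume.astype(np.float32)
    vol -= vol.min()
    if vol.max() > 0:
        vol /= vol.max()
    vol8 = (vol * 255).astype(np.uint8)

    depth, height, width = vol8.shape
    rows = -(-depth // cols)  # ceil division: rows of slices in the atlas
    atlas = np.zeros((rows * height, cols * width), dtype=np.uint8)
    for i in range(depth):
        r, c = divmod(i, cols)
        atlas[r * height:(r + 1) * height,
              c * width:(c + 1) * width] = vol8[i]
    return Image.fromarray(atlas, mode="L")

# Example with a synthetic 16-slice volume (4x4 grid of 64x64 slices):
atlas = volume_to_atlas(np.random.rand(16, 64, 64), cols=4)
atlas.save("brain_atlas.png")
```

Packing slices into a single atlas texture is one common way to get volumetric data into a shader; the material then reconstructs the sample position between adjacent slices to interpolate color and opacity.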
Challenges we ran into
- Getting Unreal Engine code to link properly with external libraries can be nontrivial.
- Constructing UE4 Blueprint scripts requires skills orthogonal to programming.
Accomplishments that we're proud of
It actually runs.
What we learned
GTX 1080 or gtfo.
What's next for radroom
- Full DICOM client/server integration (to work seamlessly with existing radiology / research datasets)
- Suite of tools for user interaction with the data (surgical planning, "MRI paint", clip volumes, ...)
- Higher order data types (connectivity, time series, multiple co-registered modalities, ...)