Presentation link: https://docs.google.com/presentation/d/1VJEtBQl8IU3Gi1CfSNWlFlnEn5C25C-JibAyXKrqo8U/edit?usp=sharing

Inspiration

We were inspired by the large-format, socially conscious, tech-based art installations at the Light City Baltimore festival to create our own project in that vein: something that could theoretically be set up in a demonstration space and be open for everyone to try out. Our goal was to present data that illustrates the inequities and issues in our city in a visually interesting, effective way that people without much data visualization experience could understand and interact with. Although it's sad not to be in person at Technica this year, working from home allowed us to use full room-scale virtual reality and controller input.

What it does

Users can switch between two data visualizations (average tree cover by neighborhood and median household income by neighborhood), interact with a menu using the VR controllers, and shift between multiple viewpoints to see the visualization from different angles.
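As a rough illustration, the viewpoint shifting could be implemented by cycling the player rig through preset transforms on a controller button press. This is a minimal sketch, not our exact scene setup; the rig and viewpoint objects are illustrative:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: cycle the XR rig through preset viewpoints around the map
// whenever the controller's primary button is pressed.
public class ViewpointCycler : MonoBehaviour
{
    public Transform xrRig;          // the player rig to move (assumed object)
    public Transform[] viewpoints;   // preset positions around the map

    int current;
    bool wasPressed;

    void Update()
    {
        var hand = InputDevices.GetDeviceAtXRNode(XRNode.RightHand);
        if (hand.TryGetFeatureValue(CommonUsages.primaryButton, out bool pressed))
        {
            // Advance only on the press edge, not while the button is held.
            if (pressed && !wasPressed && viewpoints.Length > 0)
            {
                current = (current + 1) % viewpoints.Length;
                xrRig.SetPositionAndRotation(viewpoints[current].position,
                                             viewpoints[current].rotation);
            }
            wasPressed = pressed;
        }
    }
}
```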

How we built it

We built the experience in the Unity game engine, targeting the HTC Vive headset and its controllers.

We used two Unity packages: the Unity Mapbox SDK, which let us import the custom maps we made with Mapbox, and the Unity XR Interaction Toolkit, which let us build for virtual reality and handle head-mounted display tracking and controller input.
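For a sense of the kind of controller-driven menu the toolkit enables, here is a minimal sketch, assuming the XRSimpleInteractable component from the XR Interaction Toolkit; the dataset name and the switching logic are hypothetical stand-ins, not our actual code:

```csharp
using UnityEngine;
using UnityEngine.XR.Interaction.Toolkit;

// Sketch: a menu option selectable with the VR controller's ray.
// XRSimpleInteractable raises selectEntered when the user pulls the
// trigger while pointing at this object.
[RequireComponent(typeof(XRSimpleInteractable))]
public class DatasetMenuOption : MonoBehaviour
{
    public string datasetName = "tree_cover";   // hypothetical field name

    void Awake()
    {
        var interactable = GetComponent<XRSimpleInteractable>();
        interactable.selectEntered.AddListener(OnSelected);
    }

    void OnSelected(SelectEnterEventArgs args)
    {
        // Stand-in for whatever method actually swaps the active
        // visualization in the scene.
        Debug.Log($"Switching visualization to {datasetName}");
    }
}
```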

The Mapbox SDK uses access tokens to connect our Unity project directly to the datasets and tilesets we constructed with the Mapbox web tools. We manually built a custom tileset in which each Baltimore neighborhood is represented by a polygon, then associated custom data fields with each neighborhood. For Technica we added average tree cover and median household income, and because the tileset is our own, we can keep adding datasets as long as each neighborhood has a data entry.
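The SDK hands each feature's fields to Unity as a string-keyed property dictionary, so turning a neighborhood's data value into geometry looks roughly like the sketch below. The helper and the field name are illustrative assumptions, not the project's actual code:

```csharp
using System.Collections.Generic;

// Sketch: convert a neighborhood's data field into an extrusion height.
// Feature properties arrive as Dictionary<string, object>, so the value
// has to be converted before it can drive geometry.
public static class NeighborhoodData
{
    public static float GetHeight(Dictionary<string, object> properties,
                                  string field, float scale = 0.01f)
    {
        if (properties.TryGetValue(field, out object value))
        {
            // Values may come through as long, double, or string
            // depending on how the dataset was authored.
            return System.Convert.ToSingle(value) * scale;
        }
        return 0f;   // neighborhoods with no data entry get no height
    }
}

// e.g. float h = NeighborhoodData.GetHeight(properties, "median_income");
```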

Challenges we ran into

Early on, we ran into a lot of issues with opening tilesets in Unity and getting them to render properly. We were also unable to find free data for neighborhood boundaries, so we had to draw each region out by hand.

Another challenge was that Mapbox, the mapping tool we used to make our datasets, always splits neighborhoods across a tiled grid. A single neighborhood that geographically spans three in-engine tiles, for instance, is split into three different meshes (the geometry the engine actually renders). And because those meshes are only generated at runtime, when you press Play, it was really difficult to group the disparate chunks back into whole neighborhood regions. Bella tried to associate them by matching the height of each mesh's bounds, since that height is derived from the data value (tree cover or median income here), but ran out of time; a sketch of that approach follows below. We also tried to manually add text for each neighborhood's name, but there wasn't enough time to do that and get the text to rotate to face the user as they moved between viewpoints.
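For reference, the bounds-matching idea and the label billboarding look roughly like this. This is a sketch of the approaches we ran out of time for, not working code from the project:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: group neighborhood mesh chunks that were split across tiles.
// Because each neighborhood's extrusion height encodes its data value,
// chunks with (nearly) the same bounds height likely belong together.
public static class ChunkGrouper
{
    public static Dictionary<int, List<MeshRenderer>> GroupByHeight(
        IEnumerable<MeshRenderer> chunks, float tolerance = 0.01f)
    {
        var groups = new Dictionary<int, List<MeshRenderer>>();
        foreach (var chunk in chunks)
        {
            // Quantize the height so chunks within the tolerance
            // fall into the same bucket.
            int key = Mathf.RoundToInt(chunk.bounds.size.y / tolerance);
            if (!groups.TryGetValue(key, out var list))
                groups[key] = list = new List<MeshRenderer>();
            list.Add(chunk);
        }
        return groups;
    }
}

// Sketch: keep a neighborhood label readable as the user moves.
// Rotating away from the camera (rather than LookAt toward it) keeps
// text meshes from appearing mirrored.
public class Billboard : MonoBehaviour
{
    void LateUpdate()
    {
        var cam = Camera.main;
        if (cam != null)
            transform.rotation = Quaternion.LookRotation(
                transform.position - cam.transform.position);
    }
}
```

The obvious weakness of the grouping heuristic, and part of why this is hard, is that two different neighborhoods with the same data value would collide into one group.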

Accomplishments that we're proud of

We're both really proud of presenting data about our hometown in an interesting and informative way.

Bella: Getting the VR interfaces to be interactive and getting the meshes to render properly

Talia: Using the Mapbox Studio dataset editor to manually map out over 50 neighborhoods

What we learned

Talia: I learned about various websites with huge amounts of data about Baltimore and Maryland that I hope to be able to use in the future. Before this, I had never worked with Mapbox or with datasets or maps at all, so I definitely learned a lot in that area.

Bella: I learned a lot about importing maps into Unity and about working with mesh rendering and materials (textures and colors). I also learned how to make an interactive VR user interface, which I hadn't done before.

What's next for Baltimore Virtual Reality Data Visualization Experience

We're hoping that users will eventually be able to toggle between different display colors, move freely around the VR data landscape, and point and click on each neighborhood to view its statistics. Making each neighborhood interactive would require solving the programming challenge mentioned earlier, where a single neighborhood's meshes span multiple tiles in the tileset.
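A rough sketch of what the point-and-click step could look like with a plain raycast from the controller, assuming each re-grouped neighborhood eventually gets its own collider; the component names here are hypothetical:

```csharp
using UnityEngine;

// Sketch: point at a neighborhood with the controller and read back
// which one was hit. Assumes each neighborhood mesh has a collider
// and a NeighborhoodInfo component holding its stats.
public class NeighborhoodPicker : MonoBehaviour
{
    public Transform controller;   // the VR controller's transform
    public float maxDistance = 100f;

    public void TryPick()
    {
        var ray = new Ray(controller.position, controller.forward);
        if (Physics.Raycast(ray, out RaycastHit hit, maxDistance))
        {
            var info = hit.collider.GetComponent<NeighborhoodInfo>();
            if (info != null)
                Debug.Log($"{info.name}: tree cover {info.treeCover}%");
        }
    }
}

// Hypothetical stats holder attached to each neighborhood mesh.
public class NeighborhoodInfo : MonoBehaviour
{
    public float treeCover;
    public float medianIncome;
}
```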

The neighborhood meshes would look better with a color gradient based on the data value, such as dark green for a dense tree canopy and lighter green for sparser coverage; this would also make regions easier to distinguish from one another. Additionally, there are some graphical bugs to fix: currently, if a neighborhood has no data point for the active view, its mesh flickers and looks broken.
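The gradient itself is nearly a one-liner in Unity once each neighborhood's value is normalized. A minimal sketch, with example color endpoints we haven't actually chosen:

```csharp
using UnityEngine;

public static class NeighborhoodColor
{
    // Sketch: map a data value onto a green gradient, light green for
    // sparse coverage through dark green for a dense canopy.
    static readonly Color sparse = new Color(0.7f, 0.9f, 0.7f);
    static readonly Color dense  = new Color(0.0f, 0.4f, 0.1f);

    public static Color ForValue(float value, float min, float max)
    {
        // InverseLerp normalizes the raw value to 0..1;
        // Lerp then picks the color along the gradient.
        float t = Mathf.InverseLerp(min, max, value);
        return Color.Lerp(sparse, dense, t);
    }
}
```

The result would be applied to each neighborhood chunk's material color, e.g. `meshRenderer.material.color = NeighborhoodColor.ForValue(value, min, max);`.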

We also look forward to adding more datasets in addition to tree canopy coverage and median household income.

Built With

Unity, the Unity Mapbox SDK, the Unity XR Interaction Toolkit, and the HTC Vive.