Data on cities, housing, and transportation is freely available to anyone, but seeing numbers and attributes on paper doesn't quite convey what someone would actually experience. Inspired by Minecraft, we wanted to create a more personalized city simulation experience using LEGO blocks and the Microsoft HoloLens.

What it does

Our project lets users experience a real-time simulation of housing and transportation through a HoloLens. Different colors and sizes of LEGO blocks represent zoning types such as commercial, residential, government, and nature. The flow and density of traffic in an area are visualized through the color and scale of drawn magnetic fields. Other properties, such as capacity and walkability between vehicles and buildings, can be presented and understood at a glance. This has direct applications for governments doing city planning, for communities envisioning future projects, and for individuals looking for the perfect neighborhood to call home.

How we built it

Our project has three main parts:

- Models: physical objects (LEGO blocks and magnets) plus digital information, such as residential populations and job opportunities.
- Views: the mixed-reality view from the HoloLens.
- Controllers: gaze, gesture, and voice as inputs to the HoloLens; press, push, and drag as inputs to the magnetic system.

Users interact with the physical models by reconfiguring the LEGO blocks and manipulating the magnetic blocks. These actions trigger corresponding holograms and overlay digital information in the HoloLens.
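The model-view-controller split above can be sketched roughly as follows. This is a minimal illustrative sketch in Python, not our actual Unity code; all class and method names here are assumptions for the example.

```python
from dataclasses import dataclass, field


@dataclass
class Block:
    """Model: one physical LEGO block plus its digital attributes."""
    zone: str              # e.g. "residential", "commercial"
    population: int = 0


@dataclass
class CityModel:
    """Model: the grid of blocks on the table."""
    blocks: dict = field(default_factory=dict)   # (row, col) -> Block


class HologramView:
    """View: renders the overlay text shown in the HoloLens."""
    def render(self, cell, block):
        return f"{cell}: {block.zone}, pop {block.population}"


class GestureController:
    """Controller: maps an input (here, a tap on a grid cell) to a view update."""
    def __init__(self, model, view):
        self.model, self.view = model, view

    def air_tap(self, cell):
        block = self.model.blocks.get(cell)
        return self.view.render(cell, block) if block else None


city = CityModel(blocks={(0, 0): Block("residential", population=120)})
controller = GestureController(city, HologramView())
print(controller.air_tap((0, 0)))   # -> "(0, 0): residential, pop 120"
```

Reconfiguring a LEGO block corresponds to mutating the model; the controller then re-renders the affected overlay in the view.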

Challenges we ran into

To share a hologram across multiple HoloLens devices, we found it hard to define global coordinates and to let players see each other's interactions in real time. We initially spent the entire weekend researching networking (web sockets) between two HoloLens devices and our physical models (LEGO blocks and magnets), but due to time constraints we decided instead to focus on enhancing an individual user's interactions (gaze, gesture, and voice) with a single HoloLens and the models.

Accomplishments that we're proud of

Poseidon and Riaz developed the Unity software for the HoloLens. Han-Chih worked on the magnets, and Ran was responsible for the LEGO block integration. Bolin created and maintained the design guidelines and the user experience between bits and atoms.

What we learned

To trigger digital information on the physical models with virtual inputs from the HoloLens, Poseidon and Riaz came up with a grid system overlaying MetaCity, which allows users to combine Gaze and Air Tap to display information and to use Voice to search for a specific building. As a whole, we learned a lot about the limitations and applications of turning LEGO and magnet blocks into tangible bits, letting governments and citizens run city simulations through an immersive, personal experience.
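The grid system idea can be illustrated with a small sketch: snap a gaze hit point on the table to a grid cell so an Air Tap can look up the building there, and resolve a Voice query to a cell by building name. The cell size, building names, and function names below are all illustrative assumptions, not values from our implementation.

```python
CELL_SIZE = 0.032  # metres per grid cell (assumed, roughly a 4-stud LEGO span)

# Hypothetical digital layer keyed by grid cell.
buildings = {
    (0, 0): {"name": "City Hall", "zone": "government"},
    (3, 2): {"name": "Oak Apartments", "zone": "residential"},
}


def gaze_to_cell(x, z, cell_size=CELL_SIZE):
    """Snap a gaze hit point on the table plane to integer grid coordinates."""
    return (int(x // cell_size), int(z // cell_size))


def air_tap(x, z):
    """Gaze + Air Tap: return the info record for the cell being looked at."""
    return buildings.get(gaze_to_cell(x, z))


def voice_search(query):
    """Voice: find the grid cell of a building by (case-insensitive) name."""
    q = query.lower()
    return next((cell for cell, b in buildings.items()
                 if q in b["name"].lower()), None)


print(air_tap(0.01, 0.02))   # gaze falls in cell (0, 0) -> City Hall record
print(voice_search("oak"))   # -> (3, 2)
```

In the real system the gaze point would come from a HoloLens raycast against the table plane, but the cell lookup works the same way.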

What's next for MetaCity

First, we're open-sourcing all of our code, since some of it could serve as APIs for developers to dive into applications beyond city simulation. Second, we are considering turning these tools into LEGO and Magnet Developer Kits so kids can create their own cities. Ultimately, this project could be used by governments, citizens, and research labs for collaborative simulation of cities, especially housing and transportation.
