Brain computer interface, physical computing, connection and relationships, different energies coming together, blockchain, installation design.

What it does

Affects digital clothing and audio-visual content based on user proximity and on conversation emotion derived from speech-audio input. Experiential effects include color changes, sharing (transferring clothing objects between users), particle system generation, and audio.
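As an illustration of that core mapping, here is a minimal Python sketch; the function name, the 5 m proximity range, and the color mapping are all hypothetical placeholders, not the project's actual Unity tuning:

```python
from dataclasses import dataclass


@dataclass
class EffectState:
    color: tuple          # RGB components in the 0-1 range
    particle_rate: float  # particles emitted per second


def map_inputs(distance_m: float, valence: float) -> EffectState:
    """Map user proximity and conversation emotion to garment effects.

    distance_m: distance between the two users, in meters.
    valence: emotion score in [-1, 1] from speech analysis.
    """
    # Closer users produce a stronger effect; clamp to [0, 1] over a 5 m range
    proximity = max(0.0, min(1.0, 1.0 - distance_m / 5.0))
    # Negative valence shifts the garment toward blue, positive toward warm red
    warm = (valence + 1.0) / 2.0
    color = (warm, 0.2, 1.0 - warm)
    return EffectState(color=color, particle_rate=50.0 * proximity)
```

In the actual installation this logic would live in a Unity component, with `color` driving a shader parameter and `particle_rate` driving a particle system's emission rate.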

How we built it

We started by grounding our creative vision in rough UX deliverables: a feature set, an experience map, and a user flow. Next, we moved into development by setting up our Git repo and Unity environment (Lumin SDK, MRTK) and creating digital assets (e.g., 3D models from Blender, images, and sounds).

Challenges we ran into

Multi-user setup
Capturing emotion or sentiment analysis from an EEG device
Exporting particle system animations from Blender
Designing around Magic Leap constraints (FOV, battery life)
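On the emotion-capture challenge: one common rough approach with EEG is to compare spectral power across frequency bands. The sketch below is a hypothetical NumPy illustration of that general technique (band edges and the beta/(alpha+theta) ratio are textbook conventions), not the pipeline we actually built:

```python
import numpy as np


def band_power(signal: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Mean spectral power of `signal` within the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return float(psd[mask].mean())


def engagement_index(eeg: np.ndarray, fs: float = 256.0) -> float:
    """Beta / (alpha + theta) power ratio, a common rough engagement proxy."""
    theta = band_power(eeg, fs, 4, 8)
    alpha = band_power(eeg, fs, 8, 13)
    beta = band_power(eeg, fs, 13, 30)
    # Small epsilon guards against division by zero on silent channels
    return beta / (alpha + theta + 1e-12)
```

A real headset would add artifact rejection and per-user calibration; this only shows the band-power idea.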

Accomplishments that we're proud of

Establishing a strong creative vision and a path forward. We prototyped basic functionality and proved our concept in a short amount of time, while gaining new skills.

What we learned

Unity development, Blender modeling and rigging, GitHub workflow, SDK integration.

What's next for Cambridge Fashion House

We'd like to clean up our proof of concept and pitch our idea to brands.
