MimeSound

Inspiration

After seeing people use depth-sensing technologies like the Kinect in innovative ways, we felt we could pair one with the Oculus Rift to create a truly immersive experience. Like many people, we wanted to combine new technology with our passions. Our passion: music. We wanted to give people the fun, creative experience of playing a musical instrument without requiring a physical one.

What it does

MimeSound pairs virtual reality with motion-sensing technology to give the user a truly immersive experience. It comes with an array of instruments that are easy to pick up and fun to play: a string synth, bells, a laser harp, a drum kit, and a dubstep thingy. Controls are simple and intuitive: touch things with your hands.

How we built it

For hardware, we paired a Microsoft Kinect for Xbox 360 with the Oculus Rift Development Kit 2, and for real-time rendering we used Unity 5. Using Keijiro Takahashi's MIDI bridge asset for Unity, we connected Unity through nerds.de's LoopBe1 internal MIDI port to Reaper, a digital audio workstation. When a user triggers a sound, a MIDI note is sent to Reaper, which renders it with a variety of VSTs. We did all of our Unity scripting in C# with Visual Studio 2015, and all of our modeling in Autodesk Maya 2014 and Blender. To link the Kinect with Unity 5, we used RF Solutions' Kinect with MS-SDK Unity asset.
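
To make the flow concrete, here is a rough sketch of what one instrument trigger could look like in Unity C#. This is illustrative, not our actual code: the MidiOut stub below stands in for the MIDI bridge asset's note-sending call, whose exact API we don't reproduce here.

```csharp
using UnityEngine;

// Stand-in for the MIDI bridge asset's output call; this stub only logs.
// In the real pipeline, the bridge pushes these messages into the
// LoopBe1 virtual MIDI port, where Reaper picks them up.
public static class MidiOut
{
    public static void SendNoteOn(int channel, int note, int velocity)
    {
        Debug.Log("NoteOn  ch=" + channel + " note=" + note + " vel=" + velocity);
    }

    public static void SendNoteOff(int channel, int note)
    {
        Debug.Log("NoteOff ch=" + channel + " note=" + note);
    }
}

// Attach to an instrument piece (e.g. one bell) that has a trigger
// collider. When the player's Kinect-tracked hand enters the trigger,
// a note-on goes out; when it leaves, the note is released.
public class NoteTrigger : MonoBehaviour
{
    public int channel = 0;     // MIDI channel this instrument sends on
    public int note = 60;       // pitch to play; 60 = middle C
    public int velocity = 100;  // how hard the "key" is struck

    void OnTriggerEnter(Collider other)
    {
        if (other.CompareTag("Hand"))
            MidiOut.SendNoteOn(channel, note, velocity);
    }

    void OnTriggerExit(Collider other)
    {
        if (other.CompareTag("Hand"))
            MidiOut.SendNoteOff(channel, note);
    }
}
```

In Reaper, each MIDI channel can then be routed to its own track and VST instrument, so the string synth, bells, and drums each get their own sound.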

Challenges we ran into

We had to manage many pieces of software to implement our project fully. We originally planned to build a theremin, but we ran into issues with the virtual MIDI port and decided, in the interest of time, to scrap it and implement the bells in its place. Another challenge we faced during development was motion sickness: many tweaks and settings had to be just right so that players would not get sick from feeling like they were moving without actually moving.

Accomplishments that we're proud of

We are proud of the way that everyone in the group contributed their own unique talents to the project. Paul Biermann used his musical knowledge to choose the scales and build the sounds in the game using Reaper and VSTs. Robert Young contributed his practiced knowledge of C#, coding most of the program. Nick Benge led the group and navigated the Unity interface, along with some coding and modeling. Nick Pecka modeled all of the objects in the program. We are also proud of creating a polished product within the time constraints: we had a functional prototype of one instrument by the end of the first night.

What we learned

For most of us, this was our first hackathon, so we learned what hackathons are like. Among other things, we learned how to connect two applications over MIDI, how to use Reaper and VSTs to customize sounds, and that Unity's mesh collider support is more limited than we expected (Unity 5 dropped non-convex mesh colliders on moving rigidbodies).
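
As a hedged illustration of that collider point (an assumed workaround, not our project's code): under Unity 5's PhysX 3 physics, a moving prop either needs its MeshCollider marked convex or a primitive collider stand-in.

```csharp
using UnityEngine;

// Illustrative workaround: mark a MeshCollider convex so it can sit on
// a moving (non-kinematic) rigidbody under Unity 5's PhysX 3 physics.
// Convex hulls are capped at 255 triangles, so a complex prop may need
// a primitive collider stand-in instead.
public class ConvexifyCollider : MonoBehaviour
{
    void Awake()
    {
        var meshCollider = GetComponent<MeshCollider>();
        if (meshCollider != null)
            meshCollider.convex = true;
    }
}
```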

Sleep is good. Sleep is love; sleep is life.

Sleep is for the weak.

What's next for MimeSound

Ideally, we would not need Reaper running in the background for our program to work. With more time, we would find a more self-contained way to host the VSTs, so the program could ship as its own downloadable package, usable out of the box. We would spend more time honing the graphics: adding textures, lighting, backgrounds, particle effects, and an avatar. We would add more instruments, sounds, and options to the game. Eventually, we may have several people playing at once. The possibilities are endless.

Tagline: A virtual orchestra at your fingertips
