Inspiration: Initially, we planned to use the Spotify API together with Meyda (a JavaScript audio feature-extraction library) to turn audio and statistics about it into images (audio visualisation). We wanted to create an interactive virtual environment using Unity. We also wanted to include Google's AI (Deep Dream) because of the creative aspect of AnvilHack II. We didn't manage to use the Spotify API since those tools are JavaScript-based while we worked in Unity (C#), but we are very happy with our result!

What it does: DreamDeep is an interactive virtual walk-through of our own mind. The environment is quite abstract, and floating objects in the scene represent human memories, thoughts and emotions. Sometimes we remember moments of our lives that we thought were pointless; they flash in our memory. We used Google's Deep Dream to represent these. The user can interact with the objects and retrieve memories from them.
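In Unity, this kind of object interaction is often done with a raycast from the camera. The following is only a minimal sketch of the idea, not our actual project code; the `Memory` component and its `Reveal()` method are hypothetical names for whatever holds and displays a Deep Dream image:

```csharp
using UnityEngine;

// Hypothetical sketch: on left click, raycast from the camera and ask
// the hit object to reveal the "memory" it represents.
public class MemoryInteractor : MonoBehaviour
{
    public Camera playerCamera;   // assumed: assigned in the Inspector
    public float reach = 5f;      // how close the player must be to interact

    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = playerCamera.ScreenPointToRay(Input.mousePosition);
            if (Physics.Raycast(ray, out RaycastHit hit, reach))
            {
                // "Memory" is a hypothetical component on each floating object.
                var memory = hit.collider.GetComponent<Memory>();
                if (memory != null)
                    memory.Reveal(); // e.g. show the Deep Dream image
            }
        }
    }
}
```

A script like this would sit on the first-person player object, with colliders on each memory object in the scene.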

How we built it: We used Unity3D and C#. We started by planning what needed to be done and managed to split the work equally. All of us did our best, put all the effort we could into this, and had a lot of fun doing it. Our team is made up of two Unity3D programmers and one 3D modeller.

Challenges we ran into: We found this project to be one of the most difficult hackathon ideas (to implement in 24 hours) that we've ever had. We struggled with integrating the Spotify API (we really wanted to include it since we thought it would be an awesome idea!) and spent a lot of time trying to work that out. We also struggled to make the walk-through as efficient as possible, since the graphics and the particles we added required a lot of computational power.

Accomplishments that we're proud of: We are very proud of all the effort we put into this, and of finishing the project in less than 24 hours. We are also very happy to see people (friends, as well as people we met here) being proud of us.

What we learned: While trying to integrate the Spotify API, we learnt how to deal with the differences between programming languages that look similar but differ in many details. We also learned how to transfer files from 3D modelling software (like Blender) into Unity3D. We managed to split the available time equally, without forgetting to have fun at the same time.

What's next for DreamDeep: We would like to make it support Oculus VR! The kind of environment we created (first-person based) would look amazing in VR, and we are really excited to see how it would look! We implemented a plug-in for VR support, but none of our laptops can run the headset, so we will be waiting for the first opportunity to try it in virtual reality!
