Inspiration
We were inspired by Pink Floyd's "The Dark Side of the Moon" album cover, which shows light passing through a prism and dispersing into individual frequencies. It can be interpreted as the difficulty of understanding the complexity of the world when all factors blend together (the dark side), while each of them makes sense individually (the bright side).
Our idea was to make the phone the prism, and to represent the prism's "dark" and "bright" sides with the face and world lenses respectively, letting the user switch between them. Both light and sound are waves, so we immediately thought of representing the light frequencies as sounds. We based the light-audio associations on chromotherapy, so that each colour represents a mood not only visually but also sonically.
What it does
The user starts in the world lens. They can see colourful light rays converging on the point where they hold their phone (at the phone's screen). Looking around, they explore the therapeutic sound associated with each light colour individually. The user can then switch to the face lens (the other side of the prism, i.e. the dark side of the moon), where the individual light frequencies from the other side converge into white light, while the previously soothing and energising sounds blend into an unsettling, "incomprehensible" unity. The user's face is also rendered in black and white, with a dark styling applied to it.
How we built it
We used Spark AR Studio to build the effect. Development was done in JavaScript, with Webpack and Babel to facilitate multi-file coding (helpful for collaboration during the hackathon). The world lens uses world coordinates to position the light rays around the user, while the face lens uses background segmentation, a face tracker, and shaders to achieve the desired styling.
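A multi-file setup like the one described can be wired up with a small Webpack config that bundles the modules into the single script Spark AR Studio loads. The sketch below is an assumption about how such a config might look, not the project's actual configuration; the entry and output paths are hypothetical:

```javascript
// webpack.config.js — a minimal sketch (assumed, not the project's real config)
// of a Webpack + Babel setup for bundling multi-file Spark AR scripts
// into the single script file the effect imports.
const path = require('path');

module.exports = {
  mode: 'production',
  entry: './src/index.js',          // hypothetical main module pulling in the lens code
  output: {
    path: path.resolve(__dirname, 'scripts'),
    filename: 'script.js',          // single bundle loaded by Spark AR Studio
  },
  module: {
    rules: [
      {
        test: /\.js$/,
        exclude: /node_modules/,
        use: {
          loader: 'babel-loader',   // transpile modern JS for the Spark AR runtime
          options: { presets: ['@babel/preset-env'] },
        },
      },
    ],
  },
};
```

This keeps each lens's logic in its own source file while the effect itself only sees one bundled script.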
Challenges we ran into
Initially, the concept for the world lens relied on plane tracking: all the light rays would follow the phone (the prism), automatically rotating towards it and adjusting so they always reached it. We spent a large amount of time on this and got it working perfectly in Spark AR Studio, but when testing on an actual device the behaviour was broken. We chatted with the Spark AR team and couldn't find a solution, concluding that it might be a bug or a set of features that are disabled on the device.
We also faced issues with sound playback on the device, which, after a long debugging session, turned out to be a file-format problem (m4u); replacing the files with .wav versions solved the issue.
Accomplishments that we're proud of
We believe that concepts should be coherent and well thought through. We're very proud of the consistency we managed to achieve and maintain throughout the entire process, in terms of creative direction, story, and science, while staying conceptually very close to the original album cover. The face lens is the dark side of the moon, so it's black and white, with an unsettling sound that's a blend of the same sounds that play beautifully on their own in the world lens (after the light / sound disperses through the prism).
What we learned
Because of the challenges we faced, we have probably played with most of the tracking modes in Spark AR, learned how to control sounds, and learned how to merge world and face lenses into a single, coherent experience.
What's next for Prism
After we take a short break from coding, we would love to polish the effect and release it publicly. As our team comprises only two people, both of whom are developers, we would be thrilled to bring an artist onto the project to polish it further. Either way, we have loved the journey so far!
Built With
- javascript
- sparkar

