Inspiration
There’s something special about the music you’ve listened to and the place you’ve listened to it. We imagined how meaningful it could be to walk along the Lakefill and interact with the gallery that is your “Music Memories.”
What it does
Whenever a user begins playing a new song, a visual representation of it is created in augmented reality (AR): the album art appears anchored at the device's current location. If the user taps the album art, details about the memory are displayed at the bottom of the screen: the song title, the artist, the album, and the timestamp of the listening experience. As more and more songs are played, the collection grows.
How we built it
We built an iOS app in Swift that uses (and overrides) some of the ARCL package's functionality (ARKit + CoreLocation) to create the AR experience. The app uses the MediaPlayer framework to grab the track the user is listening to and stores it, and it uses SceneKit to create the geometric AR objects.
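The MediaPlayer-to-SceneKit flow can be sketched roughly like this (a minimal illustration, not our exact project code; the node sizing and tagging choices here are assumptions):

```swift
import MediaPlayer
import SceneKit

// Sketch: read the currently playing track via MediaPlayer and build a
// SceneKit plane textured with its album art. Returns nil if nothing is
// playing or the track has no artwork.
func makeAlbumArtNode() -> SCNNode? {
    let player = MPMusicPlayerController.systemMusicPlayer
    guard let item = player.nowPlayingItem,
          let artwork = item.artwork,
          let image = artwork.image(at: CGSize(width: 300, height: 300)) else {
        return nil
    }

    // A flat plane showing the album cover; dimensions are in meters
    // in AR world space (0.5 m square is an illustrative choice).
    let plane = SCNPlane(width: 0.5, height: 0.5)
    plane.firstMaterial?.diffuse.contents = image

    let node = SCNNode(geometry: plane)
    node.name = item.title // tag the node so a later hit test can identify it
    return node
}
```

In the app, a node like this is handed to ARCL with the device's current GPS coordinates so it stays pinned to the place where the song was played.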
Challenges we ran into
When using the ARCL package, we found that placing an object at a location (with GPS coordinates) caused it to jump around the screen and behave unstably; we had to override the responsible function (it looks like the package has a bug there). In addition, when using the hit-test API (to detect whether the user touched the album art), we initially obtained a reference that essentially pointed to nothing. After a lot of unwrapping (the API's optionals are awkwardly designed), we finally reached the object we needed.
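The unwrapping pattern for the hit test looks roughly like this (an illustrative sketch, not our exact code; the function name and the convention of tagging nodes with song titles are assumptions):

```swift
import SceneKit

// Sketch: hit-test a SceneKit view (ARSCNView is an SCNView subclass) at a
// tap location and safely walk the optionals down to the tapped node's name,
// instead of force-unwrapping intermediate values.
func tappedSongTitle(in sceneView: SCNView, at point: CGPoint) -> String? {
    // hitTest returns results nearest-first; take the closest node under the tap.
    guard let result = sceneView.hitTest(point, options: nil).first else {
        return nil // the user tapped empty space
    }
    return result.node.name // nil unless we tagged this node with a song title
}
```

Chaining `guard let` like this surfaces the "points to nothing" case as a plain `nil` early return rather than a crash.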
Accomplishments that we're proud of
Modifying an existing API because it doesn't work to the quality you require is a really empowering feeling!
What we learned
Nicole: This was my first iOS app, so I learned about Swift, Xcode, and CocoaPods in addition to the specifics of our project. This was also my first group coding project, so I also learned about GitHub.
Kimberly: This was my first time working on an iOS project, and I learned a lot about the differences of the platform as well as Swift as a language! We pair programmed a lot and I learned a lot about iOS-specific frameworks like SceneKit and ARKit as well as OOP in general :)
Meg: I learned a lot about 3D rendering and all of its functionality and concepts—I had never worked with SceneKit or ARKit before! I also talked to some folks and learned about some cool software engineering principles—like the real point of ivars when storing data coming in asynchronously :)
What's next for Music Memories
We have many ideas for the future of Music Memories. First, we could display the metadata in an annotation in the augmented reality, as opposed to a UILabel. Also, we could add a feature where a user could force touch on album art to begin playing the song, and the volume could be controlled by the distance between the user and the album art. We might collapse songs very near to each other into a single location, and the user could swipe to view each song. Finally, we could add a social feature where users can see what their friends have listened to in addition to their own music histories.