Inspired by this year’s unexpected task of creating a visually interactive Swift playground scene, I used the opportunity to challenge myself to work with areas of iOS development I am less familiar with. While brainstorming potential candidates for games, I wanted to create something that would function intuitively within a playground environment but would also utilise as many relevant technologies as possible. I identified that a combination of audio, animations and gestures would be key to creating an engaging playground to represent my skills. This led me to develop a musical game where the player builds their score by moving a plectrum in time with the rhythm of a song. The audio used in the game is a backing track whose rhythm and lead guitar parts I recorded myself using GarageBand.

How I built it

The design of the main game UIViewController stemmed from a concept where the background would feature a waveform animation. To create this, I relied on CoreGraphics to render the waveform lines based on a provided value. After some research and experimentation, I discovered how to convert an AVAudioPlayer’s power level into a value that could represent the signal produced by the song in the waveform.
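The conversion boils down to mapping a metered decibel reading back onto a linear amplitude. A minimal sketch of that step (the function name is illustrative, not from the original project; it assumes the player has `isMeteringEnabled = true` and `updateMeters()` is called before each read):

```swift
import Foundation

// Hypothetical helper: convert an AVAudioPlayer average power reading
// (decibels, roughly -160 dB for silence up to 0 dB for full scale)
// into a 0...1 amplitude that can drive the waveform's line height.
func normalizedPower(fromDecibels dB: Float) -> Float {
    // Silence can be reported as -infinity; treat it as zero amplitude.
    guard dB.isFinite else { return 0 }
    // 10^(dB / 20) maps decibels back to a linear amplitude.
    let linear = pow(10.0, dB / 20.0)
    return min(max(linear, 0), 1)
}
```

Feeding this value into the CoreGraphics drawing code each display frame produces a waveform that pulses with the song.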

During development, I discovered how difficult it was to target the rapidly-moving note views while using a trackpad. Traditionally, games of this style have been designed either for a touchscreen or for use with multi-button input devices. I therefore attached a UIPanGestureRecognizer to a UIImageView which represented a vertically-draggable plectrum. This instantly made the game more enjoyable and accessible, since the plectrum could be moved quickly and precisely.
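The maths behind the vertical drag is small enough to sketch (names and ranges here are illustrative, not taken from the project): the gesture recogniser reports a translation, and the plectrum's new centre is clamped so it never leaves the playfield.

```swift
import Foundation

// Sketch of the drag handling behind a vertically-draggable plectrum.
// In a UIPanGestureRecognizer handler you would call this with
// recognizer.translation(in: view).y, then reset the translation to zero.
func plectrumCenterY(current: CGFloat,
                     translationY: CGFloat,
                     in range: ClosedRange<CGFloat>) -> CGFloat {
    // Clamp the proposed centre so the plectrum stays on screen.
    return min(max(current + translationY, range.lowerBound), range.upperBound)
}
```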

The last remaining challenge was to monitor for collisions between the plectrum and the notes. Fortunately, this step was made straightforward by CGRect’s intersection tests, and I was able to filter out active note views based on their location on the screen. As notes collide with the plectrum, a combination of CoreGraphics and UIView animations indicates to the player that they have successfully scored a point.
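The collision pass can be sketched as a single filter over the note frames (the `Note` type and names here are assumptions for illustration; the real project works with the views' frames directly):

```swift
import Foundation

// Illustrative model of an on-screen note: its frame, and whether
// the player has already scored it.
struct Note {
    let frame: CGRect
    var scored = false
}

// Return the unscored, still-visible notes whose frames intersect
// the plectrum's frame — the CGRect.intersects test from the text.
func notesHit(by plectrumFrame: CGRect,
              in notes: [Note],
              screen: CGRect) -> [Note] {
    return notes.filter { note in
        !note.scored
            && screen.intersects(note.frame)       // filter out off-screen notes
            && plectrumFrame.intersects(note.frame) // plectrum touches the note
    }
}
```

Each hit would then trigger the scoring animation and mark the note as scored so it cannot be counted twice.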

Challenges I ran into

Without a doubt, the most challenging aspect of this project was accurately animating the notes so that they remained synced with the backing track as they moved across the screen. I initially intended to use relative Timers and DispatchTime deadlines, but these proved inaccurate on occasion due to the nature of those technologies. Drawing on my creativity and problem-solving skills, I developed a completely separate iOS application which would automatically generate code as I tapped along to the song. Utilising AVFoundation and gestures, I could track each note’s precise timing and pitch and instantly convert this information into lines of code to be transferred to my playground. As soon as I implemented the scrolling animation, which moved based on the song’s BPM, I had a working timeline!
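The beat-to-seconds conversion underpinning a BPM-driven timeline is simple to state (function and parameter names here are illustrative): at a given BPM, one beat lasts 60/BPM seconds, so a note on beat *n* starts *n* × (60/BPM) seconds into the song, and its horizontal position follows from multiplying that time by the scroll speed.

```swift
import Foundation

// At `bpm` beats per minute, each beat lasts 60/bpm seconds,
// so a note placed on a given beat starts at beat * (60/bpm).
func noteStartTime(beat: Double, bpm: Double) -> TimeInterval {
    return beat * (60.0 / bpm)
}

// Horizontal offset of a note at a given song time, for a timeline
// scrolling at `pointsPerSecond` (an assumed tuning constant).
func noteOffset(startTime: TimeInterval,
                songTime: TimeInterval,
                pointsPerSecond: Double) -> Double {
    return (startTime - songTime) * pointsPerSecond
}
```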

What's next for Fret Run

Since this project was developed as a scholarship application, I don't intend to work on any further updates. Having said that, it could potentially be a fun game for iOS, so I might consider releasing it at some point!
