(Gallery: Spellflow gameplay screenshot; using the application with the Leap Motion controller; a computer setup with the Leap Motion controller)
Inspiration

Our project was partially inspired by Brandon Sanderson's The Rithmatist. The book describes a game played by drawing chalk shapes, each serving a different function, with the goal of breaking the circle around your opponent. When we saw that TeenHacksLI provides Leap Motion devices, we realized Sanderson's ideas could be adapted into a similar game in Unity3D with motion controls. What immersed both of us in the story was constructing these geometric shapes and fighting one another with them, and the Leap Motion controller let us create a game where we could do just that with only our hands.
What it does
Spellflow pits two local players against each other in a race to fill the victory bar with their color (blue or red). Players use hand gestures to send bolts at each other and to raise defensive walls. Particle effects from the cursor indicate that a player has switched into drawing mode. Drawing lines and taking enemy hits to your circle both decrease your share of the victory bar. The longer the bolt you draw, the more energy from the victory bar it uses, but the faster it travels (and thus the harder it is to block); the same tradeoff applies to the length of walls. Hitting the enemy circle increases your victory progress. Pinching spawns a drone you can steer across enemy territory.
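The length tradeoff above can be sketched roughly as follows. This is a hypothetical model for illustration; the function name and constants are our invention, not the exact values used in the game:

```python
def bolt_stats(length, base_speed=2.0, speed_per_unit=1.5, cost_per_unit=0.1):
    """Hypothetical model of the bolt tradeoff: a longer bolt costs
    more victory-bar energy but travels faster, so it is harder to block."""
    speed = base_speed + speed_per_unit * length
    cost = cost_per_unit * length
    return speed, cost

# A short bolt is cheap but slow; a long bolt is costly but fast.
short_speed, short_cost = bolt_stats(1.0)
long_speed, long_cost = bolt_stats(4.0)
```

The same kind of linear scaling applies to walls: a longer wall blocks more of the arena but drains more of your victory bar when drawn.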
How we built it
We take input from the Leap Motion controller into Unity through the Leap Motion Unity assets, tracking finger and palm position and rotation. When the palm is turned sideways or up, the user "draws"; when the palm faces down again, drawing ends. We fit a best-fit line through the drawing's data points and create either a blocking wall or an attacking bolt. We then use statistics such as the length, angle, and center position of these points to instantiate GameObjects with those properties.
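The fitting step can be sketched as a plain least-squares line through the sampled points, from which the center, angle, and stroke length follow. This is a simplified Python sketch rather than our Unity C#, and the function name is illustrative:

```python
import math

def line_stats(points):
    """Fit a least-squares line through 2-D drawing samples and
    derive the statistics used to spawn a wall or bolt:
    center of the stroke, angle of the fitted line, and stroke length."""
    n = len(points)
    cx = sum(x for x, _ in points) / n
    cy = sum(y for _, y in points) / n
    # Least-squares slope via covariance of (x, y) over variance of x.
    sxx = sum((x - cx) ** 2 for x, _ in points)
    sxy = sum((x - cx) * (y - cy) for (x, y) in points)
    angle = math.atan2(sxy, sxx)  # radians; near-vertical strokes need extra care
    # Stroke length: extent of the points projected onto the fitted line.
    proj = [(x - cx) * math.cos(angle) + (y - cy) * math.sin(angle)
            for (x, y) in points]
    length = max(proj) - min(proj)
    return (cx, cy), angle, length
```

In the actual game the resulting center, angle, and length are fed into the spawned GameObject's position, rotation, and scale.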
Challenges we ran into
During the first five hours of the hackathon, we worked on a different project but decided it was not good enough. The contrast between the red and blue sections and the use of Unity carried over from this earlier project. When we finally started with the controller, we had a lot of trouble importing the Unity assets onto our computer, because it was a Mac, but eventually we found a version that worked. We then had some difficulty interpreting the raw sensor data as x and y coordinates for Unity. The sensor returned arbitrary units, so we had to write a wrapper function that converted them into values usable on the coordinate plane and in a regression algorithm. In addition, the sensor's data was very unstable and skipped around a lot, so we had to cull outliers and fit the workspace to its measurements. The next challenge was ensuring that the bolt objects spawned with the correct length and speed, which was made difficult by the fact that Unity does not procedurally generate GameObjects, so we had to convert our arbitrary coordinate system into meaningful lengths for those objects.
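The wrapper conversion amounts to a linear rescaling from the sensor's arbitrary range into the game's coordinate plane, with clamping to tame jittery samples. A minimal Python sketch, where the sensor and world ranges are hypothetical placeholders rather than the device's actual extents:

```python
def sensor_to_world(sx, sy,
                    sensor_min=(-200.0, 50.0), sensor_max=(200.0, 350.0),
                    world_min=(-8.0, -4.5), world_max=(8.0, 4.5)):
    """Map a raw sensor sample (arbitrary units) into game-world
    coordinates by linear rescaling. All range values here are
    hypothetical placeholders, not real device constants."""
    def rescale(v, lo, hi, out_lo, out_hi):
        t = (v - lo) / (hi - lo)
        t = min(max(t, 0.0), 1.0)  # clamp jittery samples to the workspace
        return out_lo + t * (out_hi - out_lo)

    wx = rescale(sx, sensor_min[0], sensor_max[0], world_min[0], world_max[0])
    wy = rescale(sy, sensor_min[1], sensor_max[1], world_min[1], world_max[1])
    return wx, wy
```

Samples that skip outside the calibrated range are clamped to the workspace edges, which doubles as a crude form of the culling described above.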