The Education and Inequality tracks inspired us to create something fun that was inclusive and educational for all types of children. Upon doing some research, we learned that children with intellectual disabilities learn best when tasks are broken down into small steps, when they perform hands-on activities, when visual aids are used in their environment, and when they receive direct and immediate feedback. These teaching methods gave us the idea for our interactive game, "LeapAhead".
What it does
Our app allows parents to sit down with their kids and help them with simple problems. The parent reads the question aloud, and the child points, using the Leap Motion sensor, to whichever answer he/she thinks is correct. If the child answers correctly, the parent clicks the next button to go to the next question.
We take the hand coordinates provided by the Leap Motion tracker and constantly stream them to the frontend in order to map the hand to a general position on the screen.
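The mapping step can be sketched as a small pure function. This is a minimal illustration, assuming the palm position has already been normalized to the [0, 1] range (the Leap Motion JS API exposes this via `frame.interactionBox.normalizePoint()`); the function name and signature are ours, not from the actual app.

```javascript
// Map a normalized Leap Motion palm position to screen pixels.
// `normalized` is [x, y, z] with each component in [0, 1].
// Leap's y axis points up while screen y grows downward, so we flip it.
function palmToScreen(normalized, screenWidth, screenHeight) {
  return {
    x: Math.round(normalized[0] * screenWidth),
    y: Math.round((1 - normalized[1]) * screenHeight)
  };
}
```

Because the child only needs to gesture toward a general area, this coarse mapping is enough; no per-finger precision is required.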
How I built it
We used AngularJS to create a frontend web app that connected to the Leap Motion sensor. We used the Leap Motion JS plugins to stream the live position data to the frontend side of the app. We then took that data and triggered events based on the most recent position of the user's hand.
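The event logic can be shown in isolation. In the running app the position arrives every frame from the Leap Motion JS plugin (e.g. inside a `Leap.loop(frame => ...)` callback); the sketch below assumes a simple two-region layout with one answer on each half of the screen, and the names are illustrative rather than taken from our code.

```javascript
// Decide which answer region the hand is currently over, given the
// most recent screen-space x coordinate. Kept sensor-free so the
// decision logic can run (and be tested) without a Leap Motion device.
function answerRegion(x, screenWidth) {
  return x < screenWidth / 2 ? 'left' : 'right';
}
```

Each frame, the app would call something like `answerRegion(position.x, window.innerWidth)` and fire a UI event when the region changes.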
Challenges I ran into
The backend sub-team dealt with challenges involving the Leap Motion sensor. The open source code had a few errors, such as reporting the wrong number of detected fingers (we initially planned to have the child point), and its sensing was not very precise. Because of this, we decided to have the user move their hand in the general direction of the correct answer, since that requires less precision. The backend team also had difficulty connecting to a server to stream the X and Y coordinates of the hand to the frontend team via WebSocket, so we went about that a different way. Initially, we wanted the child to hover their hand over a particular spot for a set amount of time (and then show a visual sticker congratulating the child for selecting the right answer), but due to time constraints we simplified that function.
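The hover-to-select idea we originally planned can be sketched as a dwell timer: the hand must stay over one answer region for a set time before it counts as a selection. This is a hypothetical sketch, not our actual implementation; timestamps are passed in explicitly so the logic runs without a sensor, and all names are illustrative.

```javascript
// Returns an update function that is fed (region, timestampMs) each
// frame. It returns the region once the hand has stayed over it for
// at least `dwellMs` milliseconds, and null otherwise.
function createDwellSelector(dwellMs) {
  let current = null; // region the hand is currently over
  let since = null;   // timestamp when the hand entered that region
  return function update(region, now) {
    if (region !== current) {
      current = region;
      since = now;
      return null;              // hand moved: restart the dwell timer
    }
    if (now - since >= dwellMs) {
      since = now;              // fire once, then re-arm
      return region;            // held long enough: select this answer
    }
    return null;
  };
}
```

On selection, the app could show the congratulatory sticker instead of requiring the parent to click.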
The frontend team had some trouble finding the appropriate framework and understanding data binding under the hood.
Accomplishments that I'm proud of
We worked well under pressure with technologies we had never really used before and still managed to realize our vision. We even tried to set up an Oculus at one point, though that may have been too ambitious.
What I learned
Firebase, connecting the backend to the frontend, and picking up new technologies on the spot.
What's next for Leap Ahead
We would allow the child to select the correct answer using the Leap Motion sensor; right now, the parent must click the next button to move on to the next question. If a child keeps selecting the wrong answer, we could give them a hint toward the right one. We could also provide characters that children are more familiar with. For example, if we asked a question about colors and one choice was yellow, we could use SpongeBob as the answer choice instead of a plain yellow block. This lets children connect familiar characters with new concepts. We could develop a reward/achievement system that would allow children to see their progress. Adding profiles for children could create a more personalized experience as well, perhaps even with a personalized greeting and encouraging phrases/quotes.