To get the best drawing experience on a computer, you usually need expensive equipment such as a graphics tablet, and for occasional users that is rarely worth the price. However, some of those users own one of the newer MacBooks, which have a large trackpad that can serve the same purpose. Current solutions use the trackpad as a mouse, but that provides neither the precision nor the accuracy needed for drawing. Our goal is to provide a precise and accurate drawing experience through motion tracking.
What it does
As with a graphics tablet, you can now use your Mac's trackpad to draw with a one-to-one mapping. Your trackpad becomes your drawing surface, letting you draw your own creations just by touching it. There is no need to click and hold while drawing, which eases the process and increases the precision of the result.
By incorporating a Leap Motion sensor, you can track your current position without actually touching the trackpad, which lets you choose the point where you want to begin or continue your drawing. At the same time, you can move between the different "pages" simply by indicating it with your hands, thanks to gesture recognition. With this UI, your trackpad will become your new favourite notebook.
How we built it
The solution consists of four main components:
The UI, an HTML5 canvas on which we draw. Motion detection for switching workspaces via the camera is also done in JavaScript on the client using WebRTC.
The event server, a Node.js/Socket.IO server that serves the static content to the browser and forwards the events from the Leap Motion and the trackpad to the UI.
The Leap Motion detection service, a Python program that reads directly from the Leap Motion sensor, does some processing (including calibration), and sends the data to the socket.
The trackpad position detector, a macOS program that tracks the position (x, y, and pressure) of the finger on top of the Magic Trackpad and sends it to the socket as well.
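To make the data flow concrete, here is a minimal sketch (the names are illustrative, not our actual code) of how a trackpad event — normalized x/y plus pressure, as forwarded over the socket — could be mapped to canvas pixel coordinates on the client, assuming the trackpad reports a bottom-left origin as Cocoa's touch APIs do:

```javascript
// Map a normalized trackpad event ({x, y, pressure} with x, y in [0, 1])
// to pixel coordinates on an HTML5 canvas. The y axis is flipped because
// the assumed trackpad origin is bottom-left while the canvas origin is
// top-left.
function trackpadToCanvas(event, canvasWidth, canvasHeight) {
  return {
    x: event.x * canvasWidth,
    y: (1 - event.y) * canvasHeight,
    pressure: event.pressure,
  };
}

// Example: a touch at the centre of the trackpad lands at the canvas centre.
const point = trackpadToCanvas({ x: 0.5, y: 0.5, pressure: 0.8 }, 800, 600);
// point → { x: 400, y: 300, pressure: 0.8 }
```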
Challenges we ran into
No one on the team had ever developed in Objective-C for the Mac, so building a whole application able to read system input events was really complex (Apple doesn't provide easy low-level access to the hardware).
Integrating all the systems and coping with large amounts of data per second led us to reduce the sampling frequencies of the different components to avoid overloading the system.
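A simple way to cap a sampling frequency is to drop events that arrive faster than a target rate. A minimal sketch of such a rate limiter (illustrative, not our production code; the timestamp is injected to keep it easy to test):

```javascript
// Keep at most one event per `intervalMs` milliseconds; extra events
// are dropped. Returns a predicate suitable for filtering a stream.
function makeThrottle(intervalMs) {
  let last = -Infinity; // time of the last event we let through
  return function shouldEmit(now) {
    if (now - last >= intervalMs) {
      last = now;
      return true;
    }
    return false;
  };
}

// Example: limit trackpad events to roughly 60 Hz (one every ~16 ms).
const shouldEmit = makeThrottle(16);
const emitted = [0, 5, 10, 16, 20, 33].filter(shouldEmit);
// emitted → [0, 16, 33]
```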
Mounting the Leap Motion sensor in an unusual position introduced challenges in detecting the hand and its parts and in calibrating its readings against the trackpad's coordinates.
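Calibration here amounts to fitting a mapping from Leap Motion coordinates to trackpad coordinates. A minimal per-axis sketch (function names and sample values are hypothetical): record the Leap reading with the finger at the trackpad's two extremes on that axis, then interpolate linearly between them:

```javascript
// Build a linear calibration for one axis from two reference samples:
// the Leap readings observed when the finger sits at the trackpad's
// minimum and maximum extent on that axis.
function makeAxisCalibration(leapAtMin, leapAtMax) {
  const scale = 1 / (leapAtMax - leapAtMin);
  // Returns a normalized position in [0, 1] across the trackpad.
  return (leapValue) => (leapValue - leapAtMin) * scale;
}

// Example: suppose the Leap reports x = -60 mm at the pad's left edge
// and x = +60 mm at its right edge.
const calibrateX = makeAxisCalibration(-60, 60);
// calibrateX(0) → 0.5 (centre of the pad)
```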
Accomplishments that we're proud of
Integrating several kinds of technologies and systems and making them all work together. Handling large amounts of data without overloading the system or introducing lag that would affect the user experience.
What we've learned
Implementing a native Mac application, using Socket.IO, and incorporating the Leap Motion and gesture recognition.
What's next for MotionDrawing
Improving position recognition before the finger touches the trackpad using the Leap Motion. We would also like to add colour selection by shaking objects in front of the camera, and to set the width of the lines through the trackpad's pressure detection.
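For the pressure-to-width idea, a possible sketch (the width range is a placeholder we would still need to tune): linearly map the trackpad's pressure reading to a stroke width, clamping the input so noisy readings never produce invalid widths:

```javascript
// Map a pressure reading in [0, 1] to a stroke width in pixels.
// Out-of-range readings are clamped before mapping.
function pressureToWidth(pressure, minWidth, maxWidth) {
  const p = Math.min(1, Math.max(0, pressure));
  return minWidth + p * (maxWidth - minWidth);
}

// Example: half pressure with widths between 1 px and 9 px.
// pressureToWidth(0.5, 1, 9) → 5
```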