Existing light painting practices make the art both difficult and inflexible. Even following the position of your own hand is next to impossible, let alone creating more advanced pieces incorporating multiple colors or intensities of light. The Light Painter’s Palette greatly simplifies this process by overlaying digital media in the real world to bring your artwork to the next level.
The Light Painter’s Palette is an intelligent light art assistance tool that automatically generates seamless color transitions as the user moves the light pen. Existing digital media can be overlaid in space to modulate the pen’s color. The Light Painter’s Palette also provides helpful on-screen guides to aid the user’s hand proprioception when creating more complicated light art, or to help beginners learn the craft.
Context of Creation
The Light Painter’s Palette was created on January 10th and 11th as part of Dragon Hacks 2015, held at Drexel University in Philadelphia, Pennsylvania. The Light Painter’s Palette was created by Christopher Frederickson, Nick Felker, and Max Bareiss.
Light painting is the art of making beautiful designs using a streak of light in a long exposure photograph. A camera is placed at a strategic location, and captures a single image over a long period of time. The combination of all light captured during this period is rendered into a single image. An individual can take advantage of this to create artistic designs in the light that is captured.
- Displays artist’s current and past hand position in real-time to help with artwork accuracy
- Positioning system works regardless of ambient light level
- Presents tracing images to enhance the creativity of the artist
- Allows for natural gesture control to place and scale tracing images
- Controlled by an ordinary laptop using standard commercial off-the-shelf hardware
- Control pen communicates over Wi-Fi to allow for untethered use
- Pen relays its current status to a HipChat room
- Pen occasionally recommends genres of music, selected from Beats Music, in the chat
During the twenty-four-hour development time, the design of the prototype progressed through four distinct phases: the prototyping stage, ending in version 1.0; the first and second polishing stages, ending in versions 1.1 and 1.2, respectively; and the final polishing stage, culminating in the design presentation. Version 1.0 of the prototype had a pen that was able to change color according to the RGB color value transmitted to it from the computer base station. The base station started as a modified version of a Kinect SDK sample and was able to transmit the artist’s hand position.
The group frequently ran into Wi-Fi issues when testing these prototypes. The first prototype had very high latency, with only a few updates per second, because the color data was broadcast to every device on the router instead of being sent only to the Edison’s IP address. Switching to sending to that one device increased the speed tremendously.
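The unicast fix can be sketched with a pair of UDP sockets. Loopback addresses are used here so the sketch is self-contained; the real system used the Edison's Wi-Fi IP, and the port and packet format are assumptions.

```python
import socket

# Hypothetical pen address; the original used the Edison's Wi-Fi IP.
PEN_ADDR = ("127.0.0.1", 5005)

# Stand-in for the pen: a UDP socket listening for color packets.
pen = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
pen.bind(PEN_ADDR)
pen.settimeout(2.0)

# Base station side: unicast one RGB packet to the pen's address only,
# rather than broadcasting it to every device on the router.
station = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
station.sendto(bytes([255, 128, 0]), PEN_ADDR)

r, g, b = pen.recvfrom(16)[0]
print(r, g, b)  # 255 128 0
```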
At first, the IP address was hardcoded, but this didn’t seem like a definitive solution. So, the Edison was given a broadcast mode, where it transmitted its IP address, port number, and other relevant information. A device like the base station could use these messages to identify the correct device. Once messages from the base station reached the Edison, broadcast mode was disabled and the devices were effectively paired.
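A discovery beacon along these lines would support the pairing described above; the JSON message layout, field names, and port are illustrative assumptions, not the actual protocol.

```python
import json
import socket

DISCOVERY_PORT = 5005  # assumed port

# Pen side: build the beacon sent while unpaired. The original
# transmitted the Edison's IP, port, and other relevant information.
def make_beacon(ip, port):
    return json.dumps({"device": "lightpen", "ip": ip, "port": port}).encode()

# Base station side: pick the correct device out of broadcast traffic.
def parse_beacon(data):
    msg = json.loads(data.decode())
    if msg.get("device") == "lightpen":
        return msg["ip"], msg["port"]
    return None

beacon = make_beacon("192.168.1.42", 5005)
# To actually broadcast it, the pen would do something like:
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
#   sock.sendto(beacon, ("255.255.255.255", DISCOVERY_PORT))
print(parse_beacon(beacon))  # ('192.168.1.42', 5005)
```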
However, this was still not a perfect solution. The closest sufficiently dark room was also far from any router, which caused latency issues between the pen and the base station. If the pen didn’t connect, rebooting and reconnecting to Wi-Fi was an arduous process.
This led to two additions that made the pen’s connectivity more adaptable. A reset button allowed the device to immediately switch back to broadcast mode, which sped up the testing process. Additionally, the device returns to broadcast mode if it hasn’t received any messages for ten seconds, making it easy to reconnect automatically.
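The fallback logic amounts to a small decision function; the mode names here are illustrative, not taken from the actual firmware.

```python
BROADCAST_TIMEOUT = 10.0  # seconds of silence before falling back

# Either the reset button or a ten-second silence sends the pen back
# to broadcast mode so the base station can rediscover it.
def next_mode(last_msg_time, now, reset_pressed):
    if reset_pressed or now - last_msg_time > BROADCAST_TIMEOUT:
        return "broadcast"
    return "paired"

print(next_mode(0.0, 5.0, False))   # paired
print(next_mode(0.0, 11.0, False))  # broadcast
print(next_mode(0.0, 5.0, True))    # broadcast (reset button pressed)
```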
This raised the issue of determining the Edison’s current mode, since its interface is essentially just a single LED. Combinations of different colors and brightness levels were added to give a clear indication of the current activity. For example, in broadcast mode the LED transitions smoothly between a dim and a bright white light, a red light indicates that a software error has occurred, and a blue light indicates that a transition is taking place.

(Figure: MLH Light Painted Logo)
Appendix A - Changelog
- Constructed pen electronics
- Edison read color values transmitted wirelessly from the base station and changed the RGB values of the pen LED
- Implemented initial base station logic
- Removed filter caps (after trying more filter caps)
- Added trim pots for the colors and calibrated the LED to produce white
- Implemented broadcast mode from the Edison to the PC
- Twisted the LED so direct light doesn’t shine into the camera (better diffusion)
- Fixed many Edison networking bugs
- Added speed gauge to the Kinect UI
- Added trail to the Kinect UI
- Added image scaling and moving, Minority Report style
- Fixed most (if not all) network bugs on the Edison; many issues now self-correct
- Added LED feedback for when the pen is not in receiving mode
- Integrated HipChat and Beats Music on the Edison