Lighting sets an appropriate atmosphere and heightens an audience's understanding of a theatrical performance. However, rehearsing and experimenting with lighting is costly and time-consuming. That is why we created LIT, a virtual lighting lab that is universally accessible, affordable, and fun! :)

What it does

LIT has two major components.

1) The LIT lighting lab renders the stage in real time in response to any change the user makes to a light's color or intensity, or to the camera perspective. The user can toggle light helpers. The spotlight palette is pre-populated with LEE's filter colors (link) for users to experiment with, and they may also save the cue configuration and download images of the scene for future reference. The user can zoom by scrolling, rotate by dragging, and pan by holding the middle mouse button and dragging.
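Saving a cue amounts to snapshotting each light's settings plus the camera, then serializing that state. Here is a minimal sketch of the idea in plain JavaScript; the property names (`helperVisible`, `target`, etc.) are illustrative assumptions, not LIT's actual schema:

```javascript
// Hypothetical cue snapshot: capture each light's color, intensity,
// and helper visibility, plus the camera, as a JSON string.
function saveCue(lights, camera) {
  return JSON.stringify({
    lights: lights.map(function (light) {
      return {
        color: light.color,          // e.g. a hex string like "#ffb3a7"
        intensity: light.intensity,  // e.g. 0.0 to 2.0
        helperVisible: light.helperVisible
      };
    }),
    camera: { position: camera.position, target: camera.target }
  });
}

// Restoring a cue is just the reverse of serialization.
function loadCue(json) {
  return JSON.parse(json);
}
```

Because the snapshot is plain JSON, the same string can be kept in memory for cue transitions or offered to the user as a download.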

2) The LIT VR Viewer puts users inside the stages they designed. It uses device orientation, so users can look around the stage with a Cardboard viewer. By saying "cue," users transition to a scene with different lighting and a different perspective. Demonstration of cue: link.
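The "cue" voice command effectively steps through a list of saved scenes. The logic can be sketched as a small cycler in plain JavaScript; the cue names here are made up for illustration, and a real hookup would feed recognized words from the speech API into the returned function:

```javascript
// Cycle through saved cues whenever the recognized word is "cue";
// any other word is ignored. Wraps back to the first cue at the end.
function makeCueCycler(cues) {
  let index = -1;
  return function onVoiceCommand(word) {
    if (word.toLowerCase() !== "cue") return null; // ignore other words
    index = (index + 1) % cues.length;             // advance and wrap
    return cues[index];
  };
}
```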

How we built it

We used three.js to set up our 3D environment. Using jQuery, we linked our Material Design Lite (MDL) UI components to the spotlight objects so users could modify the color, intensity, and light helpers. We used Python's Beautiful Soup to scrape LEE's color filters. We used JavaScript to implement cues that can be saved and to let the user download photos of the stage. We also created a VR mode by hyperlinking to a fullscreen view of the stage and using device orientation to respond to the user's movements. The annyang voice recognition library allowed the user to cue through scenes in VR mode.
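One small detail in wiring scraped filters to three.js: filter colors typically arrive as hex strings, while three.js colors can also be handled as numeric values. A tiny converter keeps both representations available (the filter names below are examples only, not LEE's real catalog data):

```javascript
// Convert a "#rrggbb" hex string to the numeric form (e.g. 0xff0000).
function hexToNumeric(hex) {
  return parseInt(hex.replace(/^#/, ""), 16); // "#ff0000" -> 0xff0000
}

// Hypothetical palette entries as they might come out of a scraper.
var palette = [
  { name: "Example Red",  hex: "#ff2a1f" },
  { name: "Example Blue", hex: "#1f5bff" }
].map(function (filter) {
  return { name: filter.name, hex: filter.hex, value: hexToNumeric(filter.hex) };
});
```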

Challenges we ran into

Three.js had many deprecated methods.

It was my first time programming in JavaScript, and I didn't realize that the ordering of scripts and function calls mattered. That led to unexpected challenges: the logic in the code was correct, but it still wouldn't run. We weren't sure whether the renderer couldn't handle that many lights, the virtual model was too big, or there was an issue with the code. We started over using three.js's WebVR scripts. However, the user would have had to enable WebVR for the site, which would have made LIT less accessible. By editing and reorganizing the code, we were able to configure VR using device orientation and a stereo effect instead.
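The ordering issue comes down to how JavaScript resolves names: function declarations are hoisted and can be called before they appear, but a function assigned to a variable is not callable until the assignment actually runs. A tiny illustration:

```javascript
// Function declarations are hoisted, so calling one before its
// definition works fine.
var a = declared();
function declared() { return "ok"; }

// Function expressions are not: before the assignment executes,
// the variable is undefined, and calling it throws a TypeError.
var failed = false;
try {
  notYet(); // throws: notYet is not a function (yet)
} catch (e) {
  failed = true;
}
var notYet = function () { return "too late"; };
```

The same principle applies at the `<script>` tag level: a script that calls into another file's functions must load after that file's definitions exist.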

Accomplishments that we're proud of

Intuitive design
VR mode / voice cues

What we learned

This was my first time programming in JavaScript, Python, and HTML. I learned how to build a 3D environment and an interactive GUI, scrape information, and enable VR and voice recognition with these tools.

What's next for LIT

This prototype is based on Penn's PAC shop: link. Here is a demonstration of LightWeb, the lighting simulation our program is based on: link. We look forward to adding more features, such as different types of spotlights and letting the user change the location of the cameras.
