Vaporizer

You take a bunch of pictures and then you thrust yourself into the aesthetic of our time and transcend the limitations of time itself.

What is this?

Vaporizer is a photo booth for today's youth. Vaporizer is the future. It is us, and we are Vaporizer.

Begin by taking anywhere between one and three pictures with your camera. Let bake for five minutes. Receive vapor.

Why is this cool?

Vaporizer is written in Python and primarily uses a video processing and editing library called moviepy. There's a bank of assets divided into the following categories:

  • primary: MP4 video files that are stretched to 1280x720 and placed in the background of all layers
  • secondary: PNGs and MP4s that are used as stickers of sorts. Any MP4s are treated as GIFVs and looped for the entirety of the scene.
  • soundtrack: WAV audio files to play (preferably the entirety of Macintosh Plus's Floral Shoppe)
  • frames: PNGs with an alpha mask that indicate where to overlay images. Currently only supports a single rectangular region. You could try other shapes but the script will, in all likelihood, just fit the images to a rectangle mask.
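The asset bank above could be scanned into those four categories with something like the following sketch (the directory layout, names, and helper are assumptions for illustration, not the script's actual code):

```python
# Hypothetical sketch: group the asset bank by category, keeping only
# the file types each category is supposed to contain.
from pathlib import Path

ASSET_ROOT = Path("assets")  # assumed layout: assets/<category>/*
CATEGORIES = {
    "primary": {".mp4"},            # backgrounds, stretched to 1280x720
    "secondary": {".png", ".mp4"},  # stickers; MP4s get looped like GIFVs
    "soundtrack": {".wav"},         # background audio
    "frames": {".png"},             # alpha-masked frame overlays
}

def load_assets(root=ASSET_ROOT):
    """Return {category: sorted list of asset paths}, filtered by extension."""
    return {
        category: sorted(
            p for p in (root / category).glob("*")
            if p.suffix.lower() in extensions
        )
        for category, extensions in CATEGORIES.items()
    }
```

Each scene can then draw randomly from these lists without caring how the files are stored on disk.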

After the user has finished taking pictures, Vaporizer will create three "scenes" where it:

  1. Fits each image into a randomly selected frame.
  2. Selects a random background and places the framed image on top of it in a random position.
  3. Randomly picks between one and three secondary assets to place on the scene.
  4. Selects a random Jaden Smith quote to place on top.
  5. Places the date of the selected quote in the bottom left.
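The five steps above boil down to a handful of random picks per scene. A minimal sketch of that selection logic (function and field names are hypothetical; the real script feeds these choices into moviepy):

```python
# Hypothetical sketch of the per-scene randomization in steps 1-5.
import random

def plan_scene(picture, assets, quotes, rng=random):
    """Pick the frame, background, stickers, and quote for one scene."""
    quote = rng.choice(quotes)                           # step 4
    return {
        "picture": picture,
        "frame": rng.choice(assets["frames"]),           # step 1
        "background": rng.choice(assets["primary"]),     # step 2
        "position": (rng.randint(0, 1280), rng.randint(0, 720)),
        "stickers": rng.sample(                          # step 3: 1-3 stickers
            assets["secondary"], k=rng.randint(1, 3)
        ),
        "quote": quote["text"],
        "quote_date": quote["date"],                     # step 5: bottom left
    }
```

Keeping the random choices separate from the rendering makes the scene plan easy to inspect (or reroll) before any expensive video work happens.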

Each scene is 10 seconds long, for a 30-second video in total. The scenes are concatenated into a single composite video, a gradient from pink to green (for added vapor) is laid over the entire composite at 20% opacity, and then this new composite is written to an MP4 using a hardware-accelerated FFmpeg codec.
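The gradient overlay is just a full-frame image faded between two colors. A sketch of how it could be built with NumPy (the exact pink and green values here are assumptions):

```python
# Hypothetical sketch of the pink-to-green vapor gradient. The real
# script composites something like this over the final video at 20% opacity.
import numpy as np

def vapor_gradient(width=1280, height=720,
                   pink=(255, 105, 180), green=(0, 255, 128)):
    """Return an RGB uint8 array fading from pink (top) to green (bottom)."""
    t = np.linspace(0.0, 1.0, height)[:, None, None]   # 0 at top, 1 at bottom
    colors = (1 - t) * np.array(pink) + t * np.array(green)
    return np.broadcast_to(colors, (height, width, 3)).astype(np.uint8)
```

In moviepy, an array like this could become an `ImageClip` with its opacity set to 0.2, layered over the concatenated scenes before the final export.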

How do I use this?

You'll need to install NumPy+Intel MKL and SciPy for most of MoviePy's processing. OpenCV is required for some other video processing, as well as for getting input from the user's webcam. NumPy and SciPy will likely need to be installed from your OS distribution or a third-party source. Other than that, everything should be resolved by running pip install moviepy.

From there, clone the Git repository, cd into it and run python cvtest.py.

You should probably know...

MoviePy does a lot of its internal processing with SciPy and NumPy. This makes it simple to work with, but the mathematical operations MoviePy performs significantly slow down processing and make the script extremely CPU-bound.

At first, this was caused mainly by the use of nested composites to overlay images. Restructuring the program to composite all layers only once, just before exporting, more than halved the render time, but a 30-second clip still takes over five minutes to render. The lack of a dedicated video card on the development machine (a laptop) may also be a factor.
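The nested-vs-flat difference can be illustrated with a toy model (stand-in functions, not real moviepy clips): each composite blends every one of its inputs into a new full frame, so wrapping composites inside composites multiplies the per-frame blending work.

```python
# Toy model of why nested composites cost more: every composite blends
# each of its input layers into a fresh intermediate frame.
blends = {"count": 0}

def layer_clip(value):
    """Stand-in for a base clip: returns a 'frame' with no extra work."""
    return lambda t: value

def composite(clips):
    """Stand-in composite: blends every input clip into a new frame."""
    def get_frame(t):
        frame = None
        for clip in clips:
            blends["count"] += 1          # one full-frame blend per layer
            frame = clip(t)               # (real code would paste arrays)
        return frame
    return get_frame

clips = [layer_clip(i) for i in range(4)]

# Nested style: overlay each new layer by wrapping the previous composite.
nested = clips[0]
for clip in clips[1:]:
    nested = composite([nested, clip])
blends["count"] = 0
nested(0.0)
nested_blends = blends["count"]           # two blends per wrapper: 6 total

# Flat style: one composite holding every layer, built just before export.
flat = composite(clips)
blends["count"] = 0
flat(0.0)
flat_blends = blends["count"]             # one blend per layer: 4 total
```

With four layers the nested stack does six blends per frame against four for the flat composite, and the gap grows with each extra layer, which is roughly consistent with the halved render time described above.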
