Inspiration

We've both built quite a bit of web software and understand that performance is key. However, we noticed that in the era of minifying JavaScript, CSS, and even HTML, the heaviest bandwidth offenders, images, seemed to be essentially left out. We realized that to get a truly snappy page (and one without blurry, mis-proportioned images), something needed to be built.

What it does

Imccelerate intercepts all GET requests and inspects them for known image types. When it detects a request for an image, it scales the image appropriately based on the user's screen size and screen density. The resulting image is then dropped into an LRU cache so that other clients with the same screen resolution enjoy a massive speed boost in page loading, up to 90% faster with 90% less bandwidth in some of our tests.

Imccelerate also features a proper image sizing system. Why request a massive, 3x, page-filling image if it's just going to be a thumbnail? We took a cue from Bootstrap and added a variety of sizes ranging from extra small to extra large, which can be seamlessly appended to each image request. Even better, some of these sizes (including a custom profile image size) feature vision logic from Microsoft Oxford.

Finally, we attempted to solve one last issue: what happens if a site comes under heavy load and still needs to deliver lots of content? To address this, we built a dynamic CDN offloading system. If an image resource becomes too heavily requested to keep serving it reliably, imccelerate offloads the image to Azure's high-powered content delivery network and seamlessly moves clients over to it, all without any interruption in service.
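As a rough illustration of the sizing system, the Bootstrap-style presets might map to target widths along these lines; the `?size=` query parameter, the label names, and the pixel widths in this sketch are assumptions for illustration, not imccelerate's actual convention.

```typescript
// Hypothetical mapping from Bootstrap-style size labels to target widths (px).
// The labels follow the description above; the pixel values are illustrative.
const SIZE_PRESETS: Record<string, number> = {
  xs: 160,      // thumbnail-sized
  sm: 320,
  md: 640,
  lg: 1280,
  xl: 1920,
  profile: 256, // custom profile size, assumed to be cropped via vision logic
};

// Resolve a requested size label to a width, falling back to the client's screen width.
function targetWidth(size: string | undefined, screenWidth: number): number {
  return (size !== undefined && SIZE_PRESETS[size]) || screenWidth;
}

// Example request: GET /images/hero.jpg?size=md
```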

How We built it

We used the Express middleware model to intercept and filter GET requests. Once we were sure we had an image, we used GraphicsMagick to process it and, when needed, sent it on for further processing with Oxford. Once we had the image as a buffer, we dropped it (along with a few other pieces of statistical information) into a least-recently-used cache. The cache takes care of evicting resources that are infrequently queried while keeping those that are fresh, and we serve the processed image to any future clients with the same resolution and device display density.
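A minimal sketch of that flow, assuming Express and the gm GraphicsMagick bindings, with a small hand-rolled LRU keyed on path plus resolution (the real project used a dedicated LRU cache package); the query parameter names, cache size, and file layout are all illustrative assumptions.

```typescript
import express from "express";
import gm from "gm";
import path from "path";

const app = express();
const IMAGE_EXTENSIONS = new Set([".jpg", ".jpeg", ".png", ".gif"]);

// Small Map-based LRU keyed on "path|width|density". The real project used a
// dedicated least-recently-used cache package; this stand-in is for illustration only.
const MAX_ENTRIES = 500;
const cache = new Map<string, Buffer>();

function cacheGet(key: string): Buffer | undefined {
  const hit = cache.get(key);
  if (hit) {
    cache.delete(key); // re-insert so the entry becomes most recently used
    cache.set(key, hit);
  }
  return hit;
}

function cacheSet(key: string, value: Buffer): void {
  cache.set(key, value);
  if (cache.size > MAX_ENTRIES) {
    // Evict the least recently used entry (the first key in insertion order).
    cache.delete(cache.keys().next().value as string);
  }
}

// Middleware: intercept GET requests for known image types and serve a resized,
// cached variant based on the client's reported screen width and pixel density.
app.use((req, res, next) => {
  if (req.method !== "GET" || !IMAGE_EXTENSIONS.has(path.extname(req.path).toLowerCase())) {
    return next();
  }

  // Hypothetical query parameters; how the client's screen size and density
  // actually reach imccelerate is not covered by this sketch.
  const screenWidth = Number(req.query.w) || 1280;
  const density = Number(req.query.dpr) || 1;
  const key = `${req.path}|${screenWidth}|${density}`;

  const cached = cacheGet(key);
  if (cached) {
    return res.type("jpg").send(cached); // this sketch re-encodes everything as JPEG
  }

  const sourceFile = path.join(__dirname, "public", req.path);
  gm(sourceFile)
    .resize(Math.round(screenWidth * density)) // scale to the device's physical pixel width
    .toBuffer("JPG", (err, buffer) => {
      if (err) return next(err);
      cacheSet(key, buffer);
      res.type("jpg").send(buffer);
    });
});

app.listen(3000);
```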

Challenges We ran into

Configuring the live CDN offload was particularly difficult, as the documentation for using Azure's storage services from Node was a bit lacking.
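One plausible shape for the offload step, assuming the azure-storage Node SDK for the blob upload, is to push a heavily requested image into Blob Storage behind the CDN and then hand out the CDN URL; the container name, CDN host, request threshold, and the maybeOffload helper are illustrative assumptions, not the project's actual code.

```typescript
import * as azure from "azure-storage";

// Reads AZURE_STORAGE_ACCOUNT / AZURE_STORAGE_ACCESS_KEY from the environment.
const blobService = azure.createBlobService();

const CONTAINER = "offloaded-images";             // hypothetical container name
const CDN_HOST = "https://example.azureedge.net"; // hypothetical CDN endpoint
const OFFLOAD_THRESHOLD = 1000;                   // requests seen so far; illustrative only

const requestCounts = new Map<string, number>();
const offloaded = new Set<string>();

// Called on each image request; returns a CDN URL once the resource has been
// offloaded, or null while it should still be served locally.
function maybeOffload(key: string, buffer: Buffer): string | null {
  const count = (requestCounts.get(key) ?? 0) + 1;
  requestCounts.set(key, count);

  if (offloaded.has(key)) {
    return `${CDN_HOST}/${CONTAINER}/${key}`;
  }
  if (count < OFFLOAD_THRESHOLD) {
    return null;
  }

  // Push the processed buffer into Blob Storage; the CDN endpoint fronts this container.
  blobService.createBlockBlobFromText(CONTAINER, key, buffer, (err) => {
    if (!err) offloaded.add(key);
  });
  return null; // keep serving locally until the upload has succeeded
}
```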

Accomplishments that We’re proud of

We were able to build something that reduces required image bandwidth by up to 90%. 90%! That's incredible! Using this technology, a site could go from slow and blurry to speedy and sharp in a matter of minutes. Furthermore, imccelerate could make development easier, since every image size originates from a single resource. No need to design 15 images for 300 different screens.

What We learned

We learned a ton about Azure, network performance monitoring, and Express middleware. We also learned how to use GraphicsMagick, an awesome image manipulation library.

What's next for imccelerate

We'd like to make imccelerate more stable and efficient, and to incorporate a dynamic quality algorithm to save a bit more bandwidth. At some point, we'd like to achieve a 95% reduction in bandwidth compared to the raw images.
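A dynamic quality pass might look something like the sketch below, reusing the gm bindings; the quality values and density cut-offs are guesses rather than a tuned algorithm.

```typescript
import gm from "gm";

// Pick a JPEG quality from the device pixel ratio: high-density screens tend to hide
// compression artifacts, so they can tolerate a lower quality setting.
// The specific cut-offs and values are illustrative assumptions.
function qualityFor(density: number): number {
  if (density >= 3) return 55;
  if (density >= 2) return 65;
  return 80;
}

function compress(sourceFile: string, width: number, density: number): Promise<Buffer> {
  return new Promise((resolve, reject) => {
    gm(sourceFile)
      .resize(Math.round(width * density))
      .quality(qualityFor(density))
      .toBuffer("JPG", (err, buffer) => (err ? reject(err) : resolve(buffer)));
  });
}
```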
