Artists With Souls (AWS)

Inspiration

As AI models continue to evolve, the line between inspiration and scraping has become increasingly blurred. Many creators feel that the essence of their work is being absorbed into massive training datasets without permission or attribution. Artists With Souls (AWS) was created as a form of digital self-defense — a way for artists to protect their visual identity and retain control over how their work is used online.

How We Built It

AWS is built around a local-first processing pipeline designed for privacy, speed, and accessibility.

The Adversarial Engine

At the core of AWS is an adversarial machine learning pipeline designed to disrupt computer vision models commonly used for large-scale image scraping and dataset generation. We used a pre-trained ResNet-50 model as a proxy for modern vision architectures and computed the gradient of the model’s loss with respect to the input image.

We implemented two primary attack methods:

  • FGSM (Fast Gradient Sign Method): a single-step perturbation method chosen for its speed.
  • PGD (Projected Gradient Descent): a stronger iterative attack that maintains effectiveness even after compression or minor image transformations.

The perturbation process follows the adversarial formulation:

$$ x_{adv} = \text{Clip}_{x,\epsilon}\left\{ x + \epsilon \cdot \text{sign}\big(\nabla_x L(\theta, x, y)\big) \right\} $$

The resulting changes are nearly imperceptible to the human eye but significantly interfere with how neural networks interpret the image.
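The two attacks can be sketched in PyTorch as follows. This is a minimal illustration of the standard FGSM and PGD formulations rather than the project's actual code; the function names and hyperparameters are our own, and any classifier with a cross-entropy loss can stand in for the ResNet-50 surrogate.

```python
import torch
import torch.nn.functional as F

def fgsm(model, x, y, eps):
    """Single-step FGSM: move x along the sign of the loss gradient."""
    x = x.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(x), y)
    grad = torch.autograd.grad(loss, x)[0]
    return (x + eps * grad.sign()).clamp(0, 1).detach()

def pgd(model, x, y, eps, alpha, steps):
    """Iterative PGD: repeated gradient-sign steps, each projected
    back into the L-infinity eps-ball around the original image."""
    x_adv = x.clone().detach()
    for _ in range(steps):
        x_adv.requires_grad_(True)
        loss = F.cross_entropy(model(x_adv), y)
        grad = torch.autograd.grad(loss, x_adv)[0]
        with torch.no_grad():
            x_adv = x_adv + alpha * grad.sign()
            x_adv = x + (x_adv - x).clamp(-eps, eps)  # project into eps-ball
            x_adv = x_adv.clamp(0, 1)                 # keep valid pixel range
    return x_adv.detach()
```

The projection step is what makes PGD more robust: however many iterations run, the final image never drifts more than `eps` from the original in any pixel.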

Grad-CAM Visualization

To make the effects measurable and transparent, AWS integrates Grad-CAM (Gradient-weighted Class Activation Mapping). This allows users to visualize where an AI model focuses its attention before and after protection is applied.

In many cases, the model’s attention shifts away from the primary subject toward irrelevant background regions after processing, demonstrating how the adversarial perturbations disrupt feature recognition.

Web Stack

AWS was designed as a lightweight full-stack application:

  • Backend: Flask (Python) handles tensor operations, adversarial generation, and in-memory image processing.
  • Frontend: A custom interface built with Spline 3D and modern CSS, using a glassmorphism-inspired design system to create a clean and interactive experience.
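An in-memory Flask endpoint of the kind described might look like this. It is a hedged sketch: the route name and the placeholder where the adversarial pass would run are assumptions, but it shows the key privacy property, namely that the upload is decoded, processed, and returned without ever touching disk.

```python
import io
from flask import Flask, request, send_file
from PIL import Image

app = Flask(__name__)

@app.route("/protect", methods=["POST"])
def protect():
    # Decode the upload entirely in memory -- nothing is written to disk.
    img = Image.open(request.files["image"].stream).convert("RGB")
    protected = img  # placeholder: the adversarial pass would run here
    buf = io.BytesIO()
    protected.save(buf, format="PNG")
    buf.seek(0)
    return send_file(buf, mimetype="image/png")
```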

What We Learned

Building AWS gave us practical insight into the fragility of modern deep learning systems. Even highly capable vision models can be disrupted by extremely small, carefully structured perturbations.

We also learned the importance of optimization when deploying machine learning systems on CPU-based infrastructure. Early versions of the pipeline were too computationally expensive for real-time use, which pushed us to redesign the processing workflow around a lower-resolution adversarial pass.

Challenges We Faced

Performance

Our initial implementation processed full-resolution images directly, resulting in 20+ second processing times for high-resolution uploads on CPU hardware.

To solve this, we redesigned the pipeline to generate perturbations on a downscaled 512px tensor before upscaling and blending the noise back into the original image. This reduced processing time by more than 90% while preserving attack effectiveness.
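The downscale-perturb-upscale-blend workflow can be sketched as below, assuming PyTorch tensors in NCHW layout. The `perturb_fn` argument stands in for whichever attack (FGSM or PGD) is being run; only the noise, not the downscaled image itself, is upsampled and blended back, so full-resolution detail is preserved.

```python
import torch
import torch.nn.functional as F

def low_res_protect(x, perturb_fn, size=512):
    """Run the adversarial pass at low resolution, then blend the
    upscaled noise back into the original full-resolution image."""
    _, _, h, w = x.shape
    x_small = F.interpolate(x, size=(size, size),
                            mode="bilinear", align_corners=False)
    # Isolate the perturbation itself at low resolution.
    noise_small = perturb_fn(x_small) - x_small
    # Upscale only the noise and add it to the untouched original.
    noise = F.interpolate(noise_small, size=(h, w),
                          mode="bilinear", align_corners=False)
    return (x + noise).clamp(0, 1)
```

Because the expensive gradient computation runs on a fixed 512×512 tensor, the cost no longer scales with the upload's resolution.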

Visual Fidelity

Balancing protection strength with image quality was one of the hardest parts of the project. Stronger perturbations improved robustness but introduced visible artifacts.

To address this, we implemented a live PSNR (Peak Signal-to-Noise Ratio) metric so users could objectively evaluate the tradeoff between visual quality and adversarial strength.
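PSNR is a short computation; a sketch in NumPy for 8-bit images is below. Higher values mean the protected image is closer to the original; identical images give infinite PSNR.

```python
import numpy as np

def psnr(original, processed, max_val=255.0):
    """Peak Signal-to-Noise Ratio (dB) between two same-shape uint8 images."""
    diff = original.astype(np.float64) - processed.astype(np.float64)
    mse = np.mean(diff ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)
```

In practice, values above roughly 35–40 dB correspond to perturbations most viewers cannot spot, which is the regime the live metric helps users stay in.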

What’s Next

We plan to expand AWS beyond a single-model approach by implementing ensemble-based adversarial generation. Instead of targeting only ResNet-50, future versions will generate perturbations effective across multiple architectures, including Vision Transformers (ViTs) and convolutional networks.
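One common way to build such ensemble perturbations is to average the loss gradient across several surrogate models before taking the sign step. The sketch below is an assumption about the planned approach, not the shipped code:

```python
import torch
import torch.nn.functional as F

def ensemble_fgsm(models, x, y, eps):
    """FGSM driven by the mean gradient over several surrogate models,
    which tends to transfer better across unseen architectures."""
    x = x.clone().detach().requires_grad_(True)
    loss = sum(F.cross_entropy(m(x), y) for m in models) / len(models)
    grad = torch.autograd.grad(loss, x)[0]
    return (x + eps * grad.sign()).clamp(0, 1).detach()
```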

Our long-term goal is to provide creators with a more universal layer of protection against automated scraping and unauthorized AI training pipelines.
