Inspiration

From my previous experiences (internships, projects), I've realized that many data scientists, including seasoned industry professionals, take many foundational concepts for granted.

I've been working a lot with PyTorch recently and wanted to figure out the inner workings of its autograd module; more specifically, how machines compute gradients efficiently. This also opens the door to a deeper understanding of things like the vanishing gradient problem and why you need to call `.zero_grad()` at each step of your training loop.

This was a challenge, however, as the library is extremely large and its kernels are written in C++ (CPU) and CUDA (GPU), which are harder to read.

What it does

A VisuAlgo-style step-by-step visualization of backpropagation through a compute graph.

Given a compute graph, for example:

e = a + b
d = e + c
L = d + f

It computes the gradient of each node with respect to `L`, i.e. the final node that no other node depends on. In practice, this would typically be the loss function.
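Since every op in this example graph is an addition, the chain rule passes each upstream gradient through unchanged, so every node ends up with gradient 1. A minimal sketch of that reverse pass, with hypothetical names (this is an illustration, not the app's code):

```typescript
// Reverse-mode pass over the example graph: e = a + b, d = e + c, L = d + f.
// Each entry holds dL/d(node); for addition, each input's local derivative
// is 1, so the upstream gradient is simply copied to both inputs.
type Grads = Record<string, number>;

function backwardExample(): Grads {
  const grad: Grads = { L: 1 }; // seed: dL/dL = 1
  // L = d + f  =>  dL/dd = dL/dL, dL/df = dL/dL
  grad.d = grad.L;
  grad.f = grad.L;
  // d = e + c  =>  dL/de = dL/dd, dL/dc = dL/dd
  grad.e = grad.d;
  grad.c = grad.d;
  // e = a + b  =>  dL/da = dL/de, dL/db = dL/de
  grad.a = grad.e;
  grad.b = grad.e;
  return grad;
}
```

With a multiplication anywhere in the graph, the upstream gradient would instead be scaled by the other operand's value, which is where the interesting visual differences show up.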

The user can then step through each computation manually, or cycle through the entire queue automatically with a customizable sleep between steps.
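The auto-play idea can be sketched as a queue of discrete backward steps drained with a configurable delay. The names and shapes below are illustrative assumptions, not the app's actual API:

```typescript
// Each step names the node being updated and the gradient it receives.
type Step = { node: string; grad: number };

const sleep = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

// Drain the step queue, invoking a UI callback (e.g. highlight the node and
// display its gradient) and pausing between steps.
async function playQueue(
  steps: Step[],
  onStep: (s: Step) => void,
  delayMs = 500,
): Promise<void> {
  for (const step of steps) {
    onStep(step);
    await sleep(delayMs);
  }
}
```

Manual stepping is the same loop without the timer: pop one step per button click.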

How we built it

The simple scalar differentiation engine is modelled after Andrej Karpathy's micrograd Python package, rewritten in TypeScript.
It implements basic UnaryOps (tanh, ReLU, and exp) and BinaryOps (+, *, etc.).
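A minimal sketch of what such a scalar engine looks like in TypeScript, in the spirit of micrograd (this is an assumption-laden illustration, not the project's actual implementation). Note that gradients accumulate with `+=`, which is exactly why PyTorch requires `.zero_grad()` between training steps:

```typescript
// Scalar autograd value: stores data, its gradient, and a closure that
// propagates the chain rule to its inputs.
class Value {
  grad = 0;
  private _backward: () => void = () => {};
  constructor(public data: number, private prev: Value[] = []) {}

  add(other: Value): Value {
    const out = new Value(this.data + other.data, [this, other]);
    out._backward = () => {
      this.grad += out.grad;  // d(a+b)/da = 1
      other.grad += out.grad; // d(a+b)/db = 1
    };
    return out;
  }

  mul(other: Value): Value {
    const out = new Value(this.data * other.data, [this, other]);
    out._backward = () => {
      this.grad += other.data * out.grad; // d(a*b)/da = b
      other.grad += this.data * out.grad; // d(a*b)/db = a
    };
    return out;
  }

  tanh(): Value {
    const t = Math.tanh(this.data);
    const out = new Value(t, [this]);
    out._backward = () => {
      this.grad += (1 - t * t) * out.grad; // d tanh(x)/dx = 1 - tanh²(x)
    };
    return out;
  }

  backward(): void {
    // Topologically sort the graph, then apply the chain rule in reverse.
    const topo: Value[] = [];
    const visited = new Set<Value>();
    const build = (v: Value) => {
      if (visited.has(v)) return;
      visited.add(v);
      for (const p of v.prev) build(p);
      topo.push(v);
    };
    build(this);
    this.grad = 1; // seed: dL/dL = 1
    for (const v of topo.reverse()) v._backward();
  }
}
```

Because a `Value` can feed into multiple downstream nodes, its gradient must accumulate contributions rather than be overwritten; forgetting to reset it between passes silently compounds gradients.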

Stack

  • @xyflow/svelte: Graph UI
  • @dagrejs/dagre: Graph Layout
  • shadcn-svelte: UI
  • tailwindcss: Styling

Challenges we ran into

  • Time constraints
  • A clash in abstractions (micrograd vs. svelte-flow) meant I needed to constantly serialize and deserialize the custom `Value` objects when applying logic and managing state.
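The friction comes from svelte-flow wanting plain, JSON-serializable node objects while an autograd graph holds methods and object references. A sketch of the round-trip, with hypothetical shapes (not the app's actual types):

```typescript
// Flow-side node shape: flat, serializable, keyed by string id.
interface FlowNode {
  id: string;
  data: { value: number; grad: number };
}

// Engine-side snapshot: object references to parents flattened into ids.
interface PlainValue {
  id: string;
  data: number;
  grad: number;
  prevIds: string[];
}

function toFlowNode(v: PlainValue): FlowNode {
  return { id: v.id, data: { value: v.data, grad: v.grad } };
}

function fromFlowNode(n: FlowNode, prevIds: string[] = []): PlainValue {
  return { id: n.id, data: n.data.value, grad: n.data.grad, prevIds };
}
```

Every state update means crossing this boundary in both directions, which is the "constant serialize/deserialize" cost mentioned above.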

Accomplishments that we're proud of

A fully working app (mobile-responsive as well!) deployed in 24 hours, thanks to Cloudflare Pages for git-based CI/CD.

What we learned

  • How to write a scalar autograd engine from scratch
  • Interactive graph visualization with Svelte
  • Math + OOP in TypeScript is not great
  • I miss Python keyword arguments

What's next for autograd-ui

Due to time constraints, these items are left on my to-do list:

  • [ ] Dynamically update `grad` in each `Value` instead of showing it instantly ahead of time
  • [ ] Custom inputs (add your own nodes & edges)
