Inspiration

The inspiration for sotorch came when I started implementing visual odometry for some experiments in my MSc course. I realized I could save a lot of time by using PyTorch's autograd for automatic differentiation instead of implementing the Jacobians and Hessians by hand. I also found that, for this type of problem (which requires second-order optimization), the optimization algorithms implemented in SciPy worked better than the optimizers that ship with PyTorch: I tried different optimizers from both libraries, and SciPy's L-BFGS-B performed best for me. So I decided to combine those strengths of each library: autograd from PyTorch and the optimizers from SciPy.

What it does

sotorch saves the user the work of writing Jacobians and Hessians by hand: it uses PyTorch to obtain analytic Jacobians and Hessians (in most cases PyTorch provides the analytical gradients). Analytic gradients are preferable to their numeric counterparts in both computational cost and precision, since finite-difference approximations require extra function evaluations and introduce approximation error. The user is free to define any objective function, as long as it is composed of differentiable PyTorch operations.
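
As a minimal sketch of what "a differentiable objective" means here (this is plain PyTorch, not sotorch's own API), the classic Rosenbrock function can be written from torch operations, and autograd then yields the analytic gradient with no hand-written derivatives:

```python
import torch

def rosenbrock(x: torch.Tensor) -> torch.Tensor:
    # Built entirely from differentiable torch operations.
    return torch.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

x = torch.tensor([0.5, -0.5, 1.5], requires_grad=True)
loss = rosenbrock(x)
loss.backward()   # autograd derives the analytic gradient
print(x.grad)     # no hand-written Jacobian required
```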

How we built it

We borrowed some ideas from a GitHub issue discussion in the PyTorch repository (properly referenced in sotorch's repository). We also expose almost the same interface as scipy.optimize.minimize, so sotorch is essentially a wrapper around scipy.optimize.minimize equipped with automatic Jacobians and Hessians.
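
Conceptually, the wrapping works something like the sketch below. This is a simplified reconstruction of the idea, not sotorch's actual source; the helper name minimize_with_autograd is hypothetical. The optimizer's NumPy iterate is converted to a tensor, the torch objective is evaluated, and autograd supplies the analytic gradient that SciPy expects through the jac mechanism:

```python
import numpy as np
import torch
from scipy.optimize import minimize

def rosenbrock(x):
    # Same differentiable objective as in the sketch above.
    return torch.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def minimize_with_autograd(f, x0, method="L-BFGS-B", **kwargs):
    # Hypothetical helper: bridge a torch objective to scipy.optimize.minimize.
    def fun(x_np):
        x = torch.tensor(x_np, dtype=torch.float64, requires_grad=True)
        y = f(x)
        (grad,) = torch.autograd.grad(y, x)
        return y.item(), grad.numpy()  # SciPy expects floats/ndarrays

    # jac=True tells SciPy that fun returns (value, gradient).
    return minimize(fun, x0, jac=True, method=method, **kwargs)

res = minimize_with_autograd(rosenbrock, np.zeros(3))
print(res.x)  # converges to approximately [1., 1., 1.]
```

A fuller version could also pass a hess callback, for example one built with torch.autograd.functional.hessian, for Hessian-based methods such as trust-constr; again, this only sketches the approach, not sotorch's actual code.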

Challenges we ran into

We had to complete the first version of sotorch and make this submission while juggling a lot of parallel work at our workplace. It was a challenge, but we are proud to have completed it.

Accomplishments that we're proud of

We were proud to see sotorch imported like a "proper module", since packaging a library this way was a novelty for us.

What we learned

We had to do some things that were new to us, such as adding a license file and building an installable wheel so that sotorch can be used as a module. Now we want to publish it in a package repository.

What's next for sotorch

We would like to extend sotorch to other optimizers, allow the use of custom optimizers, add support for Jacobian-vector and Hessian-vector products, and support parallelization on the GPU.
