API Documentation

Functional API

The functional API provides an interface similar to SciPy's optimize module and MATLAB's fminunc/fmincon routines. Parameters are provided as a single torch Tensor, and an OptimizeResult instance is returned that includes the optimized parameter value along with other useful information, such as the final function value and the parameter gradient.

The functional API has three core utilities, each designed for a distinct class of numerical optimization problem.

Unconstrained minimization

minimize(fun, x0, method[, max_iter, tol, …])

Minimize a scalar function of one or more variables.

The minimize() function is a general utility for unconstrained minimization. It implements a number of different routines based on Newton and Quasi-Newton methods for numerical optimization. The following methods are supported, accessed via the method argument:

  • BFGS / L-BFGS

  • Conjugate gradient (CG)

  • Newton conjugate gradient (NCG)

  • Newton exact

  • Dogleg

  • Trust-region exact

  • Trust-region NCG

  • Trust-region GLTR (Krylov)
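As a quick illustration, the sketch below minimizes the Rosenbrock function with BFGS. The import path and the OptimizeResult fields (x, fun) follow the library's conventions as described in this section; treat any detail not stated above as an assumption.

    import torch
    from torchmin import minimize

    # Rosenbrock function; gradients come from autograd, so no
    # analytic derivatives are required.
    def rosen(x):
        return torch.sum(100 * (x[1:] - x[:-1] ** 2) ** 2
                         + (1 - x[:-1]) ** 2)

    x0 = torch.tensor([1., 8.])
    res = minimize(rosen, x0, method='bfgs')
    print(res.x)    # optimized parameters (should approach all ones)
    print(res.fun)  # final function value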

Constrained minimization

minimize_constr(f, x0[, constr, bounds, …])

Minimize a scalar function of one or more variables subject to bounds and/or constraints.

The minimize_constr() function is a general utility for constrained minimization. Algorithms for constrained minimization apply Newton and Quasi-Newton methods to the Karush-Kuhn-Tucker (KKT) conditions of the constrained optimization problem.

Note

The minimize_constr() function is currently in early beta. Unlike minimize(), which uses a custom, pure PyTorch backend, the constrained solver is a wrapper around SciPy's 'trust-constr' minimization method. CUDA tensors are supported, but CUDA is used only for function and gradient evaluations; the remaining solver computations are performed on the CPU with NumPy arrays.
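A minimal sketch of a constrained solve is shown below. The dict-based constr specification (a fun with lb/ub limits) is an assumed format; consult the minimize_constr() reference for the exact schema.

    import torch
    from torchmin import minimize_constr

    def f(x):
        return torch.sum(x ** 4)

    x0 = torch.randn(3)

    # Constrain x to the unit sphere by pinning x.dot(x) to [1, 1].
    # The dict-style spec below is an assumed format.
    res = minimize_constr(
        f, x0,
        constr=dict(fun=lambda x: x.square().sum(), lb=1., ub=1.),
        max_iter=50,
    )
    print(res.x)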

Nonlinear least-squares

least_squares(fun, x0[, bounds, method, …])

Solve a nonlinear least-squares problem with bounds on the variables.

The least_squares() function is a specialized utility for nonlinear least-squares minimization problems. Algorithms for least-squares revolve around the Gauss-Newton method, a modification of Newton’s method tailored to residual sum-of-squares (RSS) optimization. The following methods are currently supported:

  • Trust-region reflective

  • Dogleg - COMING SOON

  • Gauss-Newton line search - COMING SOON
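For example, the sketch below fits a two-parameter exponential decay with the trust-region reflective method. The method string 'trf' mirrors SciPy's naming and is an assumption here.

    import torch
    from torchmin import least_squares

    # Synthetic data from y = 2 * exp(-1.5 * t).
    t = torch.linspace(0, 1, 20)
    y = 2.0 * torch.exp(-1.5 * t)

    # least_squares expects fun to return the residual vector; the
    # solver minimizes its sum of squares.
    def residuals(p):
        a, b = p
        return a * torch.exp(-b * t) - y

    p0 = torch.tensor([1.0, 1.0])
    res = least_squares(residuals, p0, method='trf')
    print(res.x)  # should approach (2.0, 1.5)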

Optimizer API

The optimizer API provides an alternative interface modeled on PyTorch's optim module. It follows the design of standard PyTorch optimizers and will be familiar to anyone who has used torch.optim.

Minimizer(params[, method])

A general-purpose PyTorch optimizer for unconstrained function minimization.

Minimizer.step(closure)

Perform an optimization step.

The Minimizer class inherits from torch.optim.Optimizer and constructs an object that holds the state of the provided variables. Unlike the functional API, which expects parameters as a single Tensor, Minimizer accepts parameters as an iterable of Tensors, following the torch.optim convention. The class serves as a wrapper for torchmin.minimize() and can use any of its methods (selected via the method argument) to perform unconstrained minimization.
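The sketch below fits a small linear least-squares model through the optimizer interface. The closure simply returns the loss; gradients are assumed to be computed internally via autograd, so no backward() call appears here (check the class reference if your version differs).

    import torch
    from torchmin import Minimizer

    # Least-squares fit of a linear model, driven through the
    # optimizer-style interface.
    A = torch.randn(20, 3)
    b = torch.randn(20)
    x = torch.zeros(3, requires_grad=True)

    optimizer = Minimizer([x], method='l-bfgs')

    def closure():
        optimizer.zero_grad()
        # Return the loss; differentiation is handled internally
        # (assumed behavior), so no backward() call is needed.
        return torch.sum((A @ x - b) ** 2)

    optimizer.step(closure)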

ScipyMinimizer(params[, method, bounds, …])

A PyTorch optimizer for constrained & unconstrained function minimization.

ScipyMinimizer.step(closure)

Perform an optimization step.

Although the Minimizer class will be sufficient for most problems where torch optimizers would be used, it does not support constraints. A second optimizer, ScipyMinimizer, is provided for this purpose; it supports parameter bounds and linear/nonlinear constraint functions. This optimizer is a wrapper around scipy.optimize.minimize(). When using bound constraints, bounds are passed as an iterable with the same length as params, i.e. one bound specification per parameter Tensor.
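A sketch with per-tensor bounds is shown below. The (lb, ub) tuple format for each parameter Tensor is an assumed convention; see the ScipyMinimizer reference for the exact specification.

    import torch
    from torchmin import ScipyMinimizer

    A = torch.randn(20, 2)
    b = torch.randn(20)

    # Two parameter tensors, hence two bound specifications.
    w = torch.zeros(2, requires_grad=True)
    c = torch.zeros(1, requires_grad=True)

    optimizer = ScipyMinimizer(
        [w, c],
        method='trust-constr',
        bounds=[(0., 1.), (None, None)],  # assumed (lb, ub) per tensor
    )

    def closure():
        optimizer.zero_grad()
        return torch.sum((A @ w + c - b) ** 2)

    optimizer.step(closure)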