Differentiation

Documentation for Manifolds.jl's methods and types for finite differences and automatic differentiation.

Differentiation backends

Manifolds.ODEExponentialRetraction (Type)
ODEExponentialRetraction{T<:AbstractRetractionMethod, B<:AbstractBasis} <: AbstractRetractionMethod

Approximate the exponential map on the manifold by evaluating at time 1 the geodesic obtained, for the default connection of the given manifold, by solving the ordinary differential equation

\[\frac{d^2}{dt^2} p^k + Γ^k_{ij} \frac{d}{dt} p^i \frac{d}{dt} p^j = 0,\]

where $Γ^k_{ij}$ are the Christoffel symbols of the second kind, and the Einstein summation convention is assumed.

See solve_exp_ode for further details.

Constructor

ODEExponentialRetraction(
    r::AbstractRetractionMethod,
    b::AbstractBasis=DefaultOrthogonalBasis(),
)

Generate the retraction with a retraction to use internally (for some approaches) and a basis for the tangent space(s).
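
A minimal sketch of constructing such a retraction is given below; the manifold M, point p, and tangent vector X in the commented call are assumptions, and actually evaluating retract with it additionally requires that the Christoffel symbols of M can be computed (see solve_exp_ode) and, typically, that an ODE solver package such as OrdinaryDiffEq.jl is loaded.

using Manifolds

r = ODEExponentialRetraction(ProjectionRetraction(), DefaultOrthonormalBasis())
# q = retract(M, p, X, r)   # evaluates the geodesic ODE at time 1 (assumptions above)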

source
Manifolds._derivative (Function)
_derivative(f, t[, backend::AbstractDiffBackend])

Compute the derivative of a callable f at time t using the given backend, an object of type Manifolds.AbstractDiffBackend. If the backend is not explicitly specified, it is obtained using the function default_differential_backend.

This function calculates plain Euclidean derivatives; for Riemannian differentiation see, for example, differential.

Note

Not specifying the backend explicitly will usually result in a type instability and decreased performance.
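
A minimal sketch, assuming the default backend (e.g. finite differences) is available:

using Manifolds

c(t) = [sin(t), cos(t), t]                # a curve in ℝ³
Manifolds._derivative(c, 0.0)             # ≈ [1.0, 0.0, 1.0]
# passing the backend explicitly avoids the type instability mentioned above
Manifolds._derivative(c, 0.0, Manifolds.default_differential_backend())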

source
Manifolds._gradient (Function)
_gradient(f, p[, backend::AbstractDiffBackend])

Compute the gradient of a callable f at point p using the given backend, an object of type AbstractDiffBackend. If the backend is not explicitly specified, it is obtained using the function default_differential_backend.

This function calculates plain Euclidean gradients; for Riemannian gradient calculation see, for example, gradient.

Note

Not specifying the backend explicitly will usually result in a type instability and decreased performance.
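
A minimal sketch, assuming the default backend is available:

using Manifolds

f(p) = sum(abs2, p)                       # f(p) = ‖p‖²
Manifolds._gradient(f, [1.0, 2.0, 3.0])   # ≈ [2.0, 4.0, 6.0]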

source
Manifolds._jacobian (Function)
_jacobian(f, p[, backend::AbstractDiffBackend])

Compute the Jacobian of a callable f at point p using the given backend, an object of type AbstractDiffBackend. If the backend is not explicitly specified, it is obtained using the function default_differential_backend.

This function calculates plain Euclidean Jacobians; for Riemannian gradient calculation see, for example, gradient.

Note

Not specifying the backend explicitly will usually result in a type instability and decreased performance.
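
A minimal sketch, assuming the default backend is available:

using Manifolds

F(p) = [p[1]^2, p[1] * p[2]]
Manifolds._jacobian(F, [1.0, 2.0])        # ≈ [2.0 0.0; 2.0 1.0]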

source

ForwardDiff.jl

FiniteDifferences.jl

Riemannian differentiation backends

Manifolds.RiemannianProjectionBackend (Type)
RiemannianProjectionBackend <: AbstractRiemannianDiffBackend

This backend computes the differentiation in the embedding, which is currently limited to the gradient. Let $\mathcal M$ denote a manifold embedded in some $ℝ^m$, where $m$ is usually (much) larger than the manifold dimension. Then we require three tools:

  • A function $f̃: ℝ^m → ℝ$ such that its restriction to the manifold yields the cost function $f$ of interest.
  • A project function to project tangent vectors from the embedding (at $T_pℝ^m$) back onto the tangent space $T_p\mathcal M$. This also includes possible changes of the representation of the tangent vector (e.g. in the Lie algebra or in a different data format).
  • A change_representer for non-isometrically embedded manifolds, i.e. where the tangent space $T_p\mathcal M$ of the manifold does not inherit its inner product from the restriction of the inner product on the tangent space $T_pℝ^m$ of the embedding.

For more details see [AbsilMahonySepulchre2008], Section 3.6.1 for a derivation on submanifolds.
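
A minimal sketch on the sphere, which is isometrically embedded in ℝ³ (so change_representer is trivial here); the constructor taking a Euclidean backend is an assumption:

using Manifolds

M = Sphere(2)
f(p) = p[1]                               # restriction of f̃(x) = x[1] to the sphere
p = [0.0, 0.0, 1.0]
rpb = Manifolds.RiemannianProjectionBackend(Manifolds.default_differential_backend())
Manifolds.gradient(M, f, p, rpb)          # ≈ [1.0, 0.0, 0.0], a tangent vector at p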

source
Manifolds.TangentDiffBackend (Type)
TangentDiffBackend <: AbstractRiemannianDiffBackend

A backend that uses a tangent space and a basis therein to derive an intrinsic differentiation scheme.

Since it works in a tangent space, methods might require a retraction and an inverse retraction as well as a basis.

In the tangent space itself, this backend then employs a (Euclidean) AbstractDiffBackend.

Constructor

TangentDiffBackend(diff_backend)

where diff_backend is an AbstractDiffBackend to be used on the tangent space.

The keyword arguments can be used to specify the retraction, the inverse retraction, and the basis mentioned above.
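
A minimal sketch of using this backend for a Riemannian gradient; the positional constructor with the default retraction and basis is assumed:

using Manifolds

M = Sphere(2)
f(p) = p[1]
p = [0.0, 0.0, 1.0]
tb = Manifolds.TangentDiffBackend(Manifolds.default_differential_backend())
Manifolds.gradient(M, f, p, tb)           # ≈ [1.0, 0.0, 0.0]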

source
Manifolds.differential (Method)
differential(M::AbstractManifold, f, t::Real, backend::AbstractRiemannianDiffBackend)
differential!(M::AbstractManifold, f, X, t::Real, backend::AbstractRiemannianDiffBackend)

Compute the Riemannian differential of a curve $f: ℝ\to \mathcal M$ on a manifold M, represented by the function f, at time t using the given backend. It is calculated as the tangent vector equal to $\mathrm{d}f_t(t)[1]$.

The mutating variant computes the differential in place of X.
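
A minimal sketch computing the velocity of a curve on the sphere; a TangentDiffBackend wrapping the default Euclidean backend is assumed as the Riemannian backend here:

using Manifolds

M = Sphere(2)
c(t) = [sin(t), 0.0, cos(t)]              # a curve on the sphere
tb = Manifolds.TangentDiffBackend(Manifolds.default_differential_backend())
Manifolds.differential(M, c, 0.0, tb)     # ≈ [1.0, 0.0, 0.0], a tangent vector at c(0)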

source
Manifolds.gradient (Method)
gradient(M::AbstractManifold, f, p, backend::AbstractRiemannianDiffBackend)
gradient!(M::AbstractManifold, f, X, p, backend::AbstractRiemannianDiffBackend)

Compute the Riemannian gradient $∇f(p)$ of a real-valued function $f:\mathcal M \to ℝ$ at point p on the manifold M using the specified AbstractRiemannianDiffBackend.

The mutating variant computes the gradient in place of X.
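
A minimal sketch of the mutating variant; allocating the result with zero_vector is just one way to obtain a container of the right type:

using Manifolds

M = Sphere(2)
f(p) = p[1]
p = [0.0, 0.0, 1.0]
rpb = Manifolds.RiemannianProjectionBackend(Manifolds.default_differential_backend())
X = zero_vector(M, p)
Manifolds.gradient!(M, f, X, p, rpb)      # afterwards X ≈ [1.0, 0.0, 0.0]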

source
Manifolds.gradient (Method)
gradient(M, f, p, backend::TangentDiffBackend)

This method uses the internal backend.diff_backend (Euclidean) on the function

\[f(\operatorname{retr}_p(\cdot))\]

which is given on the tangent space. In detail, the gradient can be written in terms of the backend.basis. We illustrate it here for an AbstractOrthonormalBasis, since that simplifies the notation:

\[\operatorname{grad}f(p) = \sum_{i=1}^{d} g_p(\operatorname{grad}f(p),X_i)X_i = \sum_{i=1}^{d} Df(p)[X_i]X_i\]

where the last equality is due to the definition of the gradient as the Riesz representer of the differential.

If the backend is a forward (or backward) finite difference, the coefficients in this sum can be approximated as

\[Df(p)[Y] ≈ \frac{1}{h}\bigl( f(\exp_p(hY)) - f(p) \bigr)\]

Writing $p=\exp_p(0)$, we see that this is a finite difference of $f\circ\exp_p$, i.e. of a function on the tangent space, so we can also use other (Euclidean) backends.
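
To illustrate the forward-difference approximation numerically, here is a sketch on the sphere; the step size h and the test function are arbitrary choices:

using Manifolds

M = Sphere(2)
f(p) = p[1]
p = [0.0, 0.0, 1.0]
Y = [1.0, 0.0, 0.0]                       # a unit tangent vector at p
h = 1e-6
(f(exp(M, p, h * Y)) - f(p)) / h          # ≈ 1.0 ≈ Df(p)[Y], the Y-coefficient of grad f(p)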

source