A manifold objective
The objective describes the actual cost function and all its properties.
Manopt.AbstractManifoldObjective — Type

    AbstractManifoldObjective{E<:AbstractEvaluationType}

Describe the collection of the optimization function $f: \mathcal M → ℝ$ (or even a vectorial range) and its corresponding elements, which might for example be a gradient or (one or more) proximal maps.
All these elements should usually be implemented as functions (M, p) -> ... or (M, X, p) -> ..., that is
- the first argument of these functions should be the manifold M they are defined on,
- the argument X is present if the computation is performed in-place of X (see InplaceEvaluation),
- the argument p is the point the function ($f$ or one of its elements) is evaluated at.
The type E indicates the global AbstractEvaluationType.
Manopt.AbstractDecoratedManifoldObjective — Type

    AbstractDecoratedManifoldObjective{E<:AbstractEvaluationType,O<:AbstractManifoldObjective}

A common supertype for all decorators of AbstractManifoldObjectives to simplify dispatch. The second parameter should refer to the undecorated objective (the innermost one).
An objective has two main possibilities for the evaluation mode of the functions it contains, not necessarily for the cost, but for example for the gradient in an AbstractManifoldGradientObjective.
Manopt.AbstractEvaluationType — Type

    AbstractEvaluationType

An abstract type to specify the kind of evaluation an AbstractManifoldObjective supports.
Manopt.AllocatingEvaluation — Type

    AllocatingEvaluation <: AbstractEvaluationType

A parameter for an AbstractManoptProblem indicating that the problem uses functions that allocate memory for their result; they work out-of-place.
Manopt.InplaceEvaluation — Type

    InplaceEvaluation <: AbstractEvaluationType

A parameter for an AbstractManoptProblem indicating that the problem uses functions that do not allocate memory but work on their input, in-place.
Manopt.evaluation_type — Function

    evaluation_type(mp::AbstractManoptProblem)

Get the AbstractEvaluationType of the objective in the AbstractManoptProblem mp.

    evaluation_type(::AbstractManifoldObjective{Teval})

Get the AbstractEvaluationType of the objective.
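For illustration, here is a minimal sketch of providing the same gradient in both evaluation modes, assuming Manifolds.jl is loaded and using its Sphere as the example manifold (the cost, the point q, and all names are illustrative only; the objective type used is the ManifoldGradientObjective described further below):

```julia
using Manopt, Manifolds

M = Sphere(2)                      # example manifold from Manifolds.jl
q = [0.0, 0.0, 1.0]                # a (hypothetical) reference point
f(M, p) = distance(M, p, q)^2 / 2  # cost f: M → ℝ

# AllocatingEvaluation: (M, p) -> X allocates the result
grad_f(M, p) = -log(M, p, q)

# InplaceEvaluation: (M, X, p) -> X writes into X
function grad_f!(M, X, p)
    log!(M, X, p, q)
    X .*= -1
    return X
end

obj_alloc = ManifoldGradientObjective(f, grad_f)
obj_inplace = ManifoldGradientObjective(f, grad_f!; evaluation=InplaceEvaluation())
```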
Decorators for objectives
An objective can be decorated using the following trait and function to initialize the decorators.
Manopt.dispatch_objective_decorator — Function

    dispatch_objective_decorator(o::AbstractManifoldObjective)

Indicate internally whether an AbstractManifoldObjective o is of decorating type, that is, whether it stores (encapsulates) an object in itself, by default in the field o.objective.
Decorators indicate this by returning Val{true} for further dispatch.
The default is Val{false}, so by default an objective is not decorated.
Manopt.is_objective_decorator — Function

    is_objective_decorator(s::AbstractManifoldObjective)

Indicate whether an AbstractManifoldObjective s is of decorator type.
Manopt.decorate_objective! — Function

    decorate_objective!(M, o::AbstractManifoldObjective)

Decorate the AbstractManifoldObjective o with specific decorators.
Optional arguments
Optional arguments provide necessary details on the decorators. A specific one is used to activate certain decorators.
- cache: (missing) specify a cache. Currently :Simple is supported, and :LRU if you load LRUCache.jl. In the latter case a tuple specifying what to cache and how many values to keep can be provided; for example (:LRU, [:Cost, :Gradient], 10) states that the last 10 used cost function evaluations and gradient evaluations should be stored. See objective_cache_factory for details.
- count: (missing) specify calls to the objective to be counted, see ManifoldCountObjective for the full list.
- objective_type: (:Riemannian) specify that an objective is :Riemannian or :Euclidean. The :Euclidean symbol is equivalent to specifying it as :Embedded, since in the end both refer to converting an objective from the embedding (whether it is Euclidean or not) to the Riemannian one.
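As a sketch of how these keywords combine (assuming LRUCache.jl is loaded for the :LRU cache, and reusing M and obj_alloc from the sketch above):

```julia
using LRUCache  # enables the :LRU cache backend

# cache the last 10 cost and gradient values and count all cost/gradient calls
deco = decorate_objective!(M, obj_alloc;
    cache=(:LRU, [:Cost, :Gradient], 10),
    count=[:Cost, :Gradient],
)
```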
Embedded objectives
Manopt.EmbeddedManifoldObjective — Type

    EmbeddedManifoldObjective{P, T, E, O2, O1<:AbstractManifoldObjective{E}} <:
       AbstractDecoratedManifoldObjective{O2, O1}

Declare an objective to be defined in the embedding. This also declares the gradient to be defined in the embedding, and especially declares it to be the Riesz representer with respect to the metric in the embedding. The types can be used to still dispatch on the undecorated objective type O2.
Fields
- objective: the objective that is defined in the embedding
- p: (nothing) a point in the embedding.
- X: (nothing) a tangent vector in the embedding
When a point in the embedding p is provided, embed! is used in place of this point to reduce memory allocations. Similarly X is used when embedding tangent vectors.
Cache objective
Since single function calls, for example to the cost or the gradient, might be expensive, a simple cache objective exists as a decorator that caches one cost value or gradient.
It can be activated/used with the cache= keyword argument available for every solver.
Manopt.reset_counters! — Function

    reset_counters!(co::ManifoldCountObjective, value::Integer=0)

Reset all values in the count objective to value.
Manopt.objective_cache_factory — Function

    objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Symbol)

Generate a cached variant of the AbstractManifoldObjective o on the AbstractManifold M based on the symbol cache.
The following caches are available
- :Simple generates a SimpleManifoldCachedObjective
- :LRU generates a ManifoldCachedObjective, where you should use the form (:LRU, [:Cost, :Gradient]) to specify what should be cached or (:LRU, [:Cost, :Gradient], 100) to also specify the cache size. This variant defaults to (:LRU, [:Cost, :Gradient], 100), caching up to 100 cost and gradient values.[1]

    objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Tuple{Symbol, Array, Array})
    objective_cache_factory(M::AbstractManifold, o::AbstractManifoldObjective, cache::Tuple{Symbol, Array})

Generate a cached variant of the AbstractManifoldObjective o on the AbstractManifold M based on the symbol cache[1], where the second element cache[2] contains further arguments for the cache and the optional third is passed down as keyword arguments.
For all available caches see the simpler variant with symbols.
A simple cache
A first generic cache is always available, but it only caches one gradient and one cost function evaluation (for the same point).
Manopt.SimpleManifoldCachedObjective — Type

    SimpleManifoldCachedObjective{O<:AbstractManifoldGradientObjective{E,TC,TG}, P, T, C} <: AbstractManifoldGradientObjective{E,TC,TG}

Provide a simple cache for an AbstractManifoldGradientObjective, that is, for a given point p this cache stores the point p and a gradient $\operatorname{grad} f(p)$ in X as well as a cost value $f(p)$ in c.
Both X and c are accompanied by booleans to keep track of their validity.
Constructor

    SimpleManifoldCachedObjective(M::AbstractManifold, obj::AbstractManifoldGradientObjective; kwargs...)

Keyword arguments
- p: (rand(M)) a point on the manifold to initialize the cache with
- X: (get_gradient(M, obj, p) or zero_vector(M, p)) a tangent vector to store the gradient in, see also initialized
- c: (get_cost(M, obj, p) or 0.0) a value to store the cost function in, see also initialized
- initialized: (true) whether to initialize the cached X and c or not.
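A short sketch of this cache in action (reusing M and obj_alloc from the sketches above):

```julia
sco = objective_cache_factory(M, obj_alloc, :Simple)

p = rand(M)
get_cost(M, sco, p)      # evaluates f and stores the value for p
get_cost(M, sco, p)      # same point: the cached value is returned
get_gradient(M, sco, p)  # the gradient at p is cached analogously
```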
A generic cache
For the more advanced cache, you need to implement some type of cache yourself that provides a get!, and implement init_caches for it. Such an implementation is provided if you load LRUCache.jl. Then you obtain
Manopt.ManifoldCachedObjective — Type

    ManifoldCachedObjective{E,P,O<:AbstractManifoldObjective{<:E},C<:NamedTuple{}} <: AbstractDecoratedManifoldObjective{E,P}

Create a cache for an objective, based on a NamedTuple that stores some kind of cache.
Constructor

    ManifoldCachedObjective(M, o::AbstractManifoldObjective, caches::Vector{Symbol}; kwargs...)

Create a cache for the AbstractManifoldObjective where the Symbols in caches indicate which function evaluations to cache.
Supported symbols

| Symbol | Caches calls to (incl. ! variants) | Comment |
|---|---|---|
| :Constraints | get_constraints | vector of numbers |
| :Cost | get_cost | |
| :EqualityConstraint | get_equality_constraint | numbers per (p, i) |
| :EqualityConstraints | get_equality_constraints | vector of numbers |
| :GradEqualityConstraint | get_grad_equality_constraint | tangent vector per (p, i) |
| :GradEqualityConstraints | get_grad_equality_constraints | vector of tangent vectors |
| :GradInequalityConstraint | get_grad_inequality_constraint | tangent vector per (p, i) |
| :GradInequalityConstraints | get_grad_inequality_constraints | vector of tangent vectors |
| :Gradient | get_gradient(M, p) | tangent vectors |
| :Hessian | get_hessian | tangent vectors |
| :InequalityConstraint | get_inequality_constraint | numbers per (p, j) |
| :InequalityConstraints | get_inequality_constraints | vector of numbers |
| :Preconditioner | get_preconditioner | tangent vectors |
| :ProximalMap | get_proximal_map | point per (p, λ, i) |
| :StochasticGradients | get_gradients | vector of tangent vectors |
| :StochasticGradient | get_gradient(M, p, i) | tangent vector per (p, i) |
| :SubGradient | get_subgradient | tangent vectors |
| :SubtrahendGradient | get_subtrahend_gradient | tangent vectors |

Keyword arguments
- p: (rand(M)) the type of the keys to be used in the caches. Defaults to the default representation on M.
- value: (get_cost(M, objective, p)) the type of values for numeric values in the cache
- X: (zero_vector(M, p)) the type of values to be cached for gradient and Hessian calls.
- cache: ([:Cost]) a vector of symbols indicating which function calls should be cached.
- cache_size: (10) number of (least recently used) calls to cache
- cache_sizes: (Dict{Symbol,Int}()) a named tuple or dictionary specifying the sizes individually for each cache.
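For instance, one might cache the last 25 cost and gradient values (a sketch, assuming LRUCache.jl is loaded and reusing M and obj_alloc from above):

```julia
using LRUCache

mco = objective_cache_factory(M, obj_alloc, (:LRU, [:Cost, :Gradient], 25))
p = rand(M)
get_gradient(M, mco, p)  # first call evaluates and stores
get_gradient(M, mco, p)  # second call at p is answered from the cache
```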
Manopt.init_caches — Function

    init_caches(caches, T::Type{LRU}; kwargs...)

Given a vector of symbols caches, this function sets up the NamedTuple of caches, where T is the type of cache to use.
Keyword arguments
- p: (rand(M)) a point on a manifold, to both infer its type for keys and initialize caches
- value: (0.0) a value for both typing and initialising number caches; the default is for (Float) values like the cost.
- X: (zero_vector(M, p)) a tangent vector at p to both type and initialize tangent vector caches
- cache_size: (10) a default cache size to use
- cache_sizes: (Dict{Symbol,Int}()) a dictionary of sizes for the caches to specify different (non-default) sizes

    init_caches(M::AbstractManifold, caches, T; kwargs...)

Given a vector of symbols caches, this function sets up the NamedTuple of caches for points/vectors on M, where T is the type of cache to use.
Count objective
Manopt.ManifoldCountObjective — Type

    ManifoldCountObjective{E,P,O<:AbstractManifoldObjective,I<:Integer} <: AbstractDecoratedManifoldObjective{E,P}

A wrapper for any AbstractManifoldObjective of type O to count different calls to parts of the objective.
Fields
- counts: a dictionary of symbols mapping to integers keeping the counted values
- objective: the wrapped objective
Supported symbols

| Symbol | Counts calls to (incl. ! variants) | Comment |
|---|---|---|
| :Constraints | get_constraints | |
| :Cost | get_cost | |
| :EqualityConstraint | get_equality_constraint | requires vector of counters |
| :EqualityConstraints | get_equality_constraints | does not count single access |
| :GradEqualityConstraint | get_grad_equality_constraint | requires vector of counters |
| :GradEqualityConstraints | get_grad_equality_constraints | does not count single access |
| :GradInequalityConstraint | get_grad_inequality_constraint | requires vector of counters |
| :GradInequalityConstraints | get_grad_inequality_constraints | does not count single access |
| :Gradient | get_gradient(M, p) | |
| :Hessian | get_hessian | |
| :InequalityConstraint | get_inequality_constraint | requires vector of counters |
| :InequalityConstraints | get_inequality_constraints | does not count single access |
| :Preconditioner | get_preconditioner | |
| :ProximalMap | get_proximal_map | |
| :StochasticGradients | get_gradients | |
| :StochasticGradient | get_gradient(M, p, i) | |
| :SubGradient | get_subgradient | |
| :SubtrahendGradient | get_subtrahend_gradient | |

Constructors

    ManifoldCountObjective(objective::AbstractManifoldObjective, counts::Dict{Symbol, <:Integer})

Initialise the ManifoldCountObjective to wrap objective, initializing the set of counts.

    ManifoldCountObjective(M::AbstractManifold, objective::AbstractManifoldObjective, count::AbstractVector{Symbol}, init=0)

Count function calls on objective using the symbols in count, initialising all entries to init.
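A sketch of counting calls (reusing M and obj_alloc from above; get_count is the accessor Manopt provides for this wrapper):

```julia
cobj = ManifoldCountObjective(M, obj_alloc, [:Cost, :Gradient])

p = rand(M)
get_cost(M, cobj, p)
get_gradient(M, cobj, p)
get_count(cobj, :Cost)      # -> 1
get_count(cobj, :Gradient)  # -> 1
```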
Internal decorators
Manopt.ReturnManifoldObjective — Type

    ReturnManifoldObjective{E,O2,O1<:AbstractManifoldObjective{E}} <:
       AbstractDecoratedManifoldObjective{E,O2}

A wrapper to indicate that get_solver_result should return the inner objective.
The types are such that one can still dispatch on the undecorated type O2 of the original objective as well.
Specific objective types and their access functions
Cost objective
Manopt.AbstractManifoldCostObjective — Type

    AbstractManifoldCostObjective{T<:AbstractEvaluationType} <: AbstractManifoldObjective{T}

Representing objectives on manifolds with a cost function implemented.
Manopt.ManifoldCostObjective — Type

    ManifoldCostObjective{T, TC} <: AbstractManifoldCostObjective{T, TC}

Specify an AbstractManifoldObjective that only has information about the cost function $f: \mathcal M → ℝ$, implemented as a function (M, p) -> c to compute the cost value c at p on the manifold M.
Fields
- cost: a function $f: \mathcal M → ℝ$ to minimize
Constructors

    ManifoldCostObjective(f)

Generate a problem. While this problem does not have any allocating functions, the type T can be set for consistency reasons with other problems.
Access functions
Manopt.get_cost — Function

    get_cost(amp::AbstractManoptProblem, p)

Evaluate the cost function f stored within the AbstractManifoldObjective of an AbstractManoptProblem amp at the point p.

    get_cost(M::AbstractManifold, obj::AbstractManifoldObjective, p)

Evaluate the cost function f defined on M stored within the AbstractManifoldObjective at the point p.

    get_cost(M::AbstractManifold, mco::AbstractManifoldCostObjective, p)

Evaluate the cost function from within the AbstractManifoldCostObjective on M at p.
By default this implementation assumes that the cost is stored within mco.cost.

    get_cost(TpM, trmo::TrustRegionModelObjective, X)

Evaluate the tangent space TrustRegionModelObjective

\[m(X) = f(p) + ⟨\operatorname{grad} f(p), X⟩_p + \frac{1}{2} ⟨\operatorname{Hess} f(p)[X], X⟩_p.\]

    get_cost(TpM, trmo::AdaptiveRagularizationWithCubicsModelObjective, X)

Evaluate the tangent space AdaptiveRagularizationWithCubicsModelObjective

\[m(X) = f(p) + ⟨\operatorname{grad} f(p), X⟩_p + \frac{1}{2} ⟨\operatorname{Hess} f(p)[X], X⟩_p + \frac{σ}{3} \lVert X \rVert^3,\]

at X, cf. Eq. (33) in [ABBC20].

    get_cost(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p, i)

Evaluate the i-th summand of the cost.
If you use a single function for the stochastic cost, then only the index i=1 is available to evaluate the whole cost.

    get_cost(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)

Evaluate the cost function of an objective defined in the embedding by first embedding p before calling the cost function stored in the EmbeddedManifoldObjective.
and internally
Manopt.get_cost_function — Function

    get_cost_function(amco::AbstractManifoldCostObjective)

Return the function to evaluate (just) the cost $f(p)=c$ as a function (M, p) -> c.
Gradient objectives
Manopt.AbstractManifoldGradientObjective — Type

    AbstractManifoldGradientObjective{E<:AbstractEvaluationType, TC, TG} <: AbstractManifoldCostObjective{E, TC}

An abstract type for all objectives that provide a (full) gradient, where E is an AbstractEvaluationType for the gradient function.
Manopt.ManifoldGradientObjective — Type

    ManifoldGradientObjective{T<:AbstractEvaluationType} <: AbstractManifoldGradientObjective{T}

Specify an objective containing a cost and its gradient.
Fields
- cost: a function $f: \mathcal M → ℝ$
- gradient!!: the gradient $\operatorname{grad}f: \mathcal M → \mathcal T\mathcal M$ of the cost function $f$.
Depending on the AbstractEvaluationType T the gradient can have two forms
- as a function (M, p) -> X that allocates memory for X, an AllocatingEvaluation
- as a function (M, X, p) -> X that works in place of X, an InplaceEvaluation
Constructors

    ManifoldGradientObjective(cost, gradient; evaluation=AllocatingEvaluation())
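The high-level solver interfaces construct such an objective internally; a sketch reusing f and grad_f! from the first example:

```julia
# gradient_descent wraps f and grad_f! in a ManifoldGradientObjective internally
p_opt = gradient_descent(M, f, grad_f!, rand(M); evaluation=InplaceEvaluation())
```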
Manopt.ManifoldAlternatingGradientObjective — Type

    ManifoldAlternatingGradientObjective{E<:AbstractEvaluationType,TCost,TGradient} <: AbstractManifoldGradientObjective{E}

An alternating gradient objective consists of
- a cost function $F(x)$
- a gradient $\operatorname{grad}F$ that is either
  - given as one function $\operatorname{grad}F$ returning a tangent vector X on M, or
  - given as an array of gradient functions $\operatorname{grad}F_i$, $i=1,…,n$, each returning a component of the gradient.
This objective is usually defined using the ProductManifold from Manifolds.jl, so Manifolds.jl needs to be loaded.
Constructors

    ManifoldAlternatingGradientObjective(F, gradF::Function;
        evaluation=AllocatingEvaluation()
    )
    ManifoldAlternatingGradientObjective(F, gradF::AbstractVector{<:Function};
        evaluation=AllocatingEvaluation()
    )

Create an alternating gradient problem with an optional cost and the gradient either as one function (returning an array) or a vector of functions.
Manopt.ManifoldStochasticGradientObjective — Type

    ManifoldStochasticGradientObjective{T<:AbstractEvaluationType} <: AbstractManifoldGradientObjective{T}

A stochastic gradient objective consists of
- a(n optional) cost function $f(p) = \displaystyle\sum_{i=1}^n f_i(p)$
- an array of gradients, $\operatorname{grad}f_i(p), i=1,\ldots,n$, which can be given in two forms
  - as one single function $(\mathcal M, p) ↦ (X_1,…,X_n) ∈ (T_p\mathcal M)^n$
  - as a vector of functions $\bigl( (\mathcal M, p) ↦ X_1, …, (\mathcal M, p) ↦ X_n\bigr)$.
Both variants can also be provided as InplaceEvaluation functions (M, X, p) -> X, where X is the vector of X1,...,Xn, and (M, X1, p) -> X1, ..., (M, Xn, p) -> Xn, respectively.
Constructors

    ManifoldStochasticGradientObjective(
        grad_f::Function;
        cost=Missing(),
        evaluation=AllocatingEvaluation()
    )
    ManifoldStochasticGradientObjective(
        grad_f::AbstractVector{<:Function};
        cost=Missing(), evaluation=AllocatingEvaluation()
    )

Create a stochastic gradient problem with the gradient either as one function (returning an array of tangent vectors) or a vector of functions (each returning one tangent vector).
The optional cost can also be given as either a single function (returning a number) or a vector of functions, each returning a value.
Used with
stochastic_gradient_descent
Note that this can also be used with gradient_descent, since the (complete) gradient is just the sum of the single gradients.
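A sketch of the vector-of-functions form (reusing M from above; qs are hypothetical data points, and each summand gradient points toward one of them):

```julia
qs = [rand(M) for _ in 1:3]                       # hypothetical data points
grad_fis = [(M, p) -> -log(M, p, q) for q in qs]  # one gradient per summand
p_sgd = stochastic_gradient_descent(M, grad_fis, rand(M))
```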
Manopt.NonlinearLeastSquaresObjective — Type

    NonlinearLeastSquaresObjective{T<:AbstractEvaluationType} <: AbstractManifoldObjective{T}

A type for nonlinear least squares problems. T is an AbstractEvaluationType for the F and Jacobian functions.
Specify a nonlinear least squares problem
Fields
- f: a function $f: \mathcal M → ℝ^d$ to minimize
- jacobian!!: Jacobian of the function $f$
- jacobian_tangent_basis: the basis of tangent space used for computing the Jacobian.
- num_components: number of values returned by f (equal to d).
Depending on the AbstractEvaluationType T the function $F$ has to be provided:
- as a function (M::AbstractManifold, p) -> v that allocates memory for v itself for an AllocatingEvaluation,
- as a function (M::AbstractManifold, v, p) -> v that works in place of v for an InplaceEvaluation.
Also the Jacobian jacF!! is required:
- as a function (M::AbstractManifold, p; basis_domain::AbstractBasis) -> v that allocates memory for v itself for an AllocatingEvaluation,
- as a function (M::AbstractManifold, v, p; basis_domain::AbstractBasis) -> v that works in place of v for an InplaceEvaluation.
Constructors

    NonlinearLeastSquaresObjective(M, F, jacF, num_components; evaluation=AllocatingEvaluation(), jacobian_tangent_basis=DefaultOrthonormalBasis())

See also
LevenbergMarquardt
There is also a second variant, if just one function is responsible for computing both the cost and the gradient.
Manopt.ManifoldCostGradientObjective — Type

    ManifoldCostGradientObjective{T} <: AbstractManifoldObjective{T}

Specify an objective containing one function to perform a combined computation of the cost and its gradient.
Fields
- costgrad!!: a function that computes both the cost $f: \mathcal M → ℝ$ and its gradient $\operatorname{grad}f: \mathcal M → \mathcal T\mathcal M$
Depending on the AbstractEvaluationType T the gradient can have two forms
- as a function (M, p) -> (c, X) that allocates memory for the gradient X, an AllocatingEvaluation
- as a function (M, X, p) -> (c, X) that works in place of X, an InplaceEvaluation
Constructors

    ManifoldCostGradientObjective(costgrad; evaluation=AllocatingEvaluation())
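A sketch of such a combined function (reusing M and q from the first example); computing cost and gradient together lets them share the logarithmic map:

```julia
function f_costgrad(M, p)
    X = log(M, p, q)                  # shared intermediate result
    return (norm(M, p, X)^2 / 2, -X)  # cost d(p,q)^2/2 and its gradient
end
cgo = ManifoldCostGradientObjective(f_costgrad)
```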
Access functions
Manopt.get_gradient — Function

    X = get_gradient(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)
    get_gradient!(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, X, p)

Evaluate all summands' gradients at a point p on the ProductManifold M (in place of X).

    X = get_gradient(M::AbstractManifold, ago::ManifoldAlternatingGradientObjective, p, k)
    get_gradient!(M::AbstractManifold, ago::ManifoldAlternatingGradientObjective, X, p, k)

Evaluate one of the component gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at p (in place of X).

    get_gradient(s::AbstractManoptSolverState)

Return the (last stored) gradient within AbstractManoptSolverState s. By default this also undecorates the state beforehand.

    get_gradient(amp::AbstractManoptProblem, p)
    get_gradient!(amp::AbstractManoptProblem, X, p)

Evaluate the gradient of an AbstractManoptProblem amp at the point p.
The evaluation is done in place of X for the !-variant.

    get_gradient(M::AbstractManifold, mgo::AbstractManifoldGradientObjective{T}, p)
    get_gradient!(M::AbstractManifold, X, mgo::AbstractManifoldGradientObjective{T}, p)

Evaluate the gradient of an AbstractManifoldGradientObjective{T} mgo at p.
The evaluation is done in place of X for the !-variant. The T=AllocatingEvaluation problem might still allocate memory within. When the non-mutating variant is called with a T=InplaceEvaluation, memory for the result is allocated.
Note that the order of parameters follows the philosophy of Manifolds.jl, namely that even for the mutating variant, the manifold is the first parameter and the (in-place) tangent vector X comes second.

    get_gradient(agst::AbstractGradientSolverState)

Return the gradient stored within gradient options. The default returns agst.X.

    get_gradient(TpM, trmo::TrustRegionModelObjective, X)

Evaluate the gradient of the TrustRegionModelObjective

\[\operatorname{grad} m(X) = \operatorname{grad} f(p) + \operatorname{Hess} f(p)[X].\]

    get_gradient(TpM, trmo::AdaptiveRagularizationWithCubicsModelObjective, X)

Evaluate the gradient of the AdaptiveRagularizationWithCubicsModelObjective

\[\operatorname{grad} m(X) = \operatorname{grad} f(p) + \operatorname{Hess} f(p)[X] + σ\lVert X \rVert X,\]

at X, cf. Eq. (37) in [ABBC20].

    get_gradient(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p, k)
    get_gradient!(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, Y, p, k)

Evaluate one of the summands' gradients $\operatorname{grad}f_k$, $k∈\{1,…,n\}$, at p (in place of Y).
If you use a single function for the stochastic gradient that works in-place, then get_gradient is not available, since the length (or number of elements of the gradient required for allocation) can not be determined.

    get_gradient(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p)
    get_gradient!(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, X, p)

Evaluate the complete gradient $\operatorname{grad} f = \displaystyle\sum_{i=1}^n \operatorname{grad} f_i(p)$ at p (in place of X).
If you use a single function for the stochastic gradient that works in-place, then get_gradient is not available, since the length (or number of elements of the gradient required for allocation) can not be determined.

    get_gradient(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)
    get_gradient!(M::AbstractManifold, X, emo::EmbeddedManifoldObjective, p)

Evaluate the gradient function of an objective defined in the embedding, that is embed p before calling the gradient function stored in the EmbeddedManifoldObjective.
The returned gradient is then converted to a Riemannian gradient by calling riemannian_gradient.
Manopt.get_gradients — Function

    get_gradients(M::AbstractManifold, sgo::ManifoldStochasticGradientObjective, p)
    get_gradients!(M::AbstractManifold, X, sgo::ManifoldStochasticGradientObjective, p)

Evaluate all summands' gradients $\{\operatorname{grad}f_i\}_{i=1}^n$ at p (in place of X).
If you use a single function for the stochastic gradient that works in-place, then get_gradients is not available, since the length (or number of elements of the gradient) can not be determined.
and internally
Manopt.get_gradient_function — Function

    get_gradient_function(amgo::AbstractManifoldGradientObjective, recursive=false)

Return the function to evaluate (just) the gradient $\operatorname{grad} f(p)$, where either the gradient function using the decorator or the one without the decorator is used.
By default recursive is set to false, since usually, to just pass the gradient function somewhere, one still wants for example the cached one or the one that still counts calls.
Depending on the AbstractEvaluationType E this is a function
- (M, p) -> X for the AllocatingEvaluation case
- (M, X, p) -> X for the InplaceEvaluation, working in-place of X.
Internal helpers
Manopt.get_gradient_from_Jacobian! — Function

    get_gradient_from_Jacobian!(
        M::AbstractManifold,
        X,
        nlso::NonlinearLeastSquaresObjective{InplaceEvaluation},
        p,
        Jval=zeros(nlso.num_components, manifold_dimension(M)),
    )

Compute the gradient of the NonlinearLeastSquaresObjective nlso at point p in place of X, with the temporary Jacobian stored in the optional argument Jval.
Subgradient objective
Manopt.ManifoldSubgradientObjective — Type

    ManifoldSubgradientObjective{T<:AbstractEvaluationType,C,S} <: AbstractManifoldCostObjective{T, C}

A structure to store information about an objective for a subgradient based optimization problem.
Fields
- cost: the function $f$ to be minimized
- subgradient: a function returning a subgradient $∂f$ of $f$
Constructor

    ManifoldSubgradientObjective(f, ∂f)

Generate the ManifoldSubgradientObjective for a subgradient objective, consisting of a (cost) function f(M, p) and a function ∂f(M, p) that returns a not necessarily deterministic element from the subdifferential at p on a manifold M.
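A sketch of a subgradient objective (reusing M and q): the distance function p ↦ d(p, q) is not differentiable at q, where the zero vector is a valid subgradient:

```julia
f_sg(M, p) = distance(M, p, q)
function ∂f_sg(M, p)
    d = distance(M, p, q)
    iszero(d) && return zero_vector(M, p)  # 0 ∈ ∂f(q)
    return -log(M, p, q) / d               # the gradient where f is smooth
end
sgobj = ManifoldSubgradientObjective(f_sg, ∂f_sg)
```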
Access functions
Manopt.get_subgradient — Function

    get_subgradient(amp::AbstractManoptProblem, p)
    get_subgradient!(amp::AbstractManoptProblem, X, p)

Evaluate the subgradient of an AbstractManoptProblem amp at point p.
The evaluation is done in place of X for the !-variant. The result might not be deterministic; one element of the subdifferential is returned.

    X = get_subgradient(M::AbstractManifold, sgo::ManifoldSubgradientObjective, p)
    get_subgradient!(M::AbstractManifold, X, sgo::ManifoldSubgradientObjective, p)

Evaluate the (sub)gradient of a ManifoldSubgradientObjective sgo at the point p.
The evaluation is done in place of X for the !-variant. The result might not be deterministic; one element of the subdifferential is returned.
Proximal map objective
Manopt.ManifoldProximalMapObjective — Type

    ManifoldProximalMapObjective{E<:AbstractEvaluationType, TC, TP, V <: Vector{<:Integer}} <: AbstractManifoldCostObjective{E, TC}

Specify a problem for solvers based on the evaluation of proximal maps.
Fields
- cost: a function $F:\mathcal M → ℝ$ to minimize
- proxes: proximal maps $\operatorname{prox}_{λ\varphi}:\mathcal M → \mathcal M$ as functions (M, λ, p) -> q.
- number_of_proxes: (ones(length(proxes))) number of proximal maps per function; to specify when one of the maps is a combined one, such that the proximal map functions return more than one entry per function, you have to adapt this value. If not specified, it is set to one prox per function.
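A sketch of a closed-form proximal map (reusing M, q, and f from above): for f(p) = d(p, q)²/2 the proximal map moves along the geodesic towards q, namely prox_{λf}(p) = γ(λ/(1+λ); p, q):

```julia
# closed-form proximal map of p -> distance(M, p, q)^2 / 2
prox_dist2(M, λ, p) = shortest_geodesic(M, p, q, λ / (1 + λ))

# for example usable with the cyclic proximal point solver
p_cpp = cyclic_proximal_point(M, f, [prox_dist2], rand(M))
```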
Access functions
Manopt.get_proximal_map — Function

    q = get_proximal_map(M::AbstractManifold, mpo::ManifoldProximalMapObjective, λ, p)
    get_proximal_map!(M::AbstractManifold, q, mpo::ManifoldProximalMapObjective, λ, p)
    q = get_proximal_map(M::AbstractManifold, mpo::ManifoldProximalMapObjective, λ, p, i)
    get_proximal_map!(M::AbstractManifold, q, mpo::ManifoldProximalMapObjective, λ, p, i)

Evaluate the (i-th) proximal map of the ManifoldProximalMapObjective mpo at the point p on M with parameter $λ>0$.
Hessian objective
Manopt.AbstractManifoldHessianObjective — Type

    AbstractManifoldHessianObjective{T<:AbstractEvaluationType,TC,TG,TH} <: AbstractManifoldGradientObjective{T,TC,TG}

An abstract type for all objectives that provide a (full) Hessian, where T is an AbstractEvaluationType for the gradient and Hessian functions.
Manopt.ManifoldHessianObjective — Type

    ManifoldHessianObjective{T<:AbstractEvaluationType,C,G,H,Pre} <: AbstractManifoldHessianObjective{T,C,G,H}

Specify a problem for Hessian based algorithms.
Fields
- cost: a function $f:\mathcal M → ℝ$ to minimize
- gradient: the gradient $\operatorname{grad}f:\mathcal M → \mathcal T\mathcal M$ of the cost function $f$
- hessian: the Hessian $\operatorname{Hess}f(x)[⋅]: \mathcal T_{x} \mathcal M → \mathcal T_{x} \mathcal M$ of the cost function $f$
- preconditioner: the symmetric, positive definite preconditioner as an approximation of the inverse of the Hessian of $f$, a map with the same input variables as the hessian, to numerically stabilize iterations when the Hessian is ill-conditioned
Depending on the AbstractEvaluationType T the gradient and Hessian can have two forms
- as functions (M, p) -> X and (M, p, X) -> Y, resp., an AllocatingEvaluation
- as functions (M, X, p) -> X and (M, Y, p, X) -> Y, resp., an InplaceEvaluation
Constructor

    ManifoldHessianObjective(f, grad_f, Hess_f, preconditioner = (M, p, X) -> X;
        evaluation=AllocatingEvaluation())
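A sketch of a Hessian objective for the Rayleigh quotient f(p) = pᵀAp/2 on M = Sphere(2) from above (A is a hypothetical symmetric matrix; project maps into the tangent space):

```julia
using LinearAlgebra

A = Symmetric(randn(3, 3))
f_rq(M, p) = p' * A * p / 2
grad_f_rq(M, p) = project(M, p, A * p)                        # P_p(Ap)
Hess_f_rq(M, p, X) = project(M, p, A * X) - (p' * A * p) * X  # P_p(AX) - (p'Ap)X
mho = ManifoldHessianObjective(f_rq, grad_f_rq, Hess_f_rq)

# for example usable with the trust regions solver
p_tr = trust_regions(M, f_rq, grad_f_rq, Hess_f_rq, rand(M))
```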
Access functions
Manopt.get_hessian — Function

    Y = get_hessian(amp::AbstractManoptProblem{T}, p, X)
    get_hessian!(amp::AbstractManoptProblem{T}, Y, p, X)

Evaluate the Hessian of an AbstractManoptProblem amp at p applied to a tangent vector X, computing $\operatorname{Hess}f(p)[X]$, which can also happen in-place of Y.

    get_hessian(TpM, trmo::TrustRegionModelObjective, X)

Evaluate the Hessian of the TrustRegionModelObjective

\[\operatorname{Hess} m(X)[Y] = \operatorname{Hess} f(p)[Y].\]

    get_hessian(M::AbstractManifold, emo::EmbeddedManifoldObjective, p, X)
    get_hessian!(M::AbstractManifold, Y, emo::EmbeddedManifoldObjective, p, X)

Evaluate the Hessian of an objective defined in the embedding, that is embed p and X before calling the Hessian function stored in the EmbeddedManifoldObjective.
The returned Hessian is then converted to a Riemannian Hessian by calling riemannian_Hessian.
Manopt.get_preconditioner — Function

    get_preconditioner(amp::AbstractManoptProblem, p, X)

Evaluate the symmetric, positive definite preconditioner (approximation of the inverse of the Hessian of the cost function f) of an AbstractManoptProblem amp's objective at the point p applied to a tangent vector X.

    get_preconditioner(M::AbstractManifold, mho::ManifoldHessianObjective, p, X)

Evaluate the symmetric, positive definite preconditioner (approximation of the inverse of the Hessian of the cost function f) of a ManifoldHessianObjective mho at the point p applied to a tangent vector X.
and internally
Manopt.get_hessian_function — Function

    get_hessian_function(amho::AbstractManifoldHessianObjective{E<:AbstractEvaluationType})

Return the function to evaluate (just) the Hessian $\operatorname{Hess} f(p)$. Depending on the AbstractEvaluationType E this is a function
- (M, p, X) -> Y for the AllocatingEvaluation case
- (M, Y, p, X) -> Y for the InplaceEvaluation, working in-place of Y.
Primal-dual based objectives
Manopt.AbstractPrimalDualManifoldObjective — Type

    AbstractPrimalDualManifoldObjective{E<:AbstractEvaluationType,C,P} <: AbstractManifoldCostObjective{E,C}

A common abstract supertype for objectives that consider primal-dual problems.
Manopt.PrimalDualManifoldObjective — Type

    PrimalDualManifoldObjective{T<:AbstractEvaluationType} <: AbstractPrimalDualManifoldObjective{T}

Describes an objective for the linearized or exact Chambolle-Pock algorithm, cf. [BHS+21], [CP11].
Fields
All fields with !! can either be in-place or allocating functions, which should be set depending on the evaluation= keyword in the constructor and stored in T <: AbstractEvaluationType.
- cost: $F + G(Λ(⋅))$ to evaluate interim cost function values
- linearized_forward_operator!!: linearized operator for the forward operation in the algorithm $DΛ$
- linearized_adjoint_operator!!: the adjoint differential $(DΛ)^* : \mathcal N → T\mathcal M$
- prox_f!!: the proximal map belonging to $f$
- prox_G_dual!!: the proximal map belonging to $g_n^*$
- Λ!!: (forward_operator) the forward operator (if given) $Λ: \mathcal M → \mathcal N$
Usually either the linearized operator $DΛ$ or $Λ$ is required.
Constructor

    PrimalDualManifoldObjective(cost, prox_f, prox_G_dual, adjoint_linearized_operator;
        linearized_forward_operator::Union{Function,Missing}=missing,
        Λ::Union{Function,Missing}=missing,
        evaluation::AbstractEvaluationType=AllocatingEvaluation()
    )

The last optional argument can be used to provide the 4 or 5 functions as allocating or mutating (in-place computation) ones. Note that the first argument is always the manifold under consideration, the mutated one is the second.
Manopt.PrimalDualManifoldSemismoothNewtonObjective — Type

    PrimalDualManifoldSemismoothNewtonObjective{E<:AbstractEvaluationType, TC, LO, ALO, PF, DPF, PG, DPG, L} <: AbstractPrimalDualManifoldObjective{E, TC, PF}

Describes a problem for the primal-dual Riemannian semismooth Newton algorithm. [DL21]
Fields
- cost: $F + G(Λ(⋅))$ to evaluate interim cost function values
- linearized_operator: the linearization $DΛ(⋅)[⋅]$ of the operator $Λ(⋅)$.
- linearized_adjoint_operator: the adjoint differential $(DΛ)^* : \mathcal N → T\mathcal M$
- prox_F: the proximal map belonging to $F$
- diff_prox_F: the (Clarke generalized) differential of the proximal maps of $F$
- prox_G_dual: the proximal map belonging to $g_n^*$
- diff_prox_dual_G: the (Clarke generalized) differential of the proximal maps of $G^\ast_n$
- Λ: the exact forward operator. This operator is required if Λ(m)=n does not hold.
Constructor

    PrimalDualManifoldSemismoothNewtonObjective(cost, prox_F, prox_G_dual, forward_operator, adjoint_linearized_operator, Λ)
Access functions
Manopt.adjoint_linearized_operator — Function

    X = adjoint_linearized_operator(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, m, n, Y)
    adjoint_linearized_operator!(N::AbstractManifold, X, apdmo::AbstractPrimalDualManifoldObjective, m, n, Y)

Evaluate the adjoint of the linearized forward operator $(DΛ(m))^*[Y]$ stored within the AbstractPrimalDualManifoldObjective (in place of X). Since $Y∈T_n\mathcal N$, both $m$ and $n=Λ(m)$ are necessary arguments, mainly because the forward operator $Λ$ might be missing in the objective.
Manopt.forward_operator — Function

    q = forward_operator(M::AbstractManifold, N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, p)
    forward_operator!(M::AbstractManifold, N::AbstractManifold, q, apdmo::AbstractPrimalDualManifoldObjective, p)

Evaluate the forward operator $Λ(p)$ stored within the TwoManifoldProblem (in place of q).
Manopt.get_differential_dual_prox — Function

    η = get_differential_dual_prox(N::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, n, τ, X, ξ)
    get_differential_dual_prox!(N::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, η, n, τ, X, ξ)

Evaluate the differential proximal map of $G_n^*$ stored within PrimalDualManifoldSemismoothNewtonObjective

\[D\operatorname{prox}_{τ G_n^*}(X)[ξ],\]

which can also be computed in place of η.
Manopt.get_differential_primal_prox — Function

    y = get_differential_primal_prox(M::AbstractManifold, pdsno::PrimalDualManifoldSemismoothNewtonObjective, σ, x)
    get_differential_primal_prox!(p::TwoManifoldProblem, y, σ, x)

Evaluate the differential proximal map of $F$ stored within AbstractPrimalDualManifoldObjective

\[D\operatorname{prox}_{σ F}(x)[X],\]

which can also be computed in place of y.
Manopt.get_dual_prox — Function

    Y = get_dual_prox(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, n, τ, X)
    get_dual_prox!(N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, Y, n, τ, X)

Evaluate the proximal map of $g_n^*$ stored within AbstractPrimalDualManifoldObjective

\[Y = \operatorname{prox}_{τ G_n^*}(X),\]

which can also be computed in place of Y.
Manopt.get_primal_prox — Function

    q = get_primal_prox(M::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, σ, p)
    get_primal_prox!(M::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, q, σ, p)

Evaluate the proximal map of $F$ stored within AbstractPrimalDualManifoldObjective

\[\operatorname{prox}_{σ F}(p),\]

which can also be computed in place of q.
Manopt.linearized_forward_operator — Function

    Y = linearized_forward_operator(M::AbstractManifold, N::AbstractManifold, apdmo::AbstractPrimalDualManifoldObjective, m, X, n)
    linearized_forward_operator!(M::AbstractManifold, N::AbstractManifold, Y, apdmo::AbstractPrimalDualManifoldObjective, m, X, n)

Evaluate the linearized operator (differential) $DΛ(m)[X]$ stored within the AbstractPrimalDualManifoldObjective (in place of Y), where $n = Λ(m)$.
Constrained objective
Besides the AbstractEvaluationType there is one further property to distinguish among constraint functions, especially the gradients of the constraints.
Manopt.ConstraintType — Type

    ConstraintType

An abstract type to represent different forms of representing constraints.
Manopt.FunctionConstraint — Type

    FunctionConstraint <: ConstraintType

A type to indicate that constraints are implemented as one whole function, for example $g(p) ∈ ℝ^m$.
Manopt.VectorConstraint — Type

    VectorConstraint <: ConstraintType

A type to indicate that constraints are implemented as a vector of functions, for example $g_i(p) ∈ ℝ, i=1,…,m$.
The ConstraintType is a parameter of the corresponding objective.
Manopt.ConstrainedManifoldObjective — Type

    ConstrainedManifoldObjective{T<:AbstractEvaluationType, C<:ConstraintType} <: AbstractManifoldObjective{T}

Describes the constrained objective

\[\begin{aligned} \operatorname*{arg\,min}_{p ∈ \mathcal{M}} & f(p)\\ \text{subject to } & g_i(p) \leq 0 \quad \text{ for all } i=1,…,m,\\ \quad & h_j(p) = 0 \quad \text{ for all } j=1,…,n. \end{aligned}\]

Fields
- cost: the cost $f$
- gradient!!: the gradient of the cost $f$
- g: the inequality constraints
- grad_g!!: the gradient of the inequality constraints
- h: the equality constraints
- grad_h!!: the gradient of the equality constraints
It consists of
- a cost function $f(p)$
- the gradient of $f$, $\operatorname{grad}f(p)$
- inequality constraints $g(p)$, either a function g returning a vector or a vector [g1, g2, ..., gm] of functions.
- equality constraints $h(p)$, either a function h returning a vector or a vector [h1, h2, ..., hn] of functions.
- gradients of the inequality constraints $\operatorname{grad}g(p) ∈ (T_p\mathcal M)^m$, either a function or a vector of functions.
- gradients of the equality constraints $\operatorname{grad}h(p) ∈ (T_p\mathcal M)^n$, either a function or a vector of functions.
There are two ways to specify the constraints $g$ and $h$:
- as one Function returning a vector in $ℝ^m$ and $ℝ^n$ respectively. This might be easier to implement but requires evaluating all constraints even if only one is needed.
- as an AbstractVector{<:Function} where each function returns a real number. This requires each constraint to be implemented as a single function, but it is possible to evaluate also only a single constraint.
The gradients $\operatorname{grad}g$, $\operatorname{grad}h$ have to follow the same form. Additionally they can be implemented as in-place functions or as allocating ones. The gradient $\operatorname{grad}F$ has to be of the same kind. This difference is indicated by the evaluation keyword.
Constructors

    ConstrainedManifoldObjective(f, grad_f, g, grad_g, h, grad_h;
        evaluation=AllocatingEvaluation()
    )

Where f, g, h describe the cost, inequality and equality constraints, respectively, as described previously, and grad_f, grad_g, grad_h are the corresponding gradient functions in one of the four formats. If the objective does not have inequality constraints, you can set G and gradG to nothing. If the problem does not have equality constraints, you can set H and gradH to nothing or leave them out.

    ConstrainedManifoldObjective(M::AbstractManifold, F, gradF;
        G=nothing, gradG=nothing, H=nothing, gradH=nothing,
        evaluation=AllocatingEvaluation()
    )

A keyword argument variant of the preceding constructor, where you can leave out either G and gradG or H and gradH, but not both pairs.
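A sketch of a constrained objective (reusing M, f_rq, and grad_f_rq from the Hessian sketch): a hypothetical nonnegativity constraint on the first coordinate, written as g₁(p) = -p₁ ≤ 0 in the vector-of-functions form:

```julia
g1(M, p) = -p[1]                                 # g_1(p) = -p_1 ≤ 0
grad_g1(M, p) = project(M, p, [-1.0, 0.0, 0.0])  # Riemannian gradient of g_1
co = ConstrainedManifoldObjective(M, f_rq, grad_f_rq; G=[g1], gradG=[grad_g1])
```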
Access functions
Manopt.get_constraints — Function

    get_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

Return the vector $(g_1(p),…,g_m(p),h_1(p),…,h_n(p))$ from the ConstrainedManifoldObjective co, containing the values of all constraints at p.

    get_constraints(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)

Return the vector $(g_1(p),…,g_m(p),h_1(p),…,h_n(p))$ defined in the embedding, that is embed p before calling the constraint functions stored in the EmbeddedManifoldObjective.
Manopt.get_equality_constraint — Function

    get_equality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, j)

Evaluate the j-th equality constraint $(h(p))_j$ or $h_j(p)$.
For the FunctionConstraint representation this still evaluates all constraints.

    get_equality_constraint(M::AbstractManifold, emo::EmbeddedManifoldObjective, p, j)

Evaluate the j-th equality constraint $h_j(p)$ defined in the embedding, that is embed p before calling the constraint functions stored in the EmbeddedManifoldObjective.
Manopt.get_equality_constraints — Function

    get_equality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

Evaluate all equality constraints $h(p)$, that is $\bigl(h_1(p), h_2(p),\ldots,h_n(p)\bigr)$, of the ConstrainedManifoldObjective co at p.

    get_equality_constraints(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)

Evaluate all equality constraints $h(p)$, that is $\bigl(h_1(p), h_2(p),\ldots,h_n(p)\bigr)$, defined in the embedding, that is embed p before calling the constraint functions stored in the EmbeddedManifoldObjective.
Manopt.get_inequality_constraint — Function

    get_inequality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, i)

Evaluate the i-th inequality constraint $(g(p))_i$ or $g_i(p)$.
For the FunctionConstraint representation this still evaluates all constraints.

    get_inequality_constraint(M::AbstractManifold, emo::EmbeddedManifoldObjective, p, i)

Evaluate the i-th inequality constraint $g_i(p)$ defined in the embedding, that is embed p before calling the constraint functions stored in the EmbeddedManifoldObjective.
Manopt.get_inequality_constraints — Function

    get_inequality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

Evaluate all inequality constraints $g(p)$, that is $\bigl(g_1(p), g_2(p),\ldots,g_m(p)\bigr)$, of the ConstrainedManifoldObjective co at p.

    get_inequality_constraints(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)

Evaluate all inequality constraints $g(p)$, that is $\bigl(g_1(p), g_2(p),\ldots,g_m(p)\bigr)$, defined in the embedding, that is embed p before calling the constraint functions stored in the EmbeddedManifoldObjective.
Manopt.get_grad_equality_constraint — Function

    get_grad_equality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, j)

Evaluate the gradient of the j-th equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints. It also allocates a full tangent vector.

    X = get_grad_equality_constraint(M::AbstractManifold, emo::EmbeddedManifoldObjective, p, j)
    get_grad_equality_constraint!(M::AbstractManifold, X, emo::EmbeddedManifoldObjective, p, j)

Evaluate the gradient of the j-th equality constraint $\operatorname{grad} h_j(p)$ defined in the embedding, that is embed p before calling the gradient function stored in the EmbeddedManifoldObjective.
The returned gradient is then converted to a Riemannian gradient by calling riemannian_gradient.
Manopt.get_grad_equality_constraints — Function

    get_grad_equality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

Evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$, that is $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p),\ldots,\operatorname{grad} h_n(p)\bigr)$, of the ConstrainedManifoldObjective co at p.
For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints.

    X = get_grad_equality_constraints(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)
    get_grad_equality_constraints!(M::AbstractManifold, X, emo::EmbeddedManifoldObjective, p)

Evaluate the gradients of the equality constraints $\operatorname{grad} h(p)$ defined in the embedding, that is embed p before calling the gradient function stored in the EmbeddedManifoldObjective.
The returned gradients are then converted to Riemannian gradients by calling riemannian_gradient.
Manopt.get_grad_equality_constraints! — Function

    get_grad_equality_constraints!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p)

Evaluate all gradients of the equality constraints $\operatorname{grad} h(p)$, that is $\bigl(\operatorname{grad} h_1(p), \operatorname{grad} h_2(p),\ldots,\operatorname{grad} h_n(p)\bigr)$, of the ConstrainedManifoldObjective co at p in place of X, which is a vector of n tangent vectors.
Manopt.get_grad_equality_constraint! — Function

    get_grad_equality_constraint!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p, j)

Evaluate the gradient of the j-th equality constraint $(\operatorname{grad} h(p))_j$ or $\operatorname{grad} h_j(p)$ in place of X.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation of the FunctionConstraint of the problem, this function currently also calls get_equality_constraints, since this is the only way to determine the number of constraints, and it allocates a full vector of tangent vectors.
Manopt.get_grad_inequality_constraint — Function

    get_grad_inequality_constraint(M::AbstractManifold, co::ConstrainedManifoldObjective, p, i)

Evaluate the gradient of the i-th inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient. For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.

    X = get_grad_inequality_constraint(M::AbstractManifold, emo::EmbeddedManifoldObjective, p, i)
    get_grad_inequality_constraint!(M::AbstractManifold, X, emo::EmbeddedManifoldObjective, p, i)

Evaluate the gradient of the i-th inequality constraint $\operatorname{grad} g_i(p)$ defined in the embedding, that is embed p before calling the gradient function stored in the EmbeddedManifoldObjective.
The returned gradient is then converted to a Riemannian gradient by calling riemannian_gradient.
Manopt.get_grad_inequality_constraint! — Function

    get_grad_inequality_constraint!(P, X, p, i)

Evaluate the gradient of the i-th inequality constraint $(\operatorname{grad} g(p))_i$ or $\operatorname{grad} g_i(p)$ of the ConstrainedManifoldObjective P in place of X.
For the FunctionConstraint variant of the problem, this function still evaluates the full gradient, that is all gradients of the inequality constraints $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),\ldots,\operatorname{grad} g_m(p)\bigr)$, in place of X, which is a vector of m tangent vectors. For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.
Manopt.get_grad_inequality_constraints — Function

    get_grad_inequality_constraints(M::AbstractManifold, co::ConstrainedManifoldObjective, p)

Evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$, that is $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),…,\operatorname{grad} g_m(p)\bigr)$, of the ConstrainedManifoldObjective co at p.
For the InplaceEvaluation and FunctionConstraint variant of the problem, this function currently also calls get_inequality_constraints, since this is the only way to determine the number of constraints.

    X = get_grad_inequality_constraints(M::AbstractManifold, emo::EmbeddedManifoldObjective, p)
    get_grad_inequality_constraints!(M::AbstractManifold, X, emo::EmbeddedManifoldObjective, p)

Evaluate the gradients of the inequality constraints $\operatorname{grad} g(p)$ defined in the embedding, that is embed p before calling the gradient function stored in the EmbeddedManifoldObjective.
The returned gradients are then converted to Riemannian gradients by calling riemannian_gradient.
Manopt.get_grad_inequality_constraints! — Function

    get_grad_inequality_constraints!(M::AbstractManifold, X, co::ConstrainedManifoldObjective, p)

Evaluate all gradients of the inequality constraints $\operatorname{grad} g(p)$, that is $\bigl(\operatorname{grad} g_1(p), \operatorname{grad} g_2(p),\ldots,\operatorname{grad} g_m(p)\bigr)$, of the ConstrainedManifoldObjective co at p in place of X, which is a vector of $m$ tangent vectors.
Subproblem objective
This objective can be used when the objective of a subproblem solver still needs access to the (outer/main) objective.
Manopt.AbstractManifoldSubObjective — Type

    AbstractManifoldSubObjective{O<:AbstractManifoldObjective} <: AbstractManifoldObjective

An abstract type for objectives of subproblems within a solver that still store the original objective internally, to generate generic objectives for sub solvers.
Access functions
Manopt.get_objective_cost — Function

    get_objective_cost(M, amso::AbstractManifoldSubObjective, p)

Evaluate the cost of the (original) objective stored within the sub objective.
Manopt.get_objective_gradient — Function

    X = get_objective_gradient(M, amso::AbstractManifoldSubObjective, p)
    get_objective_gradient!(M, X, amso::AbstractManifoldSubObjective, p)

Evaluate the gradient of the (original) objective stored within the sub objective amso.
Manopt.get_objective_hessian — Function

    Y = get_objective_Hessian(M, amso::AbstractManifoldSubObjective, p, X)
    get_objective_Hessian!(M, Y, amso::AbstractManifoldSubObjective, p, X)

Evaluate the Hessian of the (original) objective stored within the sub objective amso.
Manopt.get_objective_preconditioner — Function

    Y = get_objective_preconditioner(M, amso::AbstractManifoldSubObjective, p, X)
    get_objective_preconditioner!(M, Y, amso::AbstractManifoldSubObjective, p, X)

Evaluate the preconditioner of the (original) objective stored within the sub objective amso.
[1] This cache requires LRUCache.jl to be loaded as well.