Alternating gradient descent
Manopt.alternating_gradient_descent — Function

alternating_gradient_descent(M::ProductManifold, f, grad_f, p=rand(M))
alternating_gradient_descent(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)
alternating_gradient_descent!(M::ProductManifold, f, grad_f, p)
alternating_gradient_descent!(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)

Perform an alternating gradient descent. This can be done in-place of the start point p.
Input
- M::AbstractManifold: a Riemannian manifold $\mathcal M$
- f: a cost function $f: \mathcal M→ ℝ$ implemented as (M, p) -> v
- grad_f: a gradient, which can be given in one of two forms:
  - a single function returning an ArrayPartition from RecursiveArrayTools.jl, or
  - a vector of functions, each returning one component part of the whole gradient
- p: a point on the manifold $\mathcal M$
Keyword arguments
- evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating their result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
- evaluation_order=:Linear: whether to use a randomly permuted sequence (:FixedRandom), a per-cycle permuted sequence (:Random), or the default :Linear one.
- inner_iterations=5: how many gradient steps to take in a component before alternating to the next
- stopping_criterion=StopAfterIteration(1000): a functor indicating that the stopping criterion is fulfilled
- stepsize=ArmijoLinesearch(): a functor inheriting from Stepsize to determine a step size
- order=[1:n]: the initial permutation, where n is the number of gradients in grad_f.
- retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
Output
usually the obtained (approximate) minimizer, see get_solver_return for details
Manopt.alternating_gradient_descent! — Function

alternating_gradient_descent(M::ProductManifold, f, grad_f, p=rand(M))
alternating_gradient_descent(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)
alternating_gradient_descent!(M::ProductManifold, f, grad_f, p)
alternating_gradient_descent!(M::ProductManifold, ago::ManifoldAlternatingGradientObjective, p)

Perform an alternating gradient descent. This can be done in-place of the start point p.
Input
- M::AbstractManifold: a Riemannian manifold $\mathcal M$
- f: a cost function $f: \mathcal M→ ℝ$ implemented as (M, p) -> v
- grad_f: a gradient, which can be given in one of two forms:
  - a single function returning an ArrayPartition from RecursiveArrayTools.jl, or
  - a vector of functions, each returning one component part of the whole gradient
- p: a point on the manifold $\mathcal M$
Keyword arguments
- evaluation=AllocatingEvaluation(): specify whether the functions that return an array, for example a point or a tangent vector, work by allocating their result (AllocatingEvaluation) or whether they modify their input argument to return the result therein (InplaceEvaluation). Since usually the first argument is the manifold, the modified argument is the second.
- evaluation_order=:Linear: whether to use a randomly permuted sequence (:FixedRandom), a per-cycle permuted sequence (:Random), or the default :Linear one.
- inner_iterations=5: how many gradient steps to take in a component before alternating to the next
- stopping_criterion=StopAfterIteration(1000): a functor indicating that the stopping criterion is fulfilled
- stepsize=ArmijoLinesearch(): a functor inheriting from Stepsize to determine a step size
- order=[1:n]: the initial permutation, where n is the number of gradients in grad_f.
- retraction_method=default_retraction_method(M, typeof(p)): a retraction $\operatorname{retr}$ to use, see the section on retractions
Output
usually the obtained (approximate) minimizer, see get_solver_return for details
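For example (a hedged sketch, not taken from the Manopt.jl documentation; the manifold, cost, and reference points are assumptions for illustration), one can alternate between the two factors of a product of spheres, passing the gradient as a single ArrayPartition-valued function:

```julia
using Manopt, Manifolds, RecursiveArrayTools

# Minimize the sum of squared distances to one reference point per factor,
# alternating the gradient steps between the two components.
M = Sphere(2) × Sphere(2)
a, b = [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]

f(M, p) = distance(M[1], p[M, 1], a)^2 + distance(M[2], p[M, 2], b)^2

# grad_f given in the first of the two forms above: a single function
# returning an ArrayPartition with one tangent vector per component
grad_f(M, p) = ArrayPartition(
    -2 .* log(M[1], p[M, 1], a),
    -2 .* log(M[2], p[M, 2], b),
)

q = alternating_gradient_descent(M, f, grad_f, rand(M); inner_iterations=3)
```

Passing grad_f as a vector of component functions instead changes only how the gradient parts are evaluated; the solver call stays the same.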
State
Manopt.AlternatingGradientDescentState — Type

AlternatingGradientDescentState <: AbstractGradientDescentSolverState

Store the fields for an alternating gradient descent algorithm, see also alternating_gradient_descent.
Fields
- direction::DirectionUpdateRule
- evaluation_order::Symbol: whether to use a randomly permuted sequence (:FixedRandom), a per-cycle newly permuted sequence (:Random), or the default :Linear evaluation order.
- inner_iterations: how many gradient steps to take in a component before alternating to the next
- order: the current permutation
- retraction_method::AbstractRetractionMethod: a retraction $\operatorname{retr}$ to use, see the section on retractions
- stepsize::Stepsize: a functor inheriting from Stepsize to determine a step size
- stop::StoppingCriterion: a functor indicating that the stopping criterion is fulfilled
- p::P: a point on the manifold $\mathcal M$ storing the current iterate
- X::T: a tangent vector at the point $p$ on the manifold $\mathcal M$ storing the gradient at the current iterate
- k, i: internal counters for the outer and inner iterations, respectively.
Constructors
AlternatingGradientDescentState(M::AbstractManifold; kwargs...)

Keyword arguments
- inner_iterations=5
- p=rand(M): a point on the manifold $\mathcal M$
- order_type::Symbol=:Linear
- order::Vector{<:Int}=Int[]
- stopping_criterion=StopAfterIteration(1000): a functor indicating that the stopping criterion is fulfilled
- stepsize=default_stepsize(M, AlternatingGradientDescentState): a functor inheriting from Stepsize to determine a step size
- X=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$
Generate the options for the point p, where inner_iterations, order_type, order, retraction_method, stopping_criterion, and stepsize are keyword arguments.
Additionally, the options share a DirectionUpdateRule, which chooses the current component, so they can be decorated further; the innermost one should always be the AlternatingGradient rule described below.
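A minimal sketch of constructing this state directly (the concrete product manifold is an assumption; usually the state is created for you by alternating_gradient_descent):

```julia
using Manopt, Manifolds

M = Sphere(2) × Sphere(2)           # the solver state expects a product manifold
agds = AlternatingGradientDescentState(
    M;
    p=rand(M),                      # start point
    inner_iterations=3,             # gradient steps per component before switching
    order_type=:Random,             # re-permute the component order every cycle
)
```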
Manopt.AlternatingGradient — Function

AlternatingGradient(; kwargs...)
AlternatingGradient(M::AbstractManifold; kwargs...)

Specify that a gradient based method should only update parts of the gradient in order to do an alternating gradient descent.
Keyword arguments
- initial_gradient=zero_vector(M, p): a tangent vector at the point $p$ on the manifold $\mathcal M$
- p=rand(M): a point on the manifold $\mathcal M$ to specify the initial value

This function generates a ManifoldDefaultsFactory for AlternatingGradientRule. For default values that depend on the manifold, this factory postpones the construction until the manifold is available, for example from a corresponding AbstractManoptSolverState.
Manopt.AlternatingGradientRule — Type

AlternatingGradientRule <: AbstractGradientGroupDirectionRule

Create a functor (problem, state, k) -> (s, X) to evaluate the alternating gradient, that is, one alternating between the components of the gradient, with a field for partial in-place evaluation of the gradient.
Fields
- X::T: a tangent vector at the point $p$ on the manifold $\mathcal M$
Constructor
AlternatingGradientRule(M::AbstractManifold; p=rand(M), X=zero_vector(M, p))

Initialize the alternating gradient processor with tangent vector type of X, where both M and p are just help variables.
See also
alternating_gradient_descent, AlternatingGradient, which internally uses this rule.
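For completeness, a minimal sketch of constructing the rule by hand (the manifold is an assumption; in practice the AlternatingGradient factory above creates this rule once the manifold is known):

```julia
using Manopt, Manifolds

M = Sphere(2) × Euclidean(2)
p = rand(M)
# set up the direction rule with a gradient container matching p
rule = Manopt.AlternatingGradientRule(M; p=p, X=zero_vector(M, p))
```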
Technical details
The alternating_gradient_descent solver requires the following functions of a manifold to be available:

- The problem has to be phrased on a ProductManifold, to be able to alternate between parts of the input.
- A retract!(M, q, p, X); it is recommended to set the default_retraction_method to a favourite retraction. If this default is set, a retraction_method= does not have to be specified.
- By default alternating gradient descent uses ArmijoLinesearch, which requires max_stepsize(M) to be set and an implementation of inner(M, p, X).
- By default the tangent vector storing the gradient is initialized by calling zero_vector(M, p).
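As a hedged illustration of these requirements (the concrete product manifold is an assumption), the needed functions can be checked for a given manifold as follows:

```julia
using Manifolds

M = Sphere(2) × Euclidean(3)      # the problem must live on a ProductManifold
p = rand(M)
X = zero_vector(M, p)             # default initialization of the gradient container
q = copy(M, p)
retract!(M, q, p, X)              # in-place retraction used by the solver
inner(M, p, X, X)                 # inner product needed by ArmijoLinesearch
```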