| Safe Haskell | None |
|---|---|
| Language | Haskell2010 |
Optimization.LineSearch.MirrorDescent
Contents
- mirrorDescent :: (Num a, Additive f) => LineSearch f a -> (f a -> f a) -> (f a -> f a) -> (f a -> f a) -> f a -> [f a]
- module Optimization.LineSearch
Documentation
mirrorDescent

Arguments

| :: (Num a, Additive f) |  |
|---|---|
| => LineSearch f a | line search method |
| -> (f a -> f a) | strongly convex function, psi |
| -> (f a -> f a) | dual of psi |
| -> (f a -> f a) | gradient of the objective function |
| -> f a | starting point |
| -> [f a] | iterates |
Mirror descent method.
Originally described by Nemirovsky and Yudin and later elucidated
by Beck and Teboulle, the mirror descent method is a generalization of
the projected subgradient method for convex optimization.
Mirror descent requires the gradient of a strongly
convex function psi (and the gradient of its dual) as well as a way to
obtain a subgradient of the objective function f at each point.
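For reference, the standard mirror descent update that these arguments correspond to
(the pairing with the argument order above is inferred from the types, not stated by the
package) is

$$x_{k+1} = \nabla\psi^{*}\bigl(\nabla\psi(x_k) - t_k\, g_k\bigr), \qquad g_k \in \partial f(x_k),$$

where t_k is the step size chosen by the supplied line search.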
Step size methods
module Optimization.LineSearch
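A minimal usage sketch (not taken from the package's documentation): with the mirror map
psi(x) = ‖x‖²/2, both the gradient of psi and the gradient of its dual are the identity,
so mirrorDescent reduces to ordinary subgradient descent on a quadratic. The sketch assumes
the V2 type from the linear package for the Additive instance and a constant step-size
search named constantSearch from Optimization.LineSearch; treat those names as assumptions
and substitute whichever LineSearch method fits your problem.

```haskell
import Linear.V2 (V2(..))
import Optimization.LineSearch (constantSearch)  -- assumed export; any LineSearch works here
import Optimization.LineSearch.MirrorDescent (mirrorDescent)

-- Objective f(x, y) = x^2 + y^2 with gradient (2x, 2y).
gradF :: V2 Double -> V2 Double
gradF (V2 x y) = V2 (2 * x) (2 * y)

-- Mirror map psi(x) = ||x||^2 / 2: its gradient and the gradient of its
-- dual are both the identity, recovering plain subgradient descent.
gradPsi, gradPsiDual :: V2 Double -> V2 Double
gradPsi     = id
gradPsiDual = id

-- Infinite list of iterates starting from (3, 4), using a fixed step size.
iterates :: [V2 Double]
iterates = mirrorDescent (constantSearch 0.1) gradPsi gradPsiDual gradF (V2 3 4)

main :: IO ()
main = mapM_ print (take 10 iterates)
```

Since the result is a lazy list of iterates, termination is left to the caller: take a fixed
number of steps as above, or stop once successive iterates are sufficiently close.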