Stan  2.10.0
probability, sampling & optimization
stan::optimization Namespace Reference

Classes

class  BFGSLineSearch
 
class  BFGSMinimizer
 
class  BFGSUpdate_HInv
 
class  ConvergenceOptions
 
class  LBFGSUpdate
 Implements a limited-memory version of the BFGS update.
 
class  LSOptions
 
class  ModelAdaptor
 

Typedefs

typedef Eigen::Matrix< double, Eigen::Dynamic, Eigen::Dynamic > matrix_d
 
typedef Eigen::Matrix< double, Eigen::Dynamic, 1 > vector_d
 

Enumerations

enum  TerminationCondition {
  TERM_SUCCESS = 0, TERM_ABSX = 10, TERM_ABSF = 20, TERM_RELF = 21,
  TERM_ABSGRAD = 30, TERM_RELGRAD = 31, TERM_MAXIT = 40, TERM_LSFAIL = -1
}
 

Functions

template<typename Scalar >
Scalar CubicInterp (const Scalar &df0, const Scalar &x1, const Scalar &f1, const Scalar &df1, const Scalar &loX, const Scalar &hiX)
 Find the minimum in an interval [loX, hiX] of a cubic function which interpolates the points, function values, and gradients provided.
 
template<typename Scalar >
Scalar CubicInterp (const Scalar &x0, const Scalar &f0, const Scalar &df0, const Scalar &x1, const Scalar &f1, const Scalar &df1, const Scalar &loX, const Scalar &hiX)
 Find the minimum in an interval [loX, hiX] of a cubic function which interpolates the points, function values, and gradients provided.
 
template<typename FunctorType , typename Scalar , typename XType >
int WolfeLineSearch (FunctorType &func, Scalar &alpha, XType &x1, Scalar &f1, XType &gradx1, const XType &p, const XType &x0, const Scalar &f0, const XType &gradx0, const Scalar &c1, const Scalar &c2, const Scalar &minAlpha)
 Perform a line search which finds an approximate solution to:

\[ \min_\alpha f(x_0 + \alpha p) \]

satisfying the strong Wolfe conditions:

1) $ f(x_0 + \alpha p) \leq f(x_0) + c_1 \alpha p^T g(x_0) $
2) $ \vert p^T g(x_0 + \alpha p) \vert \leq c_2 \vert p^T g(x_0) \vert $

where $ g(x) = \frac{\partial f}{\partial x} $ is the gradient of f(x).

 
template<typename M >
double newton_step (M &model, std::vector< double > &params_r, std::vector< int > &params_i, std::ostream *output_stream=0)
 

Typedef Documentation

typedef Eigen::Matrix<double, Eigen::Dynamic, Eigen::Dynamic> stan::optimization::matrix_d

Definition at line 13 of file newton.hpp.

typedef Eigen::Matrix<double, Eigen::Dynamic, 1> stan::optimization::vector_d

Definition at line 14 of file newton.hpp.

Enumeration Type Documentation

enum stan::optimization::TerminationCondition

Enumerator

TERM_SUCCESS = 0
TERM_ABSX = 10
TERM_ABSF = 20
TERM_RELF = 21
TERM_ABSGRAD = 30
TERM_RELGRAD = 31
TERM_MAXIT = 40
TERM_LSFAIL = -1

Definition at line 20 of file bfgs.hpp.
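
The enumerators encode why the minimizer stopped. As a rough illustration, here is a minimal sketch of translating a code into a human-readable message; the include path and the message wording are assumptions paraphrased from the enumerator names, not text from the library.

#include <string>
#include <stan/optimization/bfgs.hpp>  // assumed include path for bfgs.hpp

// Hypothetical helper: paraphrase a TerminationCondition as a short message.
std::string describe(stan::optimization::TerminationCondition code) {
  using namespace stan::optimization;
  switch (code) {
    case TERM_SUCCESS: return "converged";
    case TERM_ABSX:    return "change in x below absolute tolerance";
    case TERM_ABSF:    return "change in f below absolute tolerance";
    case TERM_RELF:    return "change in f below relative tolerance";
    case TERM_ABSGRAD: return "gradient below absolute tolerance";
    case TERM_RELGRAD: return "gradient below relative tolerance";
    case TERM_MAXIT:   return "maximum number of iterations reached";
    case TERM_LSFAIL:  return "line search failed";
    default:           return "unknown termination code";
  }
}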

Function Documentation

template<typename Scalar >
Scalar stan::optimization::CubicInterp (const Scalar &df0, const Scalar &x1, const Scalar &f1, const Scalar &df1, const Scalar &loX, const Scalar &hiX)

Find the minimum in an interval [loX, hiX] of a cubic function which interpolates the points, function values, and gradients provided.

Implicitly, this function constructs an interpolating polynomial g(x) = a_3 x^3 + a_2 x^2 + a_1 x + a_0 such that g(0) = 0, g(x1) = f1, g'(0) = df0, g'(x1) = df1 where g'(x) = 3 a_3 x^2 + 2 a_2 x + a_1 is the derivative of g(x). It then computes the roots of g'(x) and finds the minimal value of g(x) on the interval [loX,hiX] including the end points.

This is the reduced-parameter overload, which implicitly takes x0 = 0 and f0 = 0; the eight-parameter overload implements the full parameter version of CubicInterp().

Parameters

df0 - First derivative value, f'(x0)
x1 - Second point
f1 - Second function value, f(x1)
df1 - Second derivative value, f'(x1)
loX - Lower bound on the interval of solutions
hiX - Upper bound on the interval of solutions

Definition at line 35 of file bfgs_linesearch.hpp.
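
As a concrete illustration, here is a minimal usage sketch; the include path is an assumption, as is the convention that the return value is the location of the minimum within [loX, hiX]. The data come from f(x) = (x - 1)^2 on [0, 2], shifted so that the first point sits at x = 0 with value 0.

#include <iostream>
#include <stan/optimization/bfgs_linesearch.hpp>  // assumed include path

int main() {
  // Shifted data for f(x) = (x - 1)^2 on [0, 2]:
  //   g(0) = 0, g'(0) = f'(0) = -2, g(2) = f(2) - f(0) = 0, g'(2) = f'(2) = 2.
  // The interpolating cubic is g(x) = x^2 - 2x, whose minimum on [0, 2] is at x = 1.
  double xmin = stan::optimization::CubicInterp(-2.0,   // df0
                                                2.0,    // x1
                                                0.0,    // f1
                                                2.0,    // df1
                                                0.0,    // loX
                                                2.0);   // hiX
  std::cout << xmin << std::endl;  // expected: 1 (assuming the minimizer is returned)
  return 0;
}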

template<typename Scalar >
Scalar stan::optimization::CubicInterp (const Scalar &x0, const Scalar &f0, const Scalar &df0, const Scalar &x1, const Scalar &f1, const Scalar &df1, const Scalar &loX, const Scalar &hiX)

Find the minimum in an interval [loX, hiX] of a cubic function which interpolates the points, function values, and gradients provided.

Implicitly, this function constructs an interpolating polynomial g(x) = a_3 x^3 + a_2 x^2 + a_1 x + a_0 such that g(x0) = f0, g(x1) = f1, g'(x0) = df0, g'(x1) = df1 where g'(x) = 3 a_3 x^2 + 2 a_2 x + a_1 is the derivative of g(x). It then computes the roots of g'(x) and finds the minimal value of g(x) on the interval [loX,hiX] including the end points.

Parameters

x0 - First point
f0 - First function value, f(x0)
df0 - First derivative value, f'(x0)
x1 - Second point
f1 - Second function value, f(x1)
df1 - Second derivative value, f'(x1)
loX - Lower bound on the interval of solutions
hiX - Upper bound on the interval of solutions

Definition at line 103 of file bfgs_linesearch.hpp.
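
The procedure described above can be written out directly. The following standalone sketch is not Stan's implementation; it simply fits the cubic from the two points, values, and gradients, then compares the interval end points with the real roots of g'(x):

#include <cmath>

// Sketch of the interpolation-and-minimization step described above (not the
// library's code): fit g(x) = a3 x^3 + a2 x^2 + a1 x + a0 to (x0, f0, df0) and
// (x1, f1, df1), then return the location of the minimum of g on [loX, hiX].
double cubic_interp_sketch(double x0, double f0, double df0,
                           double x1, double f1, double df1,
                           double loX, double hiX) {
  // Work in the shifted variable t = x - x0, so g(0) = f0 and g'(0) = df0.
  const double h = x1 - x0;
  const double a0 = f0;
  const double a1 = df0;
  const double a2 = (3.0 * (f1 - f0) / h - 2.0 * df0 - df1) / h;
  const double a3 = (df0 + df1 - 2.0 * (f1 - f0) / h) / (h * h);

  auto g = [&](double t) { return ((a3 * t + a2) * t + a1) * t + a0; };

  // Start from the lower end point, then consider the upper end point and any
  // interior critical points that fall inside [loX, hiX].
  double bestT = loX - x0;
  double bestG = g(bestT);
  auto consider = [&](double t) {
    if (t >= loX - x0 && t <= hiX - x0 && g(t) < bestG) {
      bestT = t;
      bestG = g(t);
    }
  };
  consider(hiX - x0);

  // Critical points: real roots of g'(t) = 3 a3 t^2 + 2 a2 t + a1.
  const double A = 3.0 * a3, B = 2.0 * a2, C = a1;
  if (std::fabs(A) < 1e-14) {
    if (std::fabs(B) > 1e-14)
      consider(-C / B);  // g' is (effectively) linear
  } else {
    const double disc = B * B - 4.0 * A * C;
    if (disc >= 0.0) {
      const double s = std::sqrt(disc);
      consider((-B + s) / (2.0 * A));
      consider((-B - s) / (2.0 * A));
    }
  }
  return x0 + bestT;  // location of the minimum, in the original x coordinates
}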

template<typename M >
double stan::optimization::newton_step (M &model, std::vector< double > &params_r, std::vector< int > &params_i, std::ostream *output_stream = 0)

Definition at line 33 of file newton.hpp.

template<typename FunctorType , typename Scalar , typename XType >
int stan::optimization::WolfeLineSearch (FunctorType &func, Scalar &alpha, XType &x1, Scalar &f1, XType &gradx1, const XType &p, const XType &x0, const Scalar &f0, const XType &gradx0, const Scalar &c1, const Scalar &c2, const Scalar &minAlpha)

Perform a line search which finds an approximate solution to:

\[ \min_\alpha f(x_0 + \alpha p) \]

satisfying the strong Wolfe conditions:

1) $ f(x_0 + \alpha p) \leq f(x_0) + c_1 \alpha p^T g(x_0) $
2) $ \vert p^T g(x_0 + \alpha p) \vert \leq c_2 \vert p^T g(x_0) \vert $

where $ g(x) = \frac{\partial f}{\partial x} $ is the gradient of f(x).

Template Parameters

FunctorType - A type which supports being called as ret = func(x,f,g), where x is the input point, f and g are the function value and gradient at x, and ret is non-zero if the function evaluation fails.

Parameters

func - Function which is being minimized.
alpha - First value of $ \alpha $ to try. Upon return this contains the final value of $ \alpha $.
x1 - Final point, equal to $ x_0 + \alpha p $.
f1 - Final point function value, equal to $ f(x_0 + \alpha p) $.
gradx1 - Final point gradient, equal to $ g(x_0 + \alpha p) $.
p - Search direction. It is assumed to be a descent direction such that $ p^T g(x_0) < 0 $.
x0 - Value of the starting point, $ x_0 $.
f0 - Value of the function at the starting point, $ f(x_0) $.
gradx0 - Value of the function gradient at the starting point, $ g(x_0) $.
c1 - Parameter of the Wolfe conditions, $ 0 < c_1 < c_2 < 1 $. Typically c1 = 1e-4.
c2 - Parameter of the Wolfe conditions, $ 0 < c_1 < c_2 < 1 $. Typically c2 = 0.9.
minAlpha - Smallest allowable step size.

Returns
Zero on success, non-zero otherwise.

Definition at line 222 of file bfgs_linesearch.hpp.
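
A minimal usage sketch follows, minimizing f(x) = 0.5 x^T x from x0 = (1, 1) along the steepest-descent direction. The include path and the choice of Eigen::VectorXd for XType are assumptions; the functor follows the ret = func(x, f, g) contract described above.

#include <iostream>
#include <Eigen/Dense>
#include <stan/optimization/bfgs_linesearch.hpp>  // assumed include path

// Functor matching the documented contract: ret = func(x, f, g), where a
// non-zero return value signals a failed evaluation.
struct QuadraticFunctor {
  int operator()(const Eigen::VectorXd& x, double& f, Eigen::VectorXd& g) const {
    f = 0.5 * x.squaredNorm();  // f(x) = 0.5 x^T x
    g = x;                      // gradient of f
    return 0;                   // success
  }
};

int main() {
  QuadraticFunctor func;
  Eigen::VectorXd x0(2), gradx0(2), p(2), x1(2), gradx1(2);
  x0 << 1.0, 1.0;
  double f0;
  func(x0, f0, gradx0);   // f0 = 1, gradx0 = (1, 1)
  p = -gradx0;            // descent direction: p^T g(x0) = -2 < 0

  double alpha = 1.0;     // first trial step
  double f1;
  int ret = stan::optimization::WolfeLineSearch(func, alpha, x1, f1, gradx1,
                                                p, x0, f0, gradx0,
                                                1e-4,    // c1
                                                0.9,     // c2
                                                1e-16);  // minAlpha
  std::cout << "ret = " << ret << ", alpha = " << alpha
            << ", f1 = " << f1 << std::endl;
  return 0;
}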

