Self-concordant function
In optimization, a self-concordant function is a function $f : \mathbb{R} \to \mathbb{R}$ for which

- $|f'''(x)| \le 2 f''(x)^{3/2}$

or, equivalently, a function that, wherever $f''(x) > 0$, satisfies

- $\left| \frac{d}{dx} \frac{1}{\sqrt{f''(x)}} \right| \le 1$

and which satisfies $f'''(x) = 0$ elsewhere.
More generally, a multivariate function $f : \mathbb{R}^n \to \mathbb{R}$ is self-concordant if

- $\left| \nabla^3 f(x)[h, h, h] \right| \le 2 \left( h^\top \nabla^2 f(x)\, h \right)^{3/2}$ for all $x$ in the domain and all $h \in \mathbb{R}^n$,

or, equivalently, if its restriction to any arbitrary line is self-concordant.[1]
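For example, the logarithmic barrier $f(x) = -\log x$ satisfies the univariate definition with equality: on its domain $x > 0$ we have $f''(x) = 1/x^2$ and $f'''(x) = -2/x^3$, so that

- $|f'''(x)| = \frac{2}{x^3} = 2 \left( \frac{1}{x^2} \right)^{3/2} = 2 f''(x)^{3/2}$.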
History
As mentioned in the "Bibliography Comments"[2] of their 1994 book,[3] self-concordant functions were introduced in 1988 by Yurii Nesterov[4][5] and further developed with Arkadi Nemirovski.[6] As explained in,[7] their basic observation was that the Newton method is affine invariant, in the sense that if for a function $f(x)$ we have Newton steps $x_{k+1} = x_k - [f''(x_k)]^{-1} f'(x_k)$, then for a function $\phi(y) = f(Ay)$, where $A$ is a non-degenerate linear transformation, starting from $y_0 = A^{-1} x_0$ we have the Newton steps $y_k = A^{-1} x_k$, which can be shown recursively:

- $y_{k+1} = y_k - [\phi''(y_k)]^{-1} \phi'(y_k) = y_k - [A^\top f''(A y_k)\, A]^{-1} A^\top f'(A y_k) = A^{-1} x_k - A^{-1} [f''(x_k)]^{-1} f'(x_k) = A^{-1} x_{k+1}$.
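This invariance is easy to check numerically. The following sketch runs the two Newton iterations side by side and confirms $y_k = A^{-1} x_k$ at every step; the quartic test function and the matrix $A$ are illustrative choices, not taken from the sources.

```python
import numpy as np

# Minimal numerical check of the affine invariance of Newton's method.
# Test function f(x) = sum(x_i^4): smooth, convex, with a diagonal
# Hessian that is non-degenerate away from the axes (illustrative choice).
def grad_f(x):
    return 4 * x**3

def hess_f(x):
    return np.diag(12 * x**2)

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 3 * np.eye(3)   # non-degenerate matrix

# phi(y) = f(Ay), so phi'(y) = A^T f'(Ay) and phi''(y) = A^T f''(Ay) A.
def grad_phi(y):
    return A.T @ grad_f(A @ y)

def hess_phi(y):
    return A.T @ hess_f(A @ y) @ A

x = np.array([1.0, 2.0, 0.5])
y = np.linalg.solve(A, x)                              # y_0 = A^{-1} x_0
for k in range(5):
    x = x - np.linalg.solve(hess_f(x), grad_f(x))      # Newton step for f
    y = y - np.linalg.solve(hess_phi(y), grad_phi(y))  # Newton step for phi
    assert np.allclose(A @ y, x), "iterates must satisfy y_k = A^{-1} x_k"
print("verified: y_k = A^{-1} x_k for all steps")
```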
However, the standard analysis of the Newton method supposes that the Hessian of $f$ is Lipschitz continuous, that is, $\| f''(x) - f''(y) \| \le M \| x - y \|$ for some constant $M$. If we suppose that $f$ is 3 times continuously differentiable, then this is equivalent to

- $\left| \langle f'''(x)[u]\, v, v \rangle \right| \le M \| u \| \| v \|^2$ for all $u, v \in \mathbb{R}^n$

where $f'''(x)[u] = \lim_{\alpha \to 0} \alpha^{-1} \left[ f''(x + \alpha u) - f''(x) \right]$. Then the left hand side of the above inequality is invariant under the affine transformation $f(x) \to \phi(y) = f(Ay)$, $u \to A^{-1} u$, $v \to A^{-1} v$, however the right hand side is not.
The authors note that the right hand side can also be made invariant if we replace the Euclidean metric by the scalar product defined by the Hessian of $f$, that is, by the local norm $\| w \|_{f''(x)} = \langle f''(x)\, w, w \rangle^{1/2}$ for $w \in \mathbb{R}^n$. They then arrive at the definition of a self-concordant function as

- $\left| \langle f'''(x)[u]\, u, u \rangle \right| \le M \langle f''(x)\, u, u \rangle^{3/2}$.
Properties
Linear combination
If $f_1$ and $f_2$ are self-concordant with constants $M_1$ and $M_2$, and $\alpha, \beta > 0$, then $\alpha f_1 + \beta f_2$ is self-concordant with constant $\max(\alpha^{-1/2} M_1, \beta^{-1/2} M_2)$.
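For instance, each $f_i(x) = -\log x_i$ is self-concordant with constant $M_i = 2$, so the logarithmic barrier of the positive orthant, $f(x) = -\sum_{i=1}^n \log x_i$, is self-concordant with constant $\max_i 1^{-1/2} M_i = 2$, independently of the dimension $n$.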
Affine transformation
If $f$ is self-concordant with constant $M$ and $Ax + b$ is an affine transformation of $\mathbb{R}^n$, then $\phi(x) = f(Ax + b)$ is also self-concordant with parameter $M$.
Non-singular Hessian
If $f$ is self-concordant and the domain of $f$ contains no straight line (infinite in both directions), then the Hessian $f''$ is non-singular.
Conversely, if for some $x$ in the domain of $f$ and some $u \in \mathbb{R}^n$, $u \ne 0$, we have $\langle f''(x)\, u, u \rangle = 0$, then $\langle f''(x + \alpha u)\, u, u \rangle = 0$ for all $\alpha$ for which $x + \alpha u$ is in the domain of $f$; then $f(x + \alpha u)$ is linear in $\alpha$ and cannot have a maximum, so all of the line $x + \alpha u$, $\alpha \in \mathbb{R}$, is in the domain of $f$. We note also that such an $f$ cannot have a minimum inside its domain.
Applications
Among other things, self-concordant functions are useful in the analysis of Newton's method. Self-concordant barrier functions are used to develop the barrier functions used in interior point methods for convex and nonlinear optimization. The usual analysis of the Newton method does not work for barrier functions, as their second derivative cannot be Lipschitz continuous; otherwise they would be bounded on any compact subset of $\mathbb{R}^n$.
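Concretely, for the barrier $f(x) = -\log x$ the third derivative $f'''(x) = -2/x^3$ is unbounded as $x \to 0^+$, so no fixed Lipschitz constant for $f''$ can hold on the whole domain, while the relative bound

- $|f'''(x)| = \frac{2}{x^3} = 2 f''(x)^{3/2}$

holds everywhere on it.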
Self-concordant barrier functions
- are a class of functions that can be used as barriers in constrained optimization methods
- can be minimized using the Newton algorithm with provable convergence properties analogous to the usual case (but these results are somewhat more difficult to derive)
- to obtain both of the above, the usual constant bound on the third derivative of the function (required to get the usual convergence results for the Newton method) is replaced by a bound relative to the Hessian
Minimizing a self-concordant function
A self-concordant function may be minimized with a modified Newton method where we have a bound on the number of steps required for convergence. We suppose here that $f$ is a standard self-concordant function, that is, it is self-concordant with parameter $M = 2$.
We define the Newton decrement $\lambda_f(x)$ of $f$ at $x$ as the size of the Newton step $[f''(x)]^{-1} f'(x)$ in the local norm defined by the Hessian of $f$ at $x$:

- $\lambda_f(x) = \langle f'(x), [f''(x)]^{-1} f'(x) \rangle^{1/2}$
Then for $x$ in the domain of $f$, if $\lambda_f(x) < 1$ then it is possible to prove that the Newton iterate

- $x_+ = x - [f''(x)]^{-1} f'(x)$

will also be in the domain of $f$. This is because, based on the self-concordance of $f$, it is possible to give some finite bounds on the value of $f(x_+)$. We further have

- $\lambda_f(x_+) \le \left( \frac{\lambda_f(x)}{1 - \lambda_f(x)} \right)^2$.
Then if we have

- $\lambda_f(x) < \bar\lambda = \frac{3 - \sqrt{5}}{2}$

then it is also guaranteed that $\lambda_f(x_+) < \lambda_f(x)$, so that we can continue to use the Newton method until convergence. Note that if $\lambda_f(x_k) < \beta$ for some $\beta \in (0, \bar\lambda)$, then we have quadratic convergence of $\lambda_f$ to 0, since $\lambda_f(x_{k+1}) \le (1 - \beta)^{-2} \lambda_f(x_k)^2$. This then gives quadratic convergence of $f(x_k)$ to $f(x^*)$ and of $x_k$ to $x^*$, where $x^* = \arg\min f(x)$, by the following theorem. If $\lambda_f(x) < 1$, then

- $\omega(\lambda_f(x)) \le f(x) - f(x^*) \le \omega_*(\lambda_f(x))$,
- $\omega'(\lambda_f(x)) \le \| x - x^* \|_x \le \omega_*'(\lambda_f(x))$,

with the following definitions:

- $\omega(t) = t - \log(1 + t)$
- $\omega_*(t) = -t - \log(1 - t)$
- $\| u \|_x = \langle f''(x)\, u, u \rangle^{1/2}$
If we start the Newton method from some $x_0$ with $\lambda_f(x_0) \ge \bar\lambda$, then we have to start by using a damped Newton method defined by

- $x_{k+1} = x_k - \frac{1}{1 + \lambda_f(x_k)} [f''(x_k)]^{-1} f'(x_k)$

For this it can be shown that $f(x_{k+1}) \le f(x_k) - \omega(\lambda_f(x_k))$, with $\omega$ as defined previously. Note that $\omega(t)$ is an increasing function for $t > 0$, so that $\omega(t) \ge \omega(\bar\lambda) > 0$ for any $t \ge \bar\lambda$; thus the value of $f$ is guaranteed to decrease by a certain amount in each iteration, which also proves that $x_{k+1}$ is in the domain of $f$.
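As an illustration, the following sketch applies this two-phase scheme to the function $f(x) = \langle c, x \rangle - \sum_{i=1}^n \log x_i$, which is standard self-concordant; the vector $c$ and the starting point are illustrative choices. It uses the damped step while $\lambda_f(x_k) \ge \bar\lambda$ and the pure Newton step afterwards:

```python
import numpy as np

LAMBDA_BAR = (3 - np.sqrt(5)) / 2   # threshold between the two phases

# Standard self-concordant test function f(x) = <c, x> - sum_i log(x_i)
# on the positive orthant; its minimizer is x_i* = 1/c_i (illustrative).
c = np.array([1.0, 2.0, 5.0])

def grad_f(x):
    return c - 1.0 / x

def hess_f(x):
    return np.diag(1.0 / x**2)

def decrement_and_step(x):
    step = np.linalg.solve(hess_f(x), grad_f(x))   # [f''(x)]^{-1} f'(x)
    lam = np.sqrt(grad_f(x) @ step)                # Newton decrement
    return lam, step

x = np.array([2.0, 2.0, 2.0])
for k in range(100):
    lam, step = decrement_and_step(x)
    if lam < 1e-10:                                # converged
        break
    if lam >= LAMBDA_BAR:
        x = x - step / (1.0 + lam)                 # damped Newton phase
    else:
        x = x - step                               # pure Newton phase
print(x)   # approaches x* = 1/c = [1.0, 0.5, 0.2]
```

Printing $\lambda_f(x_k)$ inside the loop shows the two phases described above: a steady decrease of $f$ during the damped phase, then rapid (quadratic) convergence of the decrement to zero once $\lambda_f(x_k) < \bar\lambda$.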
Examples
Self-concordant functions
- Linear and convex quadratic functions are self-concordant with $M = 0$, since their third derivative is zero.
- Any function $f(x) = -\log(-g(x)) - \log x$, where $g(x)$ is defined and convex for all $x > 0$ and verifies $|g'''(x)| \le 3 g''(x)/x$, is self-concordant on its domain, which is $\{ x \mid x > 0, g(x) < 0 \}$. Some examples are (a check for one member of the first family follows this list):
  - $g(x) = -x^p$ for $0 < p \le 1$
  - $g(x) = x^p$ for $-1 \le p \le 0$
  - for any function $g$ satisfying the conditions, the function $g(x) + a x^2 + b x + c$ with $a \ge 0$ also satisfies the conditions
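As a quick check, take $g(x) = -x^{1/2}$ (the case $p = \tfrac{1}{2}$): then $g''(x) = \tfrac{1}{4} x^{-3/2}$ and $g'''(x) = -\tfrac{3}{8} x^{-5/2}$, so

- $|g'''(x)| = \tfrac{3}{8} x^{-5/2} \le \tfrac{3}{4} x^{-5/2} = 3 g''(x)/x$,

and the condition is satisfied.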
Functions that are not self-concordant
- $f(x) = e^x$
- $f(x) = \frac{1}{x^p}$ for $x > 0$, $p > 0$
- $f(x) = |x|^p$ for $p > 2$
Self-concordant barriers
- positive half-line $\mathbb{R}_+$: $f(x) = -\log x$ with domain $\{ x \mid x > 0 \}$ is a self-concordant barrier with $M = 1$ (this value is verified just after the list).
- positive orthant $\mathbb{R}_+^n$: $f(x) = -\sum_{i=1}^n \log x_i$ with $M = n$
- the logarithmic barrier $f(x) = -\log \phi(x)$ for the quadratic region defined by $\phi(x) > 0$, where $\phi(x) = \alpha + \langle a, x \rangle - \frac{1}{2} \langle A x, x \rangle$ and $A = A^\top \ge 0$ is a positive semi-definite symmetric matrix, is self-concordant for $M = 2$
- second order cone $L^n = \{ (x, y) \in \mathbb{R}^{n-1} \times \mathbb{R} \mid \| x \| \le y \}$: $f(x, y) = -\log(y^2 - x^\top x)$
- semi-definite cone $\mathbb{S}_+^n$: $f(A) = -\log \det A$
- exponential cone $\{ (x, y, z) \mid y e^{x/y} \le z,\ y > 0 \}$: $f(x, y, z) = -\log(y \log(z/y) - x) - \log z - \log y$
- power cone $\{ (x_1, x_2, y) \mid |y| \le x_1^\alpha x_2^{1 - \alpha} \}$: $f(x_1, x_2, y) = -\log(x_1^{2\alpha} x_2^{2 - 2\alpha} - y^2) - \log x_1 - \log x_2$
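For the half-line the stated value of $M$ can be checked directly. In the definition of Nesterov and Nemirovskii, a self-concordant barrier with parameter $M$ satisfies $\langle f'(x), [f''(x)]^{-1} f'(x) \rangle \le M$ on its domain, and for $f(x) = -\log x$ we get $f'(x)^2 / f''(x) = (1/x^2)/(1/x^2) = 1$ for every $x > 0$, so $M = 1$; summing $n$ such one-dimensional barriers gives the value $M = n$ for the positive orthant.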
References
- Boyd, Stephen P.; Vandenberghe, Lieven (2004). Convex Optimization (pdf). Cambridge University Press. ISBN 978-0-521-83378-3. Retrieved October 15, 2011.
- Nesterov, Yurii; Nemirovskii, Arkadii (January 1994). Interior-Point Polynomial Algorithms in Convex Programming (Bibliography Comments). Society for Industrial and Applied Mathematics. doi:10.1137/1.9781611970791.bm. ISBN 978-0-89871-319-0.
- Nesterov, I︠U︡. E.; Nemirovskiĭ, Arkadiĭ Semenovich (1994). Interior-Point Polynomial Algorithms in Convex Programming. Philadelphia: Society for Industrial and Applied Mathematics. ISBN 0-89871-319-6. OCLC 29310677.
- Nesterov, Yu. E. (1988). "Polynomial time methods in linear and quadratic programming". Izvestija AN SSSR, Tekhnitcheskaya kibernetika. No. 3. pp. 324–326. (In Russian.)
- Nesterov, Yu. E. (1988). "Polynomial time iterative methods in linear and quadratic programming". Voprosy kibernetiki. Moscow. pp. 102–125. (In Russian.)
- Y.E. Nesterov and A.S. Nemirovski, Self–concordant functions and polynomial–time methods in convex programming, Technical report, Central Economic and Mathematical Institute, USSR Academy of Science, Moscow, USSR, 1989.
- Nesterov, I︠U︡. E. Introductory Lectures on Convex Optimization: A Basic Course. Boston. ISBN 978-1-4419-8853-9. OCLC 883391994.