Globally Convergent Inexact Generalized Newton Method for First-Order Differentiable Optimization Problems
ISSN: 1573-2878
Keywords: nonsmooth optimization; inexact Newton methods; generalized Newton methods; global convergence; superlinear rate
Source: Springer Online Journal Archives 1860-2000
Topics: Mathematics
Notes:
Abstract: Motivated by the method of Martinez and Qi (Ref. 1), we propose in this paper a globally convergent inexact generalized Newton method for unconstrained optimization problems in which the objective function has a Lipschitz continuous gradient but is not twice differentiable. The method is implementable, globally convergent, and produces monotonically decreasing function values. We prove that it has a locally superlinear, or even quadratic, convergence rate under mild conditions that do not assume convexity of the objective function.
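To illustrate the class of method the abstract describes, here is a minimal, hedged sketch (not the authors' algorithm) of a generalized Newton iteration with a regularized generalized-Hessian element and Armijo backtracking, applied to the Huber function, whose gradient is Lipschitz continuous but not differentiable everywhere; the function names, the regularization parameter `mu`, and the test problem are all illustrative assumptions.

```python
import numpy as np

def huber(x, delta=1.0):
    # Huber function: smooth (C^1) with Lipschitz gradient, not twice differentiable at |x| = delta
    return np.where(np.abs(x) <= delta, 0.5 * x**2, delta * (np.abs(x) - 0.5 * delta)).sum()

def huber_grad(x, delta=1.0):
    # gradient is Lipschitz continuous (constant 1) but only piecewise differentiable
    return np.where(np.abs(x) <= delta, x, delta * np.sign(x))

def generalized_hessian_diag(x, delta=1.0):
    # one element of the generalized (B-)Hessian of the gradient: 1 inside, 0 outside
    return np.where(np.abs(x) <= delta, 1.0, 0.0)

def generalized_newton(x0, tol=1e-8, mu=0.1, max_iter=100):
    """Illustrative generalized Newton method with Armijo line search.

    mu regularizes singular generalized-Hessian elements; the line search
    enforces monotone decrease, the mechanism behind global convergence.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = huber_grad(x)
        if np.linalg.norm(g) <= tol:
            break
        H = np.maximum(generalized_hessian_diag(x), mu)  # keep the system nonsingular
        d = -g / H                                       # diagonal Newton system, solved directly
        # Armijo backtracking: accept the first step with sufficient decrease
        t, f0, slope = 1.0, huber(x), float(g @ d)
        while huber(x + t * d) > f0 + 1e-4 * t * slope:
            t *= 0.5
        x = x + t * d
    return x

if __name__ == "__main__":
    print(generalized_newton(np.array([3.0, -0.5])))  # converges to the minimizer at 0
```

The key ingredients named in the abstract appear here in toy form: a generalized Hessian element replaces the (nonexistent) classical Hessian, and the monotone line search supplies global convergence; the paper's "inexact" aspect would allow the Newton system to be solved only approximately.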
Type of Medium: Electronic Resource
URL: