CONVERGENCE AND CONVERGENCE RATES OF DAMPED NEWTON METHODS
Abstract
In this paper, we study the convergence and convergence rates of damped Newton methods for solving unconstrained optimization problems with twice continuously differentiable objective functions. Under the assumption that the Hessian matrix of the objective function is positive definite on an open set containing the level set determined by the value of the objective function at the starting point, we prove that the sequence generated by the damped Newton algorithm remains in that open set and that the corresponding sequence of objective function values is monotonically decreasing. If the iterative sequence has a limit point, then this limit point is a strong local minimizer of the objective function, and the entire sequence converges to it superlinearly. Furthermore, if the Hessian matrix of the objective function is Lipschitz continuous, the iterative sequence achieves a quadratic convergence rate.
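For concreteness, a damped Newton iteration in this setting typically takes the form below; the Armijo-type backtracking rule and the parameter $\sigma$ are illustrative assumptions, since the abstract does not specify the paper's exact step-size rule.
\[
d_k = -\bigl[\nabla^2 f(x_k)\bigr]^{-1}\nabla f(x_k), \qquad x_{k+1} = x_k + t_k\, d_k,
\]
where the step size $t_k \in (0,1]$ may be chosen, for example, by backtracking until
\[
f(x_k + t_k d_k) \le f(x_k) + \sigma\, t_k\, \nabla f(x_k)^{\top} d_k
\]
holds for some fixed $\sigma \in (0,1)$; taking $t_k = 1$ recovers the pure Newton step.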