On the Q-Superlinear Convergence of Self-Scaling Quasi-Newton Methods; CU-CS-144-78

https://scholar.colorado.edu/concern/reports/ns064677m
Abstract
  • Self-scaling updates have been proposed by Luenberger, Oren, and Spedicato for use in quasi-Newton minimization algorithms. Their departure from other updates is that an intermediate scaling H̄_i = γ_i H_i of the inverse Hessian approximation is performed before each regular update. In recent computational tests by Brodlie and by Shanno and Phua, they performed less well than the BFGS update except on problems with a singular Hessian at the solution. In this paper we examine the self-scaling updates in an attempt to explain this behavior. We find that for the self-scaling BFGS update to retain the Q-superlinear convergence of the normal BFGS on problems with a non-singular Hessian at the solution, it is necessary that γ_i converge to 1; a somewhat stronger condition is sufficient. This indicates that, asymptotically, use of the scaling parameter is unlikely to be advantageous on non-singular problems. On the other hand, on problems with a singular Hessian at the solution, where only linear convergence is expected in general, γ_i does not necessarily converge to 1, so that the self-scaling update may differ from the BFGS even asymptotically.
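
The following is a minimal sketch (not taken from the report) of a single self-scaled BFGS step on the inverse Hessian approximation, as described in the abstract: the matrix is scaled by γ_i and then the regular BFGS update is applied. The particular choice γ = sᵀy / (yᵀHy) is one common Oren–Luenberger scaling and is assumed here for illustration only; the function name and the curvature safeguard are likewise illustrative, not part of the report.

```python
import numpy as np

def self_scaled_bfgs_update(H, s, y, scale=True):
    """One (self-scaled) BFGS update of the inverse Hessian approximation H.

    s = x_{k+1} - x_k,  y = grad f(x_{k+1}) - grad f(x_k).
    With scale=True the intermediate scaling H_bar = gamma * H is applied
    before the regular update; gamma = s'y / (y'Hy) is one common
    Oren-Luenberger choice (an assumption here; the abstract does not fix it).
    """
    sy = float(s @ y)
    if sy <= 0.0:                      # curvature condition fails: skip the update
        return H
    if scale:
        gamma = sy / float(y @ H @ y)  # intermediate update H_bar_i = gamma_i * H_i
        H = gamma * H
    rho = 1.0 / sy
    I = np.eye(len(s))
    V = I - rho * np.outer(s, y)
    # regular BFGS update applied to the (possibly scaled) H
    return V @ H @ V.T + rho * np.outer(s, s)
```

In these terms, the paper's result is that on problems with a non-singular Hessian at the solution the self-scaled iteration retains Q-superlinear convergence only if γ_i converges to 1, so the scaling step eventually has no effect there.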
Date Issued
  • 1978-08-01
Last Modified
  • 2019-12-21
