Graduate Thesis or Dissertation


High-Order Automatic Differentiation of Unmodified Linear Algebra Routines via Nilpotent Matrices

https://scholar.colorado.edu/concern/graduate_thesis_or_dissertations/sf2685314
Abstract
  • This work presents a new automatic differentiation method, Nilpotent Matrix Differentiation (NMD), capable of propagating any order of mixed or univariate derivative through common linear algebra functions – most notably third-party sparse solvers and decomposition routines, in addition to basic matrix arithmetic operations and power series – without changing data types or modifying code line by line; this allows differentiation across sequences of arbitrarily many such functions with minimal implementation effort. NMD works by enlarging the matrices and vectors passed to the routines, replacing each original scalar with a matrix block augmented by derivative data; these blocks are constructed with special sparsity structures, termed “stencils,” each designed to be isomorphic to a particular multidimensional hypercomplex algebra. The algebras are in turn designed such that Taylor expansions of hypercomplex function evaluations are finite in length and thus exactly track derivatives without approximation error.

    Although this use of the method in the “forward mode” is unique in its own right, it is also possible to apply it to existing implementations of the (first-order) discrete adjoint method to find high-order derivatives with lowered cost complexity; for example, for a problem with N inputs and an adjoint solver whose cost is independent of N – i.e., O(1) – the N × N Hessian can be found in O(N) time, which is comparable to existing second-order adjoint methods that require far more problem-specific implementation effort. Higher derivatives are likewise less expensive – e.g., an N × N × N rank-three tensor can be found in O(N²). Alternatively, a Hessian-vector product can be found in O(1) time, which may open up many matrix-based simulations to a range of existing optimization or surrogate modeling approaches. As a final corollary in parallel to the NMD-adjoint hybrid method, the existing complex-step differentiation (CD) technique is also shown to be capable of finding the Hessian-vector product. All variants are implemented on a stochastic diffusion problem and compared in depth with various cost and accuracy metrics.
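The abstract's core mechanism, replacing each scalar with a small matrix block over a nilpotent algebra so that unmodified linear-algebra routines propagate derivatives exactly, can be illustrated with the simplest conceivable stencil: a 2 × 2 block a·I + b·E with E² = 0, which is isomorphic to the dual numbers and carries a single first derivative. The sketch below is only an illustration under that assumption; the helper names (lift, unlift) and the sample parameterized system are invented here, and the dissertation's general stencils for higher-order and mixed derivatives are necessarily larger structures.

    import numpy as np

    # Simplest possible "stencil": a 2x2 nilpotent block.  E @ E == 0, so the
    # algebra {a*I + b*E} is isomorphic to the dual numbers and every Taylor
    # expansion truncates exactly after the first-derivative term.
    E = np.array([[0.0, 1.0],
                  [0.0, 0.0]])
    I2 = np.eye(2)

    def lift(values, derivs):
        """Replace each scalar entry with the 2x2 block values*I + derivs*E."""
        return np.kron(np.atleast_2d(values), I2) + np.kron(np.atleast_2d(derivs), E)

    def unlift(blocks):
        """Recover the value and derivative arrays from a lifted result."""
        return blocks[0::2, 0::2], blocks[0::2, 1::2]

    # Parameter-dependent system A(p) x = b(p), evaluated at p = 0.5.
    p = 0.5
    A  = np.array([[2.0 + p, 1.0], [1.0, 3.0]])
    dA = np.array([[1.0, 0.0], [0.0, 0.0]])   # dA/dp
    b  = np.array([[1.0], [p]])
    db = np.array([[0.0], [1.0]])             # db/dp

    # An unmodified dense solver applied to the enlarged system carries the
    # derivative along: each solution block holds x_i and dx_i/dp.
    X = np.linalg.solve(lift(A, dA), lift(b, db))
    x, dx = unlift(X)

    # Reference values: dx/dp = A^{-1} (db - dA @ x).
    x_ref  = np.linalg.solve(A, b)
    dx_ref = np.linalg.solve(A, db - dA @ x_ref)
    print(np.allclose(x, x_ref), np.allclose(dx, dx_ref))   # True True

Because the solver only ever sees an ordinary real matrix, the same enlargement can in principle be handed to third-party sparse solvers or factorization routines unchanged; higher-order or mixed derivatives would use larger nilpotent blocks in place of the 2 × 2 one assumed here.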

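The abstract's final claim, that complex-step differentiation can deliver a Hessian-vector product, corresponds to applying the standard complex-step formula to a gradient routine that tolerates complex inputs: H(x) v ≈ Im(∇f(x + i h v)) / h, with no subtractive cancellation, so h can be taken extremely small. The sketch below uses an assumed toy quartic objective purely for checking; it is not the stochastic diffusion problem studied in the dissertation.

    import numpy as np

    def grad_f(x):
        """Analytic gradient of the toy objective f(x) = 0.25*sum(x**4) + x[0]*x[1].

        The operations remain valid for complex arguments, which is the usual
        prerequisite for complex-step differentiation.
        """
        g = x**3
        g[0] += x[1]
        g[1] += x[0]
        return g

    def hess_vec(grad, x, v, h=1e-20):
        """Hessian-vector product via a complex-step directional derivative of
        the gradient: H(x) @ v ~= Im(grad(x + 1j*h*v)) / h."""
        return np.imag(grad(x + 1j * h * v)) / h

    x = np.array([1.0, 2.0, -0.5])
    v = np.array([0.3, -1.0, 0.7])

    # Exact Hessian of the toy objective, for comparison only.
    H = np.diag(3.0 * x**2)
    H[0, 1] = H[1, 0] = 1.0

    print(np.allclose(hess_vec(grad_f, x, v), H @ v))   # True

Each product costs one complex-valued gradient evaluation, which is consistent with the abstract's O(1) Hessian-vector claim whenever the gradient itself (for example, an adjoint solve) has cost independent of N.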
Date Issued
  • 2017
Last Modified
  • 2020-02-11