
Matrix Calculus for Machine Learning and Beyond

Course Description

We all know that calculus courses such as 18.01 Single Variable Calculus and 18.02 Multivariable Calculus cover univariate and vector calculus, respectively. Modern applications such as machine learning and large-scale optimization require the next big step, “matrix calculus” and calculus on arbitrary vector spaces.

This class covers a coherent approach to matrix calculus showing techniques that allow you to think of a matrix holistically (not just as an array of scalars), generalize and compute derivatives of important matrix factorizations and many other complicated-looking operations, and understand how differentiation formulas must be reimagined in large-scale computing. We will discuss reverse/adjoint/backpropagation differentiation, custom vector-Jacobian products, and how modern automatic differentiation is more computer science than calculus (it is neither symbolic formulas nor finite differences).
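As a concrete illustration of the “matrix holistically” point and of vector-Jacobian products, here is a minimal sketch (my own, not part of the course materials) using JAX: reverse-mode differentiation of the scalar-valued matrix function f(A) = log|det A| recovers the classical identity ∇f(A) = A⁻ᵀ, and jax.vjp exposes the same computation as a pullback.

```python
import jax
import jax.numpy as jnp

def f(A):
    # f : R^{n x n} -> R,  f(A) = log|det A|
    sign, logabsdet = jnp.linalg.slogdet(A)
    return logabsdet

n = 4
key = jax.random.PRNGKey(0)
# A well-conditioned (but non-symmetric) test matrix.
A = jax.random.normal(key, (n, n)) + n * jnp.eye(n)

# Reverse mode: jax.grad propagates the scalar output backwards ("pullback").
grad_f = jax.grad(f)(A)

# Matrix-calculus identity: d(log|det A|) = tr(A^{-1} dA), so grad f(A) = A^{-T}.
print(jnp.allclose(grad_f, jnp.linalg.inv(A).T, atol=1e-4))  # True

# The same machinery exposed directly as a vector-Jacobian product (vJp):
y, f_vjp = jax.vjp(f, A)
(cotangent,) = f_vjp(1.0)  # pull the scalar cotangent 1.0 back to A-space
print(jnp.allclose(cotangent, grad_f))  # True
```

Note that nothing here ever forms the n²-by-n² Jacobian explicitly; the pullback operates directly on matrices.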


Topics:

Here are some of the topics covered:

  1. Derivatives as linear operators and linear approximation on arbitrary vector spaces: beyond gradients and Jacobians.
  2. Derivatives of functions with matrix inputs and/or outputs (e.g. matrix inverses and determinants). Kronecker products and matrix “vectorization.”
  3. Derivatives of matrix factorizations (e.g. eigenvalues/SVD) and derivatives with constraints (e.g. orthogonal matrices).
  4. Multidimensional chain rules, and the significance of right-to-left (“forward”) vs. left-to-right (“reverse”) composition. Chain rules on computational graphs (e.g. neural networks).
  5. Forward- and reverse-mode manual and automatic multivariate differentiation.
  6. Adjoint methods (vJp/pullback rules) for derivatives of solutions of linear, nonlinear, and differential equations (a sketch of the linear-system case follows this list).
  7. Applications to nonlinear root-finding and optimization. Multidimensional Newton and steepest-descent methods.
  8. Applications in engineering/scientific optimization and machine learning.
  9. Second derivatives, Hessian matrices, quadratic approximations, and quasi-Newton methods.
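
For topic 6, here is a small sketch of an adjoint/vJp rule for the solution of a linear system (my own example, with illustrative names such as solve_fwd and solve_bwd, not taken from the course). The solution map x = A⁻¹b gets a hand-written pullback via jax.custom_vjp: with the adjoint variable λ = A⁻ᵀ x̄, the rule is ∂L/∂b = λ and ∂L/∂A = −λ xᵀ, and the result is checked against JAX's built-in rule for jnp.linalg.solve.

```python
import jax
import jax.numpy as jnp

@jax.custom_vjp
def solve(A, b):
    # Primal: x such that A x = b.
    return jnp.linalg.solve(A, b)

def solve_fwd(A, b):
    x = jnp.linalg.solve(A, b)
    return x, (A, x)          # save residuals needed by the backward pass

def solve_bwd(res, x_bar):
    # Adjoint rule: lambda = A^{-T} x_bar, then dL/db = lambda, dL/dA = -lambda x^T.
    A, x = res
    lam = jnp.linalg.solve(A.T, x_bar)
    return (-jnp.outer(lam, x), lam)

solve.defvjp(solve_fwd, solve_bwd)

def loss(A, b):
    return jnp.sum(solve(A, b) ** 2)

n = 3
key = jax.random.PRNGKey(1)
A = jax.random.normal(key, (n, n)) + n * jnp.eye(n)
b = jnp.arange(1.0, n + 1.0)

gA, gb = jax.grad(loss, argnums=(0, 1))(A, b)

# Compare with differentiating jnp.linalg.solve directly (JAX's own rule).
def loss_ref(A, b):
    return jnp.sum(jnp.linalg.solve(A, b) ** 2)

gA_ref, gb_ref = jax.grad(loss_ref, argnums=(0, 1))(A, b)
print(jnp.allclose(gA, gA_ref, atol=1e-4), jnp.allclose(gb, gb_ref, atol=1e-4))
```

The point of such hand-written rules is efficiency: the backward pass costs only one extra solve with Aᵀ, rather than differentiating through the internals of the factorization.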

