Location: Math Department, Niels Henrik Abels hus, 8th floor
Speaker: Hans J. Skaug, University of Bergen (NOR)
Title: What exact derivatives can do for statisticians
Abstract: Exact numerical derivatives are currently powering the deep learning revolution in machine learning. The underlying algorithm, called backpropagation, is an efficient way of calculating the gradient of the objective function with respect to the model parameters. The same algorithm can be applied repeatedly to obtain first- and higher-order derivatives of any computer program in an automatic manner, and is then often referred to as automatic differentiation (AD). I will give an overview of statistical methods that are well suited for AD. Among these are Laplace and saddlepoint approximations, the modified profile likelihood, Hamiltonian Monte Carlo, and Fisher information matrices. My main point is that AD makes these methods much more easily accessible to a broad audience. I will show examples implemented in the software system TMB (https://github.com/kaskr/adcomp).
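To give a flavour of the idea, the following is a minimal sketch of automatic differentiation using dual numbers (forward mode). It is an illustration only, not the author's method: backpropagation and TMB use reverse-mode AD implemented in C++, but the key property is the same, namely that derivatives come out exact (to machine precision) rather than from finite differences.

```python
class Dual:
    """A number carrying both a value and its derivative (a 'dual number')."""

    def __init__(self, val, der=0.0):
        self.val = val
        self.der = der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)

    __rmul__ = __mul__


def derivative(f, x):
    """Exact derivative of f at the point x, via a dual number with der=1."""
    return f(Dual(x, 1.0)).der


# Example: f(x) = x^2 + 3x has f'(x) = 2x + 3, so f'(2) = 7.
print(derivative(lambda x: x * x + 3 * x, 2.0))  # 7.0
```

Applying such machinery repeatedly yields higher-order derivatives, which is what makes Laplace approximations and Fisher information matrices (both built from Hessians of the log-likelihood) natural targets for AD.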
Riccardo De Bin – firstname.lastname@example.org
Emanuele Gramuglia – email@example.com