Runa Eschenhagen

PhD student, University of Cambridge

re393 [AT] cam.ac.uk

About

I am a PhD student in the Machine Learning Group within the Computational and Biological Learning Lab at the University of Cambridge, supervised by Richard Turner. I am broadly interested in better understanding and improving deep learning. Currently, I work on neural network training algorithms and dynamics.

During my PhD, I have interned at Meta (FAIR) in New York, where I worked with Aaron Defazio and Hao-Jun Michael Shi. In my second year, I received the Qualcomm Innovation Fellowship.

Previously, I obtained an MSc in Machine Learning from the University of Tübingen and worked as a research assistant in the Methods of Machine Learning group led by Philipp Hennig. I received a BSc in Cognitive Science from the University of Osnabrück and spent my final year as an intern in the Approximate Bayesian Inference Team led by Emtiyaz Khan at RIKEN AIP in Tokyo.

Publications

* indicates equal contribution. Also, see my Google Scholar page.

Kronecker-factored Approximate Curvature for Linear Weight-Sharing Layers

Runa Eschenhagen

MSc Thesis, University of Tübingen, 2023

Natural Gradient Variational Inference for Continual Learning in Deep Neural Networks

Runa Eschenhagen

BSc Thesis, University of Osnabrück, 2019

Influence Functions for Scalable Data Attribution in Diffusion Models

Bruno Mlodozeniec, Runa Eschenhagen, Juhan Bae, Alexander Immer, David Krueger, Richard Turner

Preprint, 2024

Can We Remove the Square-Root in Adaptive Gradient Methods? A Second-Order Perspective

Wu Lin, Felix Dangel, Runa Eschenhagen, Juhan Bae, Richard E. Turner, Alireza Makhzani

ICML 2024

Structured Inverse-Free Natural Gradient: Memory-Efficient & Numerically-Stable KFAC for Large Neural Nets

Wu Lin*, Felix Dangel*, Runa Eschenhagen, Kirill Neklyudov, Agustinus Kristiadi, Richard E. Turner, Alireza Makhzani

ICML 2024

Kronecker-Factored Approximate Curvature for Modern Neural Network Architectures

Runa Eschenhagen, Alexander Immer, Richard E. Turner, Frank Schneider, Philipp Hennig

NeurIPS 2023 (spotlight)

Benchmarking Neural Network Training Algorithms

George E. Dahl*, Frank Schneider*, Zachary Nado*, Naman Agarwal*, Chandramouli Shama Sastry, Philipp Hennig, Sourabh Medapati, Runa Eschenhagen, Priya Kasimbeg, Daniel Suo, Juhan Bae, Justin Gilmer, Abel L. Peirson, Bilal Khan, Rohan Anil, Mike Rabbat, Shankar Krishnan, Daniel Snider, Ehsan Amid, Kongtao Chen, Chris J. Maddison, Rakshith Vasudev, Michal Badura, Ankush Garg, Peter Mattson

Preprint, 2023

Promises and Pitfalls of the Linearized Laplace in Bayesian Optimization

Agustinus Kristiadi, Alexander Immer, Runa Eschenhagen, Vincent Fortuin

AABI 2023

Posterior Refinement Improves Sample Efficiency in Bayesian Neural Networks

Agustinus Kristiadi, Runa Eschenhagen, Philipp Hennig

NeurIPS 2022

Approximate Bayesian Neural Operators: Uncertainty Quantification for Parametric PDEs

Emilia Magnani, Nicholas Krämer, Runa Eschenhagen, Lorenzo Rosasco, Philipp Hennig

Preprint, 2022

Mixtures of Laplace Approximations for Improved Post-Hoc Uncertainty in Deep Learning

Runa Eschenhagen, Erik Daxberger, Philipp Hennig, Agustinus Kristiadi

Bayesian Deep Learning Workshop, NeurIPS 2021

Laplace Redux—Effortless Bayesian Deep Learning

Erik Daxberger*, Agustinus Kristiadi*, Alexander Immer*, Runa Eschenhagen*, Matthias Bauer, Philipp Hennig

NeurIPS 2021

Continual Deep Learning by Functional Regularisation of Memorable Past

Pingbo Pan*, Siddharth Swaroop*, Alexander Immer, Runa Eschenhagen, Richard E. Turner, Mohammad Emtiyaz Khan

NeurIPS 2020

Practical Deep Learning with Bayesian Principles

Kazuki Osawa, Siddharth Swaroop*, Anirudh Jain*, Runa Eschenhagen, Richard E. Turner, Rio Yokota, Mohammad Emtiyaz Khan

NeurIPS 2019