Dec 8 – 10, 2025
CERN
Europe/Zurich timezone

Collapsing Taylor Mode Automatic Differentiation

Dec 8, 2025, 3:55 PM
20m
31/3-004 - IT Amphitheatre (CERN)

Contributed Talk (Contributed Talks)

Speaker

Tim Siebert (Humboldt-Universität zu Berlin and Zuse Institute Berlin, Berlin, Germany)

Description

Computing partial differential equation (PDE) operators via nested backpropagation is popular but expensive, which severely restricts their utility for scientific machine learning. Recent advances, such as the forward Laplacian and randomized Taylor mode automatic differentiation (AD), propose forward schemes to address this. We introduce an optimization technique for Taylor mode that "collapses" derivatives by rewriting the computational graph, and we demonstrate how to apply it to general linear PDE operators and to randomized Taylor mode. The modifications simply require propagating a sum up the computational graph, which could, or even should, be done by a machine learning compiler without exposing complexity to users. We implement our collapsing procedure and evaluate it on popular PDE operators, confirming that it accelerates Taylor mode and outperforms nested backpropagation.
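To illustrate the kind of forward scheme the abstract refers to, the following is a minimal sketch of computing a Laplacian with Taylor mode AD via JAX's `jax.experimental.jet`, rather than nested backpropagation. This is an illustration of plain Taylor mode only, not the authors' collapsing procedure; the helper names (`laplacian`, `second_directional_derivative`) are made up for the example.

```python
import jax.numpy as jnp
from jax.experimental import jet


def second_directional_derivative(f, x, v):
    """d^2/dt^2 f(x + t v) at t=0, via a second-order Taylor (jet) propagation."""
    # Propagate the path x(t) = x + v t (second-order term zero) through f.
    _, (_, d2y) = jet.jet(f, (x,), ((v, jnp.zeros_like(v)),))
    return d2y


def laplacian(f, x):
    """Trace of the Hessian of scalar-valued f at x, summed over basis directions."""
    basis = jnp.eye(x.shape[0])
    return sum(second_directional_derivative(f, x, e) for e in basis)


f = lambda z: jnp.sum(z ** 2)          # Laplacian of sum(z_i^2) in n dims is 2n
x = jnp.array([1.0, 2.0, 3.0])
print(float(laplacian(f, x)))          # 6.0
```

Each basis direction requires one forward Taylor propagation, so the cost grows with the input dimension; the collapsing and randomization techniques in the talk address exactly this kind of overhead.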

Authors

Felix Dangel (Vector Institute, Toronto, Canada)
Tim Siebert (Humboldt-Universität zu Berlin and Zuse Institute Berlin, Berlin, Germany)

Co-authors

Andrea Walther (Humboldt-Universität zu Berlin and Zuse Institute Berlin, Berlin, Germany)
Marius Zeinhofer (ETH Zurich, Switzerland)

Presentation materials