Speaker
Laurent Hascoet
(INRIA)
Description
After a detailed introduction to Automatic Differentiation (AD), we focus on source-transformation reverse AD, a remarkably efficient way to compute gradients. One cornerstone of reverse AD is data-flow reversal: the process of restoring the memory states of a computation in reverse order.
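As a minimal illustration of data-flow reversal (a sketch for this announcement, not material from the talk itself), the forward sweep below pushes each value about to be overwritten onto a stack, and the reverse sweep pops those values back to restore past memory states while propagating the adjoint:

```python
import math

def forward_and_reverse(x):
    """Reverse AD by data-flow reversal for f(x) = sin(x)**2,
    written as two overwriting assignments to a single variable v."""
    # Forward sweep: run the computation, saving each value of v
    # just before it is overwritten.
    stack = []
    v = x
    stack.append(v); v = math.sin(v)   # v overwritten: save old v
    stack.append(v); v = v * v         # v overwritten again
    y = v

    # Reverse sweep: restore states in reverse order while
    # propagating the adjoint vbar = dy/dv.
    vbar = 1.0
    v = stack.pop(); vbar = 2.0 * v * vbar      # adjoint of v = v * v
    v = stack.pop(); vbar = math.cos(v) * vbar  # adjoint of v = sin(v)
    return y, vbar
```

Here the returned `vbar` equals f'(x) = 2 sin(x) cos(x); the stack is exactly the storage cost that the talk's storage/recomputation trade-offs aim to control.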
While this is by no means cheap, we will present the most efficient storage/recomputation trade-offs that make data-flow reversal feasible for computation-intensive applications. AD is an active research field, and we will conclude with our guess at the most important future challenges.
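One classical storage/recomputation trade-off is checkpointing: instead of storing every intermediate state of a long time-stepping loop, store only a few checkpoints and recompute the segments between them during the reverse sweep. The recursive sketch below (an illustrative assumption, with a placeholder `step` function standing in for a real simulation) uses a binary splitting scheme, trading O(n) storage for O(log n) checkpoints plus recomputation:

```python
def step(state):
    # One forward time step (placeholder linear dynamics).
    return state * 1.01 + 0.1

def adjoint_step(state, abar):
    # Adjoint of one step: d(step)/d(state) = 1.01 here.
    return 1.01 * abar

def reverse_with_checkpoints(s0, n, abar):
    """Reverse-propagate the adjoint abar through n steps starting
    from state s0, storing only one checkpoint per recursion level
    (the midpoint state) and recomputing each half as needed."""
    if n == 0:
        return abar
    if n == 1:
        return adjoint_step(s0, abar)
    m = n // 2
    sm = s0
    for _ in range(m):                # recompute up to the midpoint
        sm = step(sm)
    abar = reverse_with_checkpoints(sm, n - m, abar)  # reverse 2nd half
    return reverse_with_checkpoints(s0, m, abar)      # reverse 1st half
```

Since the dynamics are linear with slope 1.01, the adjoint of n steps is simply 1.01**n, which makes the scheme easy to check; real schedules (e.g. binomial checkpointing) refine this idea to provably optimal trade-offs.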