Derivatives play a critical role in science. Techniques that differentiate code automatically and efficiently can dramatically reduce runtime in applications ranging from machine learning to Monte Carlo simulation. However, implementing efficient automatic differentiation (AD) in a high-performance programming language is not easy: such languages have rules that are too numerous, too complex, and too fast-evolving for a custom parser to handle, yet existing implementations rely on custom parsers or other ad hoc language facilities.
This mini-workshop aims to discuss new approaches to flexible, scalable and efficient techniques for AD and their application to data-intensive science domains.
Organized by Marco Foco (NVIDIA), William Moses (MIT), Vassil Vassilev (Princeton), and David Lange (Princeton).