Presentation

Synthesizing Gradients for Parallel Paradigms Using Enzyme in Julia
Description
Automatic differentiation (AD), i.e., the augmentation of a program to compute derivatives algorithmically, is a powerful tool in scientific computing, with a wide range of applications such as sensitivity analysis, uncertainty quantification, shape optimization, and machine learning. Automatic differentiation can be implemented at different layers of the programming stack; common approaches include operator overloading and source-to-source transformation. Enzyme is a framework for compiler-aided automatic differentiation that performs AD directly on LLVM's intermediate representation. AD over parallel programs brings with it a host of challenges, and in this talk I will discuss how Enzyme can be used in Julia to synthesize gradients, in particular gradients of code that uses parallel paradigms such as tasks, MPI, or GPU programming.
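To give a flavor of the usage described above, here is a minimal sketch of reverse-mode differentiation with Enzyme.jl; the functions f and g and the sample values are illustrative, and the exact autodiff signature may differ between Enzyme.jl versions:

    using Enzyme

    # Scalar example: reverse-mode derivative of f at x = 2.0.
    f(x) = x^2 + 3x
    df, = autodiff(Reverse, f, Active, Active(2.0))[1]   # df == 7.0

    # Array example: the gradient of g is accumulated into the shadow dx.
    g(x) = sum(abs2, x)
    x  = [1.0, 2.0, 3.0]
    dx = zeros(length(x))
    autodiff(Reverse, g, Active, Duplicated(x, dx))
    # dx now holds 2 .* x == [2.0, 4.0, 6.0]

The same Active/Duplicated annotations carry over to code that uses the parallel paradigms mentioned in the abstract, such as tasks, MPI, or GPU kernels, which is the focus of the presentation.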
Time
Wednesday, June 28, 12:00 - 12:30 CEST
Location
Flüela
Event Type
Minisymposium
Domains
Computer Science, Machine Learning, and Applied Mathematics