Submitted by Josh Stevens on Wed, 14/05/2025 - 10:47
Derivatives are at the core of scientific computing: from the Jacobian matrices used in nonlinear solvers to the gradient vectors used in optimisation methods; from the back-propagation operator in machine learning (ML) to the Hessian matrices used in uncertainty quantification methods. Automatic differentiation (AD) - also known as Differentiable Programming or Algorithmic Differentiation - is the name given to technologies that facilitate computing such derivatives from code, without the need for hand derivations. AD was first discovered in the 1950s and has experienced surges and lulls in popularity in the decades since, but is currently particularly popular thanks to its relevance to ML and artificial intelligence.
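To make that concrete, here is a minimal sketch (using PyTorch's autograd, which crops up again below) of what "computing derivatives from code" looks like in practice: the derivative is obtained directly from the function's implementation, with no hand-written derivative anywhere.

```python
import torch

# f(x) = x**3 + sin(x); its derivative 3*x**2 + cos(x) is never written out by hand
x = torch.tensor(2.0, requires_grad=True)
y = x**3 + torch.sin(x)

y.backward()        # reverse-mode AD propagates dy/dx back to x
print(x.grad)       # ~11.58, which matches 3*x**2 + cos(x) at x = 2
```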
In early April 2025, I caught the train to Kaiserslautern, Germany, for the 27th European Workshop on Automatic Differentiation (EuroAD). The workshop was a small, informal event, where participants were encouraged to present work-in-progress and to discuss ideas. I took this opportunity to share ICCS's current progress on facilitating online training in FTorch, the Fortran interface to the popular ML package PyTorch. As part of this work, we are exposing the functionality of the autograd AD tool from PyTorch's C++ backend. In doing so, we enable differentiable programming in Fortran, with potential impact beyond ML, such as sensitivity analysis experiments.
The workshop featured a wide range of talks, covering new approaches and tools, integration efforts, and applications in various scientific and industrial domains. We heard about recent developments of the pioneering Tapenade AD tool from its lead developer, Laurent Hascoet. Tapenade is the best-known example of a source transformation tool for AD, which analyses and rewrites source code statically rather than following the more common operator overloading approach. Jean-Luc Bouchot, who has since taken over as lead developer following Laurent's retirement, presented some preliminary investigations into extending Tapenade to Julia in addition to C and Fortran.
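For readers unfamiliar with the distinction, here is a generic sketch of the operator overloading style in Python (purely illustrative; it is not Tapenade's output and not how Tapenade works internally). A custom number type carries a derivative alongside its value through each arithmetic operation, whereas a source transformation tool like Tapenade instead analyses the original source and emits new source code for the derivative.

```python
# Operator overloading: a dual-number type applies the sum/product rules
# as the original program executes.
class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule, evaluated on the fly
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)
    __rmul__ = __mul__

def f(x):
    return 3 * x * x + x

x = Dual(2.0, 1.0)            # seed dx/dx = 1
print(f(x).val, f(x).der)     # 14.0 13.0, since f'(x) = 6x + 1

# A source transformation tool would instead generate a new routine from static
# analysis of f's source, roughly: f_d(x, xd) returning 6*x*xd + xd.
```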
As well as making lots of new connections in the AD research field, EuroAD was an opportunity to meet up with former colleagues from my time interning at Argonne National Laboratory. Sri Hari Krishna Narayanan gave a talk entitled Parametric Sensitivities of a Wind-driven Baroclinic Ocean using Neural Surrogates, which was very relevant to the work we do in ICCS, while Paul Hovland presented on advanced methods for accelerating one part of the AD process.
To summarise what I learnt at EuroAD this year: AD continues to be an active research area. Researchers are developing more efficient strategies for computing derivatives (in particular Hessians and higher-order derivatives), while software engineers are writing efficient implementations and bringing differentiability to existing models. Differentiable models bring many opportunities for scientific investigation that would greatly benefit the climate modelling community. However, differentiability can be challenging to introduce into existing models, especially large-scale models written in Fortran. This is where, we hope, FTorch can play a part.
---
The workshop also provided a great opportunity to discuss how best to run the upcoming Differentiable Programming course at the 2025 ICCS Summer School. Whilst we expect that Python will be the language most attendees are comfortable with, the majority of AD tools are designed for C, Fortran, and Julia. Moreover, the AD tools that do exist for Python are mostly designed specifically for ML and feel somewhat like "black boxes", in that they do not clearly illustrate the AD process. Following many such discussions, we decided that the best approach is to introduce AD with Tapenade (using only very basic C or Fortran syntax) in the first session and then to use PyTorch's autograd tool for the more advanced material in the second session.
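To give a flavour of the kind of material we have in mind for the second session (a hedged sketch only; the actual course notebooks may differ), PyTorch's autograd can be driven explicitly via torch.autograd.grad, which exposes a little more of the AD machinery than simply calling .backward() on a loss:

```python
import torch

# Differentiate y = exp(-x**2) twice using explicit autograd calls.
x = torch.tensor(1.5, requires_grad=True)
y = torch.exp(-x**2)

(dy_dx,) = torch.autograd.grad(y, x, create_graph=True)   # first derivative
(d2y_dx2,) = torch.autograd.grad(dy_dx, x)                 # second derivative

print(dy_dx.item(), d2y_dx2.item())
# Analytically: dy/dx = -2x * exp(-x**2) and d2y/dx2 = (4x**2 - 2) * exp(-x**2).
```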
If you'll be attending this year's summer school as a VESRI partner or Cambridge student and are interested in getting a better understanding of what happens under the hood of ML models, extending existing code bases to conduct sensitivity analysis experiments, or just playing around with different tools and languages, we look forward to seeing you at the Differentiable Programming course this summer!