Optimizing Hardware Capacity: Using Automatic Differentiation to Efficiently Compute Derivatives in Parallel Programming Models


A technical paper titled "Scalable Automatic Differentiation of Multiple Parallel Paradigms through Compiler Augmentation" was published by researchers at MIT CSAIL, Argonne National Laboratory, and TU Munich. The paper was a Best Paper finalist and a Best Student Paper winner at SC22 (Supercomputing 2022). Find the technical paper here. Published November 2022. The work "demonstrates how Enzyme opti..."
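For readers unfamiliar with Enzyme, the sketch below illustrates its basic C interface for reverse-mode differentiation of a sequential function. This is a generic example drawn from Enzyme's public documentation, not code from the paper, and it assumes the Enzyme LLVM/Clang plugin is loaded at compile time; the paper's contribution is extending this compiler-level approach to parallel paradigms.

```c
#include <stdio.h>

/* A function whose derivative we want: f(x) = x * x. */
double square(double x) { return x * x; }

/* Enzyme's autodiff entry point; the symbol is recognized and replaced
   by the Enzyme compiler plugin, so no library implementation is linked. */
extern double __enzyme_autodiff(void *, double);

int main(void) {
    double x = 3.0;
    /* Reverse-mode AD of square at x: expected 2 * x = 6.0. */
    double dfdx = __enzyme_autodiff((void *)square, x);
    printf("f(%.1f) = %.1f, f'(%.1f) = %.1f\n", x, square(x), x, dfdx);
    return 0;
}
```

Compiled with, for example, `clang example.c -O2 -fplugin=/path/to/ClangEnzyme-<version>.so` (path and version are placeholders), the plugin synthesizes the gradient code during LLVM optimization rather than through a runtime library.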