AstroAI Lunch Talks - October 21, 2024 - Kangning Diao & Matthew O'Callaghan
21 Oct 2024 - Joshua Wing
The video can be found here: https://www.youtube.com/watch?v=ihjP4l0xzTw
Speaker 1: Kangning Diao
Title: synax: A Differentiable and GPU-accelerated Synchrotron Simulation Package
Abstract: We introduce synax, a novel library for automatically differentiable simulation of Galactic synchrotron emission. Built on the JAX framework, synax leverages JAX’s capabilities, including batch acceleration, just-in-time compilation, and hardware-specific optimizations (CPU, GPU, TPU). Crucially, synax uses JAX’s automatic differentiation (AD) mechanism, enabling precise computation of derivatives with respect to any model parameters. This feature facilitates powerful inference algorithms, such as Hamiltonian Monte Carlo (HMC) and gradient-based optimization, which enable inference over models that would otherwise be computationally prohibitive. In its initial release, synax supports synchrotron intensity and polarization calculations down to GHz frequencies, alongside several models of the Galactic magnetic field (GMF), cosmic ray (CR) spectra, and thermal electron density fields. We demonstrate the transformative potential of AD for tasks involving full posterior inference using gradient-based techniques or Maximum Likelihood Estimation (MLE) optimization. Notably, GPU acceleration yields a twenty-fold gain in efficiency, while HMC achieves a two-fold improvement over standard random walk Metropolis-Hastings (RWMH) when performing inference over a four-parameter test model. HMC remains effective on a more complex, 16-parameter model on which RWMH fails to converge. Additionally, we showcase the application of synax in optimizing the GMF based on the Haslam 408 MHz map, achieving residuals with a standard deviation below 1 K.
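To give a flavor of what JAX's automatic differentiation provides here, the sketch below differentiates a toy line-of-sight synchrotron intensity integral with respect to its parameters. The emissivity model and the parameter names (B0, s, n0) are illustrative assumptions for this post, not synax's actual API.

```python
# Minimal JAX sketch of the idea behind a differentiable synchrotron forward
# model: gradients with respect to physical parameters come for free via AD.
# The emissivity model and parameter names are illustrative, not synax's API.
import jax
import jax.numpy as jnp

def toy_synchrotron_intensity(params, los):
    """Toy LOS integral: intensity ~ sum of n_cr * B_perp^((s+1)/2) along the path."""
    B_perp = params["B0"] * jnp.exp(-los)        # toy transverse-field profile
    n_cr = params["n0"] * jnp.ones_like(los)     # uniform CR density
    emissivity = n_cr * B_perp ** ((params["s"] + 1.0) / 2.0)
    dl = los[1] - los[0]
    return jnp.sum(emissivity) * dl              # Riemann sum along the LOS

params = {"B0": 3.0, "s": 2.7, "n0": 1.0}        # hypothetical parameter names
los = jnp.linspace(0.0, 5.0, 256)                # samples along the line of sight

# jax.jit compiles the model for CPU/GPU/TPU; jax.grad differentiates the
# whole simulation with respect to every parameter in the dict at once.
intensity = jax.jit(toy_synchrotron_intensity)(params, los)
grads = jax.grad(toy_synchrotron_intensity)(params, los)
print(intensity)   # forward-model value
print(grads)       # d(intensity)/dB0, d/ds, d/dn0 via AD
```

Because the gradient comes from AD rather than finite differences, the same pattern scales to higher-dimensional models like the 16-parameter case mentioned in the abstract, and the gradients are exactly what HMC and gradient-based optimizers consume.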
Speaker 2: Matthew O’Callaghan
Title: Hamiltonian Monte Carlo with Normalizing Flow Priors
Abstract: Complex, data-driven priors are of paramount interest to the astronomical community. Bayesian inference involves selecting a prior distribution over parameters, a likelihood function for the observed data, and an appropriate inference algorithm. Hamiltonian Monte Carlo (HMC) has emerged as an efficient Markov Chain Monte Carlo (MCMC) algorithm, improving posterior sampling by simulating Hamiltonian dynamics in the proposal step, which leads to faster convergence than traditional methods. Normalizing flows (NFs) are generative models that, under certain architectural assumptions, can serve as universal density approximators, enabling flexible modeling of complex distributions. In this talk, we begin with a brief introduction to HMC and NFs, then investigate the conditions necessary for implementing NF priors in an HMC inference algorithm.
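As a rough illustration of the combination, the sketch below runs a hand-rolled HMC sampler whose target density combines a Gaussian likelihood with a "flow" prior reduced to a single affine bijection, so the change-of-variables log-density can be checked by hand. Everything here (the one-layer flow, the step size, the observed value) is a simplifying assumption for illustration, not the speaker's implementation.

```python
# HMC targeting log posterior = log likelihood + log prior, where the prior's
# log-density comes from a (trivial, one-layer) normalizing flow.
import jax
import jax.numpy as jnp

def flow_log_prior(x, shift=1.0, log_scale=0.5):
    """log p(x) for x = shift + exp(log_scale) * z with base z ~ N(0, 1).

    Change of variables: log p(x) = log N(z; 0, 1) - log_scale,
    where z = (x - shift) / exp(log_scale). A real NF stacks learned bijections.
    """
    z = (x - shift) * jnp.exp(-log_scale)
    return -0.5 * (z ** 2 + jnp.log(2.0 * jnp.pi)) - log_scale

def log_posterior(x, y_obs=2.0, noise=0.3):
    # Gaussian likelihood around a toy observation, plus the flow prior.
    log_like = -0.5 * ((y_obs - x) / noise) ** 2
    return jnp.sum(log_like + flow_log_prior(x))

grad_logp = jax.grad(log_posterior)

def hmc_step(key, x, step_size=0.05, n_leapfrog=20):
    """One HMC transition: sample momentum, leapfrog, Metropolis correction."""
    key_p, key_u = jax.random.split(key)
    p = jax.random.normal(key_p, x.shape)
    # Leapfrog: half momentum step, alternating full steps, closing half step.
    x_new, p_new = x, p + 0.5 * step_size * grad_logp(x)
    for _ in range(n_leapfrog):
        x_new = x_new + step_size * p_new
        p_new = p_new + step_size * grad_logp(x_new)
    p_new = p_new - 0.5 * step_size * grad_logp(x_new)
    # Accept with probability min(1, exp(H_old - H_new)).
    h_old = -log_posterior(x) + 0.5 * jnp.sum(p ** 2)
    h_new = -log_posterior(x_new) + 0.5 * jnp.sum(p_new ** 2)
    accept = jax.random.uniform(key_u) < jnp.exp(h_old - h_new)
    return jnp.where(accept, x_new, x)

key = jax.random.PRNGKey(0)
x = jnp.zeros(1)
for _ in range(500):
    key, sub = jax.random.split(key)
    x = hmc_step(sub, x)
print("final sample:", x)
```

A learned NF would replace flow_log_prior with the trained flow's log-density; as long as that log-density is differentiable, jax.grad supplies the gradients the leapfrog integrator needs, which is precisely the coupling the talk examines.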
Watch the talk below!