search_query=cat:astro-ph.*+AND+lastUpdatedDate:[202604022000+TO+202604082000]&start=0&max_results=5000

New astro-ph.* submissions cross-listed on cs.LG, cs.AI, physics.data-an, or stat.*, starting 202604022000 and ending 202604082000

Feed last updated: 2026-04-08T05:42:20Z

Replacing Gaussian Processes with Neural Networks in Pulsar Timing Array Inference of the Gravitational-Wave Background

Authors: Shreyas Tiruvaskar, Chris Gordon
Comments: 14 pages, 9 figures
Primary Category: astro-ph.CO
All Categories: astro-ph.CO, physics.data-an

Bayesian inference of nanohertz gravitational-wave background models in pulsar timing array analyses often relies on Gaussian-process interpolators to avoid repeated, computationally expensive strain-spectrum calculations. However, Gaussian-process training becomes a bottleneck for large training sets. We test whether probabilistic neural networks can replace Gaussian processes in this role for both a self-interacting dark matter model and a phenomenological environmental model. We find that neural networks recover consistent posteriors while significantly reducing both training and Markov chain Monte Carlo runtime, with the largest gains for the more computationally demanding model.
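The role the Gaussian process plays here is that of a fast interpolator trained on a handful of expensive strain-spectrum evaluations. A minimal toy sketch of that interpolator, in pure NumPy with an RBF kernel (the spectrum function, kernel length scale, and grid are all illustrative stand-ins, not the paper's actual models), shows where the training bottleneck the abstract mentions comes from: the O(n³) linear solve against the kernel matrix, whose cost is exactly what a trained neural-network surrogate's cheap forward pass avoids.

```python
import numpy as np

# Toy stand-in for an expensive strain-spectrum calculation
# (hypothetical; the paper's actual SIDM / environmental models differ).
def log_strain_spectrum(theta):
    return np.sin(3.0 * theta) + 0.5 * theta

# Squared-exponential (RBF) kernel between two 1-D parameter arrays.
def rbf_kernel(a, b, ell=0.3):
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

rng = np.random.default_rng(0)
theta_train = np.sort(rng.uniform(0.0, 2.0, 40))   # expensive model evaluations
y_train = log_strain_spectrum(theta_train)

# GP training: an O(n^3) solve against the kernel matrix -- this is the
# step that becomes the bottleneck as the training set grows.
K = rbf_kernel(theta_train, theta_train) + 1e-8 * np.eye(40)
alpha = np.linalg.solve(K, y_train)

def gp_predict(theta_test):
    """GP posterior mean: the cheap interpolator used inside MCMC."""
    return rbf_kernel(theta_test, theta_train) @ alpha

theta_test = np.array([0.5, 1.5])
err = np.abs(gp_predict(theta_test) - log_strain_spectrum(theta_test)).max()
print(f"max interpolation error: {err:.2e}")
```

A probabilistic neural network plays the same role as `gp_predict` but amortizes the cost: training scales with gradient steps rather than a cubic solve, and each likelihood call in the sampler is a single forward pass.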


FluxMC: Rapid and High-Fidelity Inference for Space-Based Gravitational-Wave Observations

Authors: Bo Liang, Chang Liu, Hanlin Song, Tianyu Zhao, Minghui Du, He Wang, Haohao Gu, Sensen He, Yuxiang Xu, Wei-Liang Qian, Li-e Qiang, Peng Xu, Ziren Luo, Mingming Sun
Comments: No comment found
Primary Category: astro-ph.IM
All Categories: astro-ph.IM, gr-qc, physics.data-an

Bayesian inference in the physical sciences faces a fundamental challenge: the imperative for high-fidelity physical modeling often clashes with the intrinsic limitations of stochastic sampling algorithms. Complex, high-dimensional parameter spaces expose the universal vulnerability of conventional methods, e.g., Markov Chain Monte Carlo (MCMC), which struggle with the prohibitive costs of likelihood evaluations and the risk of entrapment in local optima. To resolve this impasse, we introduce FluxMC (Flow-guided Unbiased eXploration Monte Carlo), a machine learning-enhanced framework designed to shift the inference paradigm from blind local search to globally guided transport. It integrates Flow Matching with Parallel Tempering MCMC, effectively combining the global foresight of generative AI with the rigorous asymptotic convergence and local robustness of temperature-based sampling. We showcase the efficacy of this framework through the lens of space-based gravitational-wave (GW) astronomy -- a field representing the frontier of challenging parameter inversion. In the analysis of massive black hole binaries using high-fidelity waveforms (IMRPhenomHM), FluxMC achieves robust convergence in under five hours, whereas traditional Parallel Tempering MCMC fails to converge even after hundreds of hours, yielding high Jensen-Shannon divergences (JSD) of $O(10^{-1})$. Our method reduces the distributional error by two to three orders of magnitude. Furthermore, for computationally efficient models (IMRPhenomD), it eliminates systematic biases caused by local-optima entrapment. Ultimately, FluxMC removes the necessity to compromise between model accuracy and analysis speed, establishing a new computational foundation where scientific discovery is limited only by observational data quality, not by algorithmic capacity.
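The core idea the abstract describes, combining a generative model's global proposals with the asymptotic guarantees of tempered local sampling, can be illustrated with a deliberately simplified toy. In the sketch below (not FluxMC's implementation; all names and settings are hypothetical), the trained flow is replaced by a fixed broad Gaussian "guide", and a two-temperature Metropolis sampler on a bimodal target mixes occasional flow-guided independence proposals with local random-walk moves and temperature swaps. Plain random-walk MCMC would tend to get trapped in one mode of this target; the global proposals let the cold chain hop between modes.

```python
import numpy as np

# Bimodal target: the kind of posterior where local MCMC gets trapped.
def log_target(x):
    return np.logaddexp(-0.5 * (x - 4.0) ** 2, -0.5 * (x + 4.0) ** 2)

rng = np.random.default_rng(1)
temps = [1.0, 8.0]               # cold chain + hot chain
x = np.zeros(len(temps))         # current state of each chain
samples = []

# Stand-in for the trained flow: a fixed broad Gaussian covering both modes.
def guide_sample():
    return rng.normal(0.0, 6.0)

def guide_logpdf(v):
    return -0.5 * (v / 6.0) ** 2 - np.log(6.0 * np.sqrt(2.0 * np.pi))

for step in range(20000):
    for i, T in enumerate(temps):
        if rng.random() < 0.2:
            # Global, "flow-guided" independence proposal.
            prop = guide_sample()
            log_acc = (log_target(prop) - log_target(x[i])) / T \
                      + guide_logpdf(x[i]) - guide_logpdf(prop)
        else:
            # Local random-walk move.
            prop = x[i] + rng.normal(0.0, 0.5)
            log_acc = (log_target(prop) - log_target(x[i])) / T
        if np.log(rng.random()) < log_acc:
            x[i] = prop
    # Standard parallel-tempering swap between adjacent temperatures.
    log_swap = (1.0 / temps[0] - 1.0 / temps[1]) \
               * (log_target(x[1]) - log_target(x[0]))
    if np.log(rng.random()) < log_swap:
        x[0], x[1] = x[1], x[0]
    samples.append(x[0])

samples = np.array(samples[2000:])
# Symmetric target: both modes should be visited about equally often.
print(f"mean = {samples.mean():.2f}, frac > 0 = {(samples > 0).mean():.2f}")
```

In FluxMC the guide is a Flow Matching model trained to approximate the posterior, so its independence proposals are far more targeted than this fixed Gaussian; the Metropolis corrections shown above are what preserve exactness regardless of how imperfect the guide is.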