search_query=cat:astro-ph.*+AND+lastUpdatedDate:[202512042000+TO+202512102000]&start=0&max_results=5000

New astro-ph.* submissions cross-listed on cs.LG, stat.*, cs.AI, or physics.data-an, starting 202512042000 and ending 202512102000

Feed last updated: 2025-12-10T04:24:43Z

Generalized tension metrics for multiple cosmological datasets

Authors: Matías Leizerovich, Susana J. Landau, Claudia G. Scóccola
Comments: 6 pages, 5 figures, 2 tables
Primary Category: astro-ph.CO
All Categories: astro-ph.CO, astro-ph.IM, hep-ex, hep-ph, physics.data-an

We introduce a novel estimator to quantify statistical tensions among multiple cosmological datasets simultaneously. This estimator generalizes the Difference-in-Means statistic, $Q_{\rm DM}$, to the multi-dataset regime. Our framework enables the detection of dominant tension directions in the shared parameter space. It further provides a geometric interpretation of the tension for the two- and three-dataset cases in two dimensions. According to this approach, the previously reported increase in tension between DESI and Planck from $1.9\sigma$ (DR1) to $2.3\sigma$ (DR2) is reinterpreted as a more modest shift from $1.18\sigma^{\rm eff}$ (DR1) to $1.45\sigma^{\rm eff}$ (DR2). These new tools may also prove valuable across research fields where dataset discrepancies arise.
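The two-dataset Difference-in-Means statistic that the paper generalizes is easy to illustrate in the Gaussian approximation: $Q_{\rm DM}$ is the parameter-mean difference weighted by the summed covariances, chi-square distributed under agreement, which can then be converted to an effective number of sigmas. A minimal sketch with toy numbers (not the paper's multi-dataset estimator, and not the actual DESI/Planck posteriors):

```python
import numpy as np
from scipy import stats

def q_dm(mu1, cov1, mu2, cov2):
    """Difference-in-Means statistic for two Gaussian posteriors:
    Q_DM = (mu1 - mu2)^T (C1 + C2)^{-1} (mu1 - mu2),
    chi-square distributed with dim(mu) degrees of freedom if the
    datasets agree."""
    d = np.asarray(mu1) - np.asarray(mu2)
    return float(d @ np.linalg.solve(np.asarray(cov1) + np.asarray(cov2), d))

def effective_sigma(q, ndof):
    """Convert Q_DM into an equivalent two-tailed Gaussian 'number of sigmas'."""
    p = stats.chi2.sf(q, df=ndof)   # probability to exceed Q_DM by chance
    return float(stats.norm.isf(p / 2.0))

# Toy 2-parameter example (hypothetical means/covariances)
mu_a, cov_a = np.array([0.31, 0.81]), np.diag([0.010**2, 0.020**2])
mu_b, cov_b = np.array([0.33, 0.78]), np.diag([0.015**2, 0.025**2])
q = q_dm(mu_a, cov_a, mu_b, cov_b)
print(effective_sigma(q, ndof=2))
```

The conversion through the chi-square survival function is what makes the quoted "effective sigma" values smaller than a naive one-dimensional reading of the same shift.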


Magnetic activity of ultracool dwarfs in the LAMOST DR11

Authors: Yue Xiang, Shenghong Gu, Dongtao Cao
Comments: 13 pages, 10 figures, accepted for publication in ApJ
Primary Category: astro-ph.SR
All Categories: astro-ph.SR, astro-ph.IM, cs.LG

Ultracool dwarfs comprise the lowest-mass stars and brown dwarfs. Their interiors are fully convective, unlike those of the partly convective Sun-like stars. The magnetic-field generation process beneath the surface of ultracool dwarfs is still poorly understood and controversial. To significantly enlarge the sample of active ultracool dwarfs, we have identified 962 ultracool dwarfs in the latest LAMOST data release, DR11. We also simulate Chinese Space Station Survey Telescope (CSST) low-resolution slitless spectra by degrading the LAMOST spectra. A semi-supervised machine-learning approach built on an autoencoder model is used to identify ultracool dwarfs in the simulated CSST spectra, demonstrating the capability of the CSST all-sky slitless spectroscopic survey to detect ultracool dwarfs. The magnetic activity of the ultracool dwarfs is investigated using the H$\alpha$ line emission as a proxy. Rotational periods of 82 ultracool dwarfs are derived from Kepler/K2 light curves. We also derive the activity-rotation relation of the ultracool dwarfs, which saturates around a Rossby number of 0.12.
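The degradation step described above, turning higher-resolution LAMOST spectra into CSST-like low-resolution slitless spectra, typically amounts to convolving with a broader line-spread function and rebinning. A minimal sketch (the Gaussian LSF width and rebin factor are illustrative placeholders, not the actual LAMOST/CSST instrument parameters):

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d

def degrade_spectrum(flux, sigma_pix=3.0, rebin=4):
    """Mimic a low-resolution slitless spectrum: convolve the input flux
    with a Gaussian line-spread function (sigma_pix, in pixels, is an
    assumed placeholder), then rebin by averaging blocks of `rebin`
    pixels. Mean flux is preserved; narrow features are washed out."""
    smoothed = gaussian_filter1d(np.asarray(flux, dtype=float), sigma_pix)
    n = (len(smoothed) // rebin) * rebin
    return smoothed[:n].reshape(-1, rebin).mean(axis=1)

# A narrow emission line on a flat continuum is broadened and diluted.
flux = np.ones(400)
flux[200] += 10.0
degraded = degrade_spectrum(flux)
```

A real pipeline would set the kernel width from the two instruments' resolving powers and resample onto the target wavelength grid rather than simple block averaging.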


Exoplanet formation inference using conditional invertible neural networks

Authors: Remo Burn, Victor F. Ksoll, Hubert Klahr, Thomas Henning
Comments: 10 pages, accepted poster for the Machine Learning and the Physical Sciences Workshop at the 39th conference on Neural Information Processing Systems (NeurIPS 2025)
Primary Category: astro-ph.EP
All Categories: astro-ph.EP, cs.NE, physics.data-an

The origin of observed exoplanets is usually interpreted only qualitatively because of uncertainties in key parameters of planet formation models. To enable a quantitative methodology that traces planets back in time to their birth locations, we train recently developed conditional invertible neural networks (cINNs) on synthetic data from a global planet formation model that tracks growth from dust grains to evolved final giant planets. In addition to deterministic single-planet formation runs, we also include gravitationally interacting planets in multiplanetary systems, which introduce some measure of chaos. For the latter case, we either treat them as individual planets or choose the two or three planets most likely to be discovered by telescopes. We find that training on multiplanetary data, with each planet treated as an individual point, is promising. The single-planet data cover only a small range of planets and do not extrapolate well to planet properties not included in the training data. Extension to planetary systems will require more training data because of the higher dimensionality of the problem.
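The building block behind invertible networks like cINNs is the coupling layer: half of the variables are transformed conditioned on the other half, so the map can be inverted exactly, which is what lets a trained network run "backwards" from observed planet properties toward formation parameters. A minimal RealNVP-style sketch with random, untrained weights (a real cINN trains these and additionally conditions on the observations):

```python
import numpy as np

rng = np.random.default_rng(0)

class AffineCoupling:
    """One affine coupling block: split the input in half, rescale and
    shift the second half using functions of the first. The forward map
    is invertible in closed form, no matter what the weights are.
    Weights here are random placeholders, not trained values."""
    def __init__(self, dim):
        self.w_s = rng.normal(scale=0.1, size=(dim // 2, dim // 2))
        self.w_t = rng.normal(scale=0.1, size=(dim // 2, dim // 2))

    def forward(self, x):
        x1, x2 = np.split(x, 2)
        s, t = np.tanh(x1 @ self.w_s), x1 @ self.w_t
        return np.concatenate([x1, x2 * np.exp(s) + t])

    def inverse(self, y):
        y1, y2 = np.split(y, 2)
        s, t = np.tanh(y1 @ self.w_s), y1 @ self.w_t
        return np.concatenate([y1, (y2 - t) * np.exp(-s)])

layer = AffineCoupling(dim=4)
x = rng.normal(size=4)
assert np.allclose(layer.inverse(layer.forward(x)), x)  # exact round trip
```

Stacking many such blocks with permutations between them gives an expressive yet exactly invertible map, and sampling the latent side of the inverse yields posterior samples over formation parameters.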


Masked Autoencoder Pretraining on Strong-Lensing Images for Joint Dark-Matter Model Classification and Super-Resolution

Authors: Achmad Ardani Prasha, Clavino Ourizqi Rachmadi, Muhamad Fauzan Ibnu Syahlan, Naufal Rahfi Anugerah, Nanda Garin Raditya, Putri Amelia, Sabrina Laila Mutiara, Hilman Syachr Ramadhan
Comments: 21 pages, 7 figures, 3 tables
Primary Category: cs.CV
All Categories: cs.CV, astro-ph.CO, astro-ph.IM, cs.AI, cs.LG

Strong gravitational lensing can reveal the influence of dark-matter substructure in galaxies, but analyzing these effects from noisy, low-resolution images poses a significant challenge. In this work, we propose a masked autoencoder (MAE) pretraining strategy on simulated strong-lensing images from the DeepLense ML4SCI benchmark to learn generalizable representations for two downstream tasks: (i) classifying the underlying dark matter model (cold dark matter, axion-like, or no substructure) and (ii) enhancing low-resolution lensed images via super-resolution. We pretrain a Vision Transformer encoder using a masked image modeling objective, then fine-tune the encoder separately for each task. Our results show that MAE pretraining, when combined with appropriate mask ratio tuning, yields a shared encoder that matches or exceeds a ViT trained from scratch. Specifically, at a 90% mask ratio, the fine-tuned classifier achieves macro AUC of 0.968 and accuracy of 88.65%, compared to the scratch baseline (AUC 0.957, accuracy 82.46%). For super-resolution (16x16 to 64x64), the MAE-pretrained model reconstructs images with PSNR ~33 dB and SSIM 0.961, modestly improving over scratch training. We ablate the MAE mask ratio, revealing a consistent trade-off: higher mask ratios improve classification but slightly degrade reconstruction fidelity. Our findings demonstrate that MAE pretraining on physics-rich simulations provides a flexible, reusable encoder for multiple strong-lensing analysis tasks.
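The mask-ratio ablation above hinges on how MAE pretraining selects patches: a random subset is hidden, the encoder sees only the visible patches, and the decoder reconstructs the rest. A minimal sketch of the masking step (patch counts are illustrative; the abstract's 90% ratio leaves the encoder roughly 10% of patches):

```python
import numpy as np

def random_mask(num_patches, mask_ratio, rng):
    """Choose which patches to hide for MAE pretraining. Returns the
    sorted indices of kept (visible) patches and a boolean mask where
    True marks a patch the decoder must reconstruct."""
    num_keep = max(1, int(round(num_patches * (1.0 - mask_ratio))))
    perm = rng.permutation(num_patches)
    keep = np.sort(perm[:num_keep])
    mask = np.ones(num_patches, dtype=bool)
    mask[keep] = False
    return keep, mask

# A 64x64 image split into 8x8 patches has 64 patches; at a 0.9 mask
# ratio the encoder processes only 6 of them.
rng = np.random.default_rng(0)
keep, mask = random_mask(64, 0.9, rng)
print(len(keep), int(mask.sum()))  # → 6 58
```

Processing so few patches is what makes high-ratio MAE pretraining cheap, and it is consistent with the trade-off reported here: more aggressive masking strengthens the learned classification features while leaving less signal for pixel-faithful reconstruction.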