The Simons Observatory: assessing the impact of dust complexity on the recovery of primordial B-modes

Journal of Cosmology and Astroparticle Physics IOP Publishing 2025:11 (2025) 024

Authors:

Yiqi Liu, Susanna Azzoni, Susan E Clark, Brandon S Hensley, Léo Vacher, David Alonso, Carlo Baccigalupi, Michael L Brown, Alessandro Carones, Jens Chluba, Jo Dunkley, Carlos Hervías-Caimapo, Bradley R Johnson, Nicoletta Krachmalnicoff, Giuseppe Puglisi, Mathieu Remazeilles, Kevin Wolz

Abstract:

We investigate how dust foreground complexity can affect measurements of the tensor-to-scalar ratio, r, in the context of the Simons Observatory, using a cross-spectrum component separation analysis. Employing a suite of simulations with realistic Galactic dust emission, we find that spatial variation in the dust frequency spectrum, parametrized by βd, can bias the estimate of r when modeled with a low-order moment expansion. While this approach performs well across a broad range of dust complexity, the bias increases with more extreme spatial variation in the dust frequency spectrum, reaching as high as r ∼ 0.03 for simulations with no primordial tensors and a spatial dispersion of σ(βd) ≃ 0.3, the most extreme case considered, yet still consistent with current observational constraints. This bias is driven by frequency-dependent changes in the ℓ-dependence of the dust power spectrum, which can mimic a primordial tensor B-mode signal. Although low-order moment expansions fail to capture the full effect when the spatial variations of βd become large and highly non-Gaussian, our results show that extended parametric methods can still recover unbiased estimates of r under a wide range of dust complexities. We further find that the bias in r at the highest degrees of dust complexity is largely insensitive to the spatial structure of the dust amplitude and is instead dominated by spatial correlations between βd and the dust amplitude, particularly at higher orders. If βd does vary spatially at the highest levels investigated here, more flexible foreground models will be needed to achieve an unbiased constraint on r at the noise levels anticipated for the Simons Observatory.
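The moment-expansion idea summarized above can be illustrated with a toy power-law dust SED whose spectral index βd varies from pixel to pixel. This is a minimal sketch, not the paper's actual pipeline: the frequency bands, pivot frequency, and dispersion values below are illustrative assumptions. For a Gaussian spread of βd, the sky-averaged SED departs from a single power law, and a second-order moment expansion captures most of the departure:

```python
import numpy as np

rng = np.random.default_rng(0)

nu0 = 353.0                                  # pivot frequency in GHz (illustrative)
nu = np.array([93.0, 145.0, 225.0, 280.0])   # illustrative frequency bands in GHz
beta_mean, sigma_beta = 1.54, 0.3            # mean and spatial dispersion of beta_d

# Draw a Gaussian spatial distribution of spectral indices over "pixels"
beta = rng.normal(beta_mean, sigma_beta, size=200_000)
x = np.log(nu / nu0)

# Exact sky-averaged power-law SED: < (nu/nu0)^beta > over pixels
sed_exact = np.exp(np.outer(beta, x)).mean(axis=0)

# Second-order moment expansion about the mean index:
# (nu/nu0)^beta_mean * (1 + 0.5 * sigma_beta^2 * ln^2(nu/nu0))
sed_moment = np.exp(beta_mean * x) * (1.0 + 0.5 * sigma_beta**2 * x**2)

for f, e, m in zip(nu, sed_exact, sed_moment):
    print(f"{f:6.1f} GHz  exact={e:.4f}  moment={m:.4f}  frac.err={abs(m / e - 1):.2e}")
```

Even at σ(βd) = 0.3, the truncation error of the low-order expansion is sub-per-cent in this toy setup; the biases discussed in the abstract arise from the richer, non-Gaussian spatial structure that such a truncation cannot absorb.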

The Velocity Field Olympics: Assessing velocity field reconstructions with direct distance tracers

Monthly Notices of the Royal Astronomical Society Oxford University Press (OUP) (2025) staf1960

Authors:

Richard Stiskalek, Harry Desmond, Julien Devriendt, Adrianne Slyz, Guilhem Lavaux, Michael J Hudson, Deaglan J Bartlett, Hélène M Courtois

Abstract:

The peculiar velocity field of the local Universe provides direct insights into its matter distribution and the underlying theory of gravity, and is essential in cosmological analyses for modelling deviations from the Hubble flow. Numerous methods have been developed to reconstruct the density and velocity fields at z ≲ 0.05, typically constrained by redshift-space galaxy positions or by direct distance tracers such as the Tully–Fisher relation, the fundamental plane, or Type Ia supernovae. We introduce a validation framework to evaluate the accuracy of these reconstructions against catalogues of direct distance tracers. Our framework assesses the goodness-of-fit of each reconstruction using Bayesian evidence, residual redshift discrepancies, velocity scaling, and the need for external bulk flows. Applying this framework to a suite of reconstructions, including those derived from the Bayesian Origin Reconstruction from Galaxies (BORG) algorithm and from linear theory, we find that the non-linear BORG reconstruction consistently outperforms others. We highlight the utility of such a comparative approach for supernova or gravitational wave cosmological studies, where selecting an optimal peculiar velocity model is essential. Additionally, we present calibrated bulk flow curves predicted by the reconstructions and perform a density–velocity cross-correlation using a linear theory reconstruction to constrain the growth factor, yielding S8 = 0.793 ± 0.035. The result is in good agreement with both weak lensing and Planck, but is in strong disagreement with some peculiar velocity studies.
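One of the validation metrics mentioned above, residual redshift discrepancies, can be sketched with mock data: a reconstruction's line-of-sight peculiar velocity at each tracer predicts an observed redshift via (1 + z_obs) = (1 + z_cosmo)(1 + v_r/c), and reconstructions can be ranked by the scatter of the residuals. The data, noise levels, and scoring below are hypothetical stand-ins, not the paper's catalogues or statistics:

```python
import numpy as np

C_KMS = 299_792.458  # speed of light in km/s

def predicted_redshift(z_cosmo, v_radial):
    """Observed redshift from the cosmological redshift and the line-of-sight
    peculiar velocity: (1 + z_obs) = (1 + z_cosmo) * (1 + v_r / c)."""
    return (1.0 + z_cosmo) * (1.0 + v_radial / C_KMS) - 1.0

def redshift_residual_score(z_obs, z_cosmo, v_model):
    """RMS residual redshift discrepancy, expressed in km/s; lower is better."""
    dz = z_obs - predicted_redshift(z_cosmo, v_model)
    return C_KMS * np.sqrt(np.mean(dz**2))

# Mock tracers: true peculiar velocities, and two competing "reconstructions"
rng = np.random.default_rng(1)
n = 5_000
z_cosmo = rng.uniform(0.01, 0.05, n)
v_true = rng.normal(0.0, 300.0, n)                  # km/s
z_obs = predicted_redshift(z_cosmo, v_true)

v_good = v_true + rng.normal(0.0, 100.0, n)         # accurate, mildly noisy
v_poor = 0.5 * v_true + rng.normal(0.0, 250.0, n)   # suppressed and noisier

print("good reconstruction RMS [km/s]:", redshift_residual_score(z_obs, z_cosmo, v_good))
print("poor reconstruction RMS [km/s]:", redshift_residual_score(z_obs, z_cosmo, v_poor))
```

The full framework additionally weighs model complexity through the Bayesian evidence and fits for velocity scalings and external bulk flows; this toy only isolates the residual-scatter ingredient.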

A 1-per-cent-accurate method to include baryonic effects in galaxy–galaxy lensing models

Monthly Notices of the Royal Astronomical Society Oxford University Press 544:4 (2025) 3512-3532

Authors:

Matteo Zennaro, Giovanni Aricò, Carlos García-García, Raúl E Angulo, Lurdes Ondaro-Mallea, Sergio Contreras, Andrina Nicola, Matthieu Schaller, Joop Schaye

Abstract:

The clustering of galaxies and galaxy–galaxy lensing are two of the main observational probes in Stage-IV large-scale structure surveys, such as Euclid and LSST. Unfortunately, the complicated relationship between galaxies and matter greatly limits the exploitation of these data. Sophisticated theoretical galaxy bias models, such as the hybrid Lagrangian bias expansion, make it possible to describe galaxy clustering down to small scales. However, the galaxy–matter cross-power spectra are already affected by baryons on these scales, directly impacting the modelling of galaxy–galaxy lensing. In this work, we propose a way to extend state-of-the-art models of the galaxy–matter cross-power spectrum (currently only accounting for dark matter) by including a baryonic correction term, the suppression inferred from the matter component. We use the FLAMINGO hydrodynamical simulations to measure the effect of baryons on the galaxy–matter cross-power spectrum and to assess the performance of our model. Specifically, we perform a Bayesian analysis of synthetic data, implementing a model based on BACCO's hybrid Lagrangian bias expansion (for the non-linear galaxy bias) and Baryon Correction Model (for the baryon suppression of the matter power spectrum). Ignoring the effect of baryons on the galaxy–matter cross-power spectrum leads to a biased inference of the galaxy bias parameters, while ignoring baryons in both the galaxy–matter and matter–matter power spectra leads to a biased inference of both the galaxy bias and cosmological parameters. In contrast, our method is 1 per cent accurate compared to all physics variations in FLAMINGO and on all scales described by hybrid perturbative models. Moreover, our model leads to inferred bias and cosmological parameters compatible within 1σ with their reference values. We anticipate that our method will be a promising candidate for analysing forthcoming Stage-IV survey data.
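The correction described above can be sketched in code. The abstract's explicit formula did not survive extraction, so the sketch assumes one plausible reading: the matter suppression S(k) = P_mm^hydro / P_mm^gravity-only enters the cross-spectrum through a single factor of sqrt(S(k)), since only one of the two fields in the cross-correlation is matter. The suppression shape and spectra below are toy functions, not BACCO or FLAMINGO outputs:

```python
import numpy as np

def matter_suppression(k, k_damp=5.0, amp=0.25):
    """Toy baryonic suppression S(k) = P_mm^hydro / P_mm^grav-only:
    unity on large scales, dipping toward small scales (illustrative shape)."""
    return 1.0 - amp * (k / k_damp) ** 2 / (1.0 + (k / k_damp) ** 2)

def corrected_cross_spectrum(p_gm_gravity, k):
    """Assumed correction: P_gm = sqrt(S(k)) * P_gm^grav-only, i.e. baryons
    act on the single matter field entering the galaxy-matter cross-spectrum."""
    return np.sqrt(matter_suppression(k)) * p_gm_gravity

k = np.logspace(-2, 0, 50)            # wavenumbers in h/Mpc (illustrative)
p_gm_gravity = 2.0e4 * k ** -1.5      # toy gravity-only cross-spectrum
p_gm = corrected_cross_spectrum(p_gm_gravity, k)

# Large scales are nearly untouched; small scales are suppressed
print(f"S at k=0.01: {matter_suppression(0.01):.6f}")
print(f"S at k=1.00: {matter_suppression(1.00):.6f}")
```

The design point is that the suppression is measured (or emulated) from the matter sector alone, so the galaxy bias parameters do not have to absorb baryonic physics, which is what otherwise biases their inference.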

Creating halos with autoregressive multistage networks

Physical Review D American Physical Society (APS) 112:10 (2025) 103503

Authors:

Shivam Pandey, Chirag Modi, Benjamin D Wandelt, Deaglan J Bartlett, Adrian E Bayer, Greg L Bryan, Matthew Ho, Guilhem Lavaux, T Lucas Makinen, Francisco Villaescusa-Navarro