Science with the Einstein Telescope: a comparison of different designs
Journal of Cosmology and Astroparticle Physics IOP Publishing 2023 (2023) 068
Abstract:
The Einstein Telescope (ET), the European project for a third-generation gravitational-wave detector, has a reference configuration based on a triangular shape consisting of three nested detectors with 10 km arms, where each detector has a 'xylophone' configuration made of an interferometer tuned toward high frequencies, and an interferometer tuned toward low frequencies and working at cryogenic temperature. Here, we examine the scientific perspectives under possible variations of this reference design. We perform a detailed evaluation of the science case for a single triangular geometry observatory, and we compare it with the results obtained for a network of two L-shaped detectors (either parallel or misaligned) located in Europe, considering different choices of arm-length for both the triangle and the 2L geometries. We also study how the science output changes in the absence of the low-frequency instrument, both for the triangle and the 2L configurations. We examine a broad class of simple 'metrics' that quantify the science output, related to compact binary coalescences, multi-messenger astronomy and stochastic backgrounds, and we then examine the impact of different detector designs on a more specific set of scientific objectives.

Cosmology with 6 parameters in the Stage-IV era: efficient marginalisation over nuisance parameters
Open Journal of Astrophysics Maynooth Academic Publishing 6 (2023)
Abstract:
The analysis of photometric large-scale structure data is often complicated by the need to account for many observational and astrophysical systematics. The elaborate models needed to describe them often introduce many "nuisance parameters", which can be a major obstacle to efficient parameter inference. In this paper we introduce an approximate method to analytically marginalise over a large number of nuisance parameters based on the Laplace approximation. We discuss the mathematics of the method, its relation to concepts such as volume effects and profile likelihood, and show that it can be further simplified for calibratable systematics by linearising the dependence of the theory on the associated parameters. We quantify the accuracy of this approach by comparing it with traditional sampling methods in the context of existing data from the Dark Energy Survey, as well as futuristic Stage-IV photometric data. The linearised version of the method is able to obtain parameter constraints that are virtually equivalent to those found by exploring the full parameter space for a large number of calibratable nuisance parameters, while reducing the computation time by a factor of 3-10. Furthermore, the non-linearised approach is able to analytically marginalise over a large number of parameters, returning constraints that are virtually indistinguishable from the brute-force method in most cases, accurately reproducing both the marginalised uncertainty on cosmological parameters and the impact of volume effects associated with this marginalisation. We provide simple recipes to diagnose when the approximations made by the method fail and one should thus resort to traditional methods. The gains in sampling efficiency associated with this method enable the joint analysis of multiple surveys, typically hindered by the large number of nuisance parameters needed to describe them.

The catalog-to-cosmology framework for weak lensing and galaxy clustering for LSST
Open Journal of Astrophysics Maynooth Academic Publishing 6 (2023)
Abstract:
We present TXPipe, a modular, automated and reproducible pipeline for ingesting catalog data and performing all the calculations required to obtain quality-assured two-point measurements of lensing and clustering, and their covariances, with the metadata necessary for parameter estimation. The pipeline is developed within the Rubin Observatory Legacy Survey of Space and Time (LSST) Dark Energy Science Collaboration (DESC), and designed for cosmology analyses using LSST data. In this paper, we present the pipeline for the so-called 3x2pt analysis – a combination of three two-point functions that measure the auto- and cross-correlation between galaxy density and shapes. We perform the analysis both in real and harmonic space using TXPipe and other LSST-DESC tools. We validate the pipeline using Gaussian simulations and show that it accurately measures data vectors and recovers the input cosmology to the accuracy level required for the first year of LSST data under this simplified scenario. We also apply the pipeline to a realistic mock galaxy sample extracted from the CosmoDC2 simulation suite (Korytov et al. 2019). TXPipe establishes a baseline framework that can be built upon as the LSST survey proceeds. Furthermore, the pipeline is designed to be easily extended to science probes beyond the 3x2pt analysis.

Analytical marginalization over photometric redshift uncertainties in cosmic shear analyses
Monthly Notices of the Royal Astronomical Society Oxford University Press 522:4 (2023) 5037-5048
Abstract:
As the statistical power of imaging surveys grows, it is crucial to account for all systematic uncertainties. This is normally done by constructing a model of these uncertainties and then marginalizing over the additional model parameters. The resulting high dimensionality of the total parameter spaces makes inferring the cosmological parameters significantly more costly using traditional Monte Carlo sampling methods. A particularly relevant example is the redshift distribution, p(z), of the source samples, which may require tens of parameters to describe fully. However, relatively tight priors can usually be placed on these parameters through calibration of the associated systematics. In this paper, we show, quantitatively, that a linearization of the theoretical prediction with respect to these calibrated systematic parameters allows us to analytically marginalize over these extra parameters, leading to a factor of ∼30 reduction in the time needed for parameter inference, while accurately recovering the same posterior distributions for the cosmological parameters that would be obtained through a full numerical marginalization over 160 p(z) parameters. We demonstrate that this is feasible not only with current data and current achievable calibration priors but also for future Stage-IV data sets.

The N5K challenge: non-limber integration for LSST cosmology
(2023)
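The analytic marginalisation underlying several of the abstracts above follows a standard Gaussian-linear result: if the theory vector depends linearly on nuisance parameters with a Gaussian prior, those parameters can be integrated out exactly by inflating the data covariance. The sketch below is an illustration of that general technique only; the function name and variable choices are hypothetical and not taken from the papers' codes.

```python
import numpy as np

def marginalized_chi2(residual, C, T, prior_cov):
    """Chi-square after analytically marginalizing linear nuisance parameters.

    residual  : data minus theory at the prior-centre nuisance values, shape (n,)
    C         : data covariance matrix, shape (n, n)
    T         : derivative of the theory w.r.t. the k nuisance params, shape (n, k)
    prior_cov : Gaussian prior covariance of the nuisance params, shape (k, k)
    """
    # Marginalising a linear-Gaussian nuisance dependence is equivalent to
    # adding the prior-propagated term T @ prior_cov @ T.T to the covariance.
    C_marg = C + T @ prior_cov @ T.T
    chi2 = residual @ np.linalg.solve(C_marg, residual)
    # Keep the log-determinant: the normalisation of the marginal likelihood
    # changes because the effective covariance has been inflated.
    _, logdet = np.linalg.slogdet(C_marg)
    return chi2 + logdet
```

In this picture the cost of marginalising over many calibratable parameters is a one-off modification of the covariance rather than extra sampling dimensions, which is the source of the speed-ups quoted in the abstracts.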