MIGHTEE: The Continuum Survey Data Release 1

(2024)

Authors:

CL Hale, I Heywood, MJ Jarvis, IH Whittam, PN Best, Fangxia An, RAA Bowler, I Harrison, A Matthews, DJB Smith, AR Taylor, M Vaccari

The Dark Energy Survey Supernova Program: Light Curves and 5 Yr Data Release

The Astrophysical Journal American Astronomical Society 975:1 (2024) 5

Authors:

BO Sánchez, D Brout, M Vincenzi, M Sako, K Herner, R Kessler, TM Davis, D Scolnic, M Acevedo, J Lee, A Möller, H Qu, L Kelsey, P Wiseman, P Armstrong, B Rose, R Camilleri, R Chen, L Galbany, E Kovacs, C Lidman, B Popovic, M Smith, P Shah, M Sullivan, M Toy, TMC Abbott, M Aguena, S Allam, O Alves, J Annis, J Asorey, S Avila, D Bacon, D Brooks, DL Burke, A Carnero Rosell, D Carollo, J Carretero, LN da Costa, FJ Castander, S Desai, HT Diehl, J Duarte, S Everett, I Ferrero, B Flaugher, J Frieman, J García-Bellido, M Gatti, E Gaztanaga, G Giannini, K Glazebrook, S González-Gaitán, RA Gruendl, G Gutierrez, SR Hinton, DL Hollowood, K Honscheid, DJ James, K Kuehn, O Lahav, S Lee, GF Lewis, H Lin, JL Marshall, J Mena-Fernández, R Miquel, J Myles, RC Nichol, RLC Ogando, A Palmese, MES Pereira, A Pieres, AA Plazas Malagón, A Porredon, AK Romer, E Sanchez, D Sanchez Cid, I Sevilla-Noarbe, E Suchyta, MEC Swanson, G Tarle, BE Tucker, DL Tucker, V Vikram, AR Walker, N Weaverdyck

The Dark Energy Survey Supernova Program: Cosmological Analysis and Systematic Uncertainties

The Astrophysical Journal American Astronomical Society 975:1 (2024) 86

Authors:

M Vincenzi, D Brout, P Armstrong, B Popovic, G Taylor, M Acevedo, R Camilleri, R Chen, TM Davis, J Lee, C Lidman, SR Hinton, L Kelsey, R Kessler, A Möller, H Qu, M Sako, B Sanchez, D Scolnic, M Smith, M Sullivan, P Wiseman, J Asorey, BA Bassett, D Carollo, A Carr, RJ Foley, C Frohmaier, L Galbany, K Glazebrook, O Graur, E Kovacs, K Kuehn, U Malik, RC Nichol, B Rose, BE Tucker, M Toy, DL Tucker, F Yuan, TMC Abbott, M Aguena, O Alves, SS Allam, F Andrade-Oliveira, J Annis, D Bacon, K Bechtol, GM Bernstein, D Brooks, DL Burke, A Carnero Rosell, J Carretero, FJ Castander, C Conselice, LN da Costa, MES Pereira, S Desai, HT Diehl, P Doel, I Ferrero, B Flaugher, D Friedel, J Frieman, J García-Bellido, M Gatti, G Giannini, D Gruen, RA Gruendl, DL Hollowood, K Honscheid, D Huterer, DJ James, N Kuropatkin, O Lahav, S Lee, H Lin, JL Marshall, J Mena-Fernández, F Menanteau, R Miquel, A Palmese, A Pieres, AA Plazas Malagón, A Porredon, AK Romer, A Roodman, E Sanchez, D Sanchez Cid, M Schubnell, I Sevilla-Noarbe, E Suchyta, MEC Swanson, G Tarle, C To, AR Walker, N Weaverdyck, M Yamamoto

Late-time supernovae radio re-brightening in the VAST pilot survey

Monthly Notices of the Royal Astronomical Society Oxford University Press (OUP) 534:4 (2024) 3853-3868

Authors:

Kovi Rose, Assaf Horesh, Tara Murphy, David L Kaplan, Itai Sfaradi, Stuart D Ryder, Robert J Aloisi, Dougal Dobie, Laura Driessen, Rob Fender, David A Green, James K Leung, Emil Lenc, Hao Qiu, David Williams-Baldwin

Retrieval of the physical parameters of galaxies from WEAVE-StePS-like data using machine learning

Astronomy and Astrophysics EDP Sciences 690 (2024) A198

Authors:

J Angthopo, B Granett, F La Barbera, M Longhetti, A Iovino, M Fossati, Chiara Spiniello, Gavin Dalton, S Jin

Abstract:

Context

The William Herschel Telescope Enhanced Area Velocity Explorer (WEAVE) is a new, massively multiplexing spectrograph that allows us to collect about one thousand spectra over a 3 square degree field in a single observation. Over the next five years, the WEAVE Stellar Population Survey (WEAVE-StePS) will exploit this new instrument to obtain high-S/N spectra for a magnitude-limited (I_AB = 20.5) sample of ∼25 000 galaxies at moderate redshifts (z ≥ 0.3), providing insights into galaxy evolution in this as yet unexplored redshift range.

Aims

We aim to test novel techniques for retrieving the key physical parameters of galaxies from WEAVE-StePS spectra using both photometric and spectroscopic (spectral indices) information for a range of noise levels and redshift values.

Methods

We simulated ∼105 000 galaxy spectra assuming star formation histories with an exponentially declining star formation rate, covering a wide range of ages, stellar metallicities, specific star formation rates (sSFRs), and dust extinction values. We considered three redshifts (i.e. z = 0.3, 0.55, and 0.7), covering the redshift range that WEAVE-StePS will observe. We then evaluated the ability of the random forest and K-nearest neighbour algorithms to correctly predict the average age, metallicity, sSFR, dust attenuation, and time since the bulk of formation, assuming no measurement errors. We also checked how much the predictive ability deteriorates for different noise levels, with S/N_I,obs = 10, 20, and 30, and at different redshifts. Finally, the retrieved sSFR was used to classify galaxies as part of the blue cloud, green valley, or red sequence.
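As an illustration of this kind of setup (a minimal sketch, not the authors' actual pipeline), the snippet below trains random forest and K-nearest-neighbour regressors on mock feature arrays standing in for spectral indices plus photometry, with noise added at an assumed signal-to-noise level. All array shapes, hyperparameters, and the noise model are illustrative assumptions.

```python
# Minimal sketch (not the paper's pipeline): train random forest and
# K-nearest-neighbour regressors to predict galaxy physical parameters
# from simulated spectral indices plus photometry. Feature/target arrays,
# hyperparameters, and the noise model are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Placeholders for ~105 000 simulated galaxies: spectral indices + photometry
n_gal, n_features, n_targets = 105_000, 30, 5
X = rng.normal(size=(n_gal, n_features))   # spectral indices + photometric colours
y = rng.normal(size=(n_gal, n_targets))    # age, metallicity, sSFR, dust, t_form

# Perturb the features to mimic a given observed signal-to-noise level
snr = 10.0
X_noisy = X + rng.normal(scale=np.abs(X) / snr, size=X.shape)

X_train, X_test, y_train, y_test = train_test_split(
    X_noisy, y, test_size=0.2, random_state=0)

models = {
    "random_forest": RandomForestRegressor(n_estimators=100, n_jobs=-1, random_state=0),
    "knn": KNeighborsRegressor(n_neighbors=20, weights="distance"),
}
predictions = {}
for name, model in models.items():
    model.fit(X_train, y_train)           # both regressors handle multi-output targets
    predictions[name] = model.predict(X_test)
```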

Results

We find that both the random forest and K-nearest neighbour algorithms accurately estimate the mass-weighted ages, u-band-weighted ages, and metallicities with low bias. The dispersion varies from 0.08–0.16 dex for age and 0.11–0.25 dex for metallicity, depending on the redshift and noise level. For dust attenuation, we find a similarly low bias and dispersion. For the sSFR, we find very good constraining power for star-forming galaxies (log sSFR ≳ −11), with a bias of ∼0.01 dex and a dispersion of ∼0.10 dex. However, for more quiescent galaxies (log sSFR ≲ −11) we find a higher bias, ranging from 0.61 to 0.86 dex, and a higher dispersion of ∼0.4 dex, depending on the noise level and redshift. In general, the random forest algorithm outperforms the K-nearest neighbours algorithm. Finally, the classification of galaxies as members of the green valley is successful across the different redshifts and S/Ns.
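The bias and dispersion quoted here are, in essence, the typical offset and scatter of predicted minus true parameter values. A short sketch of such a summary, using toy stand-in arrays rather than the paper's actual predictions, could look as follows.

```python
# Sketch of a bias/dispersion summary: per-parameter median residual (bias)
# and robust scatter (1.4826 * MAD) of predicted minus true values, in dex.
# The arrays below are toy stand-ins; real values would come from the
# regressors trained in the previous sketch.
import numpy as np

rng = np.random.default_rng(1)
param_names = ["age", "metallicity", "sSFR", "dust", "t_form"]

y_true = rng.normal(size=(1000, len(param_names)))
y_pred = y_true + rng.normal(scale=0.1, size=y_true.shape)

def bias_and_dispersion(y_true, y_pred):
    """Return per-parameter bias (median residual) and dispersion (1.4826 * MAD)."""
    residuals = y_pred - y_true
    bias = np.median(residuals, axis=0)
    dispersion = 1.4826 * np.median(np.abs(residuals - bias), axis=0)
    return bias, dispersion

bias, disp = bias_and_dispersion(y_true, y_pred)
for p, b, d in zip(param_names, bias, disp):
    print(f"{p:<12s} bias = {b:+.3f} dex   dispersion = {d:.3f} dex")
```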

Conclusions

We demonstrate that machine learning algorithms can accurately estimate the physical parameters of simulated galaxies for a WEAVE-StePS-like dataset, even for spectra with a relatively low S/N_I,obs = 10 per Å, provided ancillary photometric information is available. A more traditional approach, Bayesian inference, yields comparable results. The main advantage of a machine learning algorithm is that, once trained, it requires considerably less time than other methods.