Black Hole

Lensing of spacetime around a black hole. At Oxford we study black holes observationally and theoretically on all size and time scales; this is part of our core work.

Credit: Alain Riazuelo, IAP/UPMC/CNRS.

Dr Harry Desmond

Visitor

Research theme

  • Astronomy and astrophysics
  • Particle astrophysics & cosmology

Sub department

  • Astrophysics

Research groups

  • Beecroft Institute for Particle Astrophysics and Cosmology
harry.desmond@physics.ox.ac.uk
Telephone: 01865 (2)83019
ICG webpage
Publications

Galaxy morphology rules out astrophysically relevant Hu-Sawicki f(R) gravity

Physical Review D American Physical Society 102:10 (2020) 104060

Authors:

Pedro Ferreira, Harry Desmond

Abstract:

f(R) is a paradigmatic modified gravity theory that typifies extensions to General Relativity with new light degrees of freedom and hence screened fifth forces between masses. These forces produce observable signatures in galaxy morphology, caused by a violation of the weak equivalence principle due to a differential impact of screening among galaxies’ mass components. We compile statistical datasets of two morphological indicators—offsets between stars and gas in galaxies and warping of stellar disks—and use them to constrain the strength and range of a thin-shell-screened fifth force. This is achieved by applying a comprehensive set of upgrades to past work [H. Desmond et al., Phys. Rev. D 98, 064015 (2018); H. Desmond et al., Phys. Rev. D 98, 083010 (2018)]: we construct a robust galaxy-by-galaxy Bayesian forward model for the morphological signals, including full propagation of uncertainties in the input quantities and marginalization over an empirical model describing astrophysical noise. Employing more stringent data quality cuts than previously, we find no evidence for a screened fifth force of any strength ΔG/G_N in the Compton wavelength range 0.3–8 Mpc, setting a 1σ bound of ΔG/G_N < 0.8 at λ_C = 0.3 Mpc that strengthens to ΔG/G_N < 3 × 10⁻⁵ at λ_C = 8 Mpc. These are the tightest bounds to date beyond the Solar System by over an order of magnitude. For the Hu-Sawicki model of f(R) with n = 1 we require a background scalar field value f_R0 < 1.4 × 10⁻⁸, forcing practically all astrophysical objects to be screened. We conclude that this model can have no relevance to astrophysics or cosmology.
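For orientation, the Hu-Sawicki parametrisation referred to above is the standard one of Hu & Sawicki (2007); the sketch below is quoted from that general literature (in the original convention, with m, c₁, c₂ the usual model parameters), not from this page:

  S = \int \mathrm{d}^4x \, \sqrt{-g}\,\left[\frac{R + f(R)}{2\kappa^2} + \mathcal{L}_{\rm m}\right],
  \qquad
  f(R) = -m^2\,\frac{c_1\,(R/m^2)^n}{c_2\,(R/m^2)^n + 1},
  \qquad
  f_R \equiv \frac{\mathrm{d}f}{\mathrm{d}R},\quad f_{R0} \equiv f_R(\bar R_0).

The bound f_R0 < 1.4 × 10⁻⁸ quoted in the abstract therefore limits the present-day background value (in magnitude) of the scalar degree of freedom f_R; the smaller it is, the more thoroughly astrophysical objects are screened.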

Local resolution of the Hubble tension: The impact of screened fifth forces on the cosmic distance ladder

Physical Review D American Physical Society (APS) 100:4 (2019) 043537

Authors:

Harry Desmond, Bhuvnesh Jain, Jeremy Sakstein

The Velocity Field Olympics: Assessing velocity field reconstructions with direct distance tracers

Monthly Notices of the Royal Astronomical Society Oxford University Press (OUP) (2025) staf1960

Authors:

Richard Stiskalek, Harry Desmond, Julien Devriendt, Adrianne Slyz, Guilhem Lavaux, Michael J Hudson, Deaglan J Bartlett, Hélène M Courtois

Abstract:

The peculiar velocity field of the local Universe provides direct insights into its matter distribution and the underlying theory of gravity, and is essential in cosmological analyses for modelling deviations from the Hubble flow. Numerous methods have been developed to reconstruct the density and velocity fields at z ≲ 0.05, typically constrained by redshift-space galaxy positions or by direct distance tracers such as the Tully–Fisher relation, the fundamental plane, or Type Ia supernovae. We introduce a validation framework to evaluate the accuracy of these reconstructions against catalogues of direct distance tracers. Our framework assesses the goodness-of-fit of each reconstruction using Bayesian evidence, residual redshift discrepancies, velocity scaling, and the need for external bulk flows. Applying this framework to a suite of reconstructions—including those derived from the Bayesian Origin Reconstruction from Galaxies (BORG) algorithm and from linear theory—we find that the non-linear BORG reconstruction consistently outperforms others. We highlight the utility of such a comparative approach for supernova or gravitational wave cosmological studies, where selecting an optimal peculiar velocity model is essential. Additionally, we present calibrated bulk flow curves predicted by the reconstructions and perform a density–velocity cross-correlation using a linear theory reconstruction to constrain the growth factor, yielding S₈ = 0.793 ± 0.035. The result is in good agreement with both weak lensing and Planck, but is in strong disagreement with some peculiar velocity studies.
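As a rough guide to how such reconstructions are tested against direct distance tracers, the comparison rests on the standard low-redshift relation between observed redshift, true distance and line-of-sight peculiar velocity; the expression below is the textbook form, not a formula quoted from this paper:

  c z_{\rm obs} \;\simeq\; H_0 d \;+\; \hat{\mathbf r}\cdot\left[\mathbf v_{\rm pec}(\mathbf r) - \mathbf v_{\rm pec}(\mathbf 0)\right].

A reconstruction that predicts v_pec well should leave small residuals between cz_obs and H₀d inferred from Tully–Fisher, fundamental-plane or supernova distances; roughly speaking, the residual-redshift and external-bulk-flow diagnostics mentioned above quantify those residuals and any coherent velocity the model fails to capture.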

Constraints on dark matter annihilation and decay from the large-scale structure of the nearby Universe

Physical Review D American Physical Society 106:10 (2022) 103526

Authors:

DJ Bartlett, A Kostić, H Desmond, J Jasche, G Lavaux

Abstract:

Decaying or annihilating dark matter particles could be detected through gamma-ray emission from the species they decay or annihilate into. This is usually done by modeling the flux from specific dark-matter-rich objects such as the Milky Way halo, Local Group dwarfs, and nearby groups. However, these objects are expected to have significant emission from baryonic processes as well, and the analyses discard gamma-ray data over most of the sky. Here we construct full-sky templates for gamma-ray flux from the large-scale structure within ∼200 Mpc by means of a suite of constrained N-body simulations (CSiBORG) produced using the Bayesian Origin Reconstruction from Galaxies algorithm. Marginalizing over uncertainties in this reconstruction, small-scale structure, and parameters describing astrophysical contributions to the observed gamma-ray sky, we compare to observations from the Fermi Large Area Telescope to constrain dark matter annihilation cross sections and decay rates through a Markov chain Monte Carlo analysis. We rule out the thermal relic cross section for s-wave annihilation for all mχ ≲ 7 GeV/c² at 95% confidence if the annihilation produces gluons or quarks less massive than the bottom quark. We infer a contribution to the gamma-ray sky with the same spatial distribution as dark matter decay at 3.3σ. Although this could be due to dark matter decay via these channels with a decay rate Γ ≈ 6 × 10⁻²⁸ s⁻¹, we find that a power-law spectrum of index p = −2.75 (+0.71, −0.46), likely of baryonic origin, is preferred by the data.
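For reference, the dark matter templates such analyses build from a reconstructed density field follow the standard flux expressions for annihilation and decay; the forms below (per unit solid angle, for self-conjugate dark matter) are the usual textbook ones rather than equations copied from the paper:

  \frac{\mathrm{d}\Phi_{\rm ann}}{\mathrm{d}E\,\mathrm{d}\Omega} = \frac{\langle\sigma v\rangle}{8\pi m_\chi^2}\,\frac{\mathrm{d}N_\gamma}{\mathrm{d}E}\int_{\rm l.o.s.}\rho_\chi^2(\ell)\,\mathrm{d}\ell,
  \qquad
  \frac{\mathrm{d}\Phi_{\rm dec}}{\mathrm{d}E\,\mathrm{d}\Omega} = \frac{\Gamma}{4\pi m_\chi}\,\frac{\mathrm{d}N_\gamma}{\mathrm{d}E}\int_{\rm l.o.s.}\rho_\chi(\ell)\,\mathrm{d}\ell.

The decay signal thus traces the density itself, while annihilation is sensitive to ρ² and hence to the small-scale structure that the analysis marginalises over.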

Exhaustive symbolic regression

IEEE Transactions on Evolutionary Computation IEEE (2023)

Authors:

Deaglan Bartlett, Harry Desmond, Pedro Ferreira

Abstract:

Symbolic Regression (SR) algorithms attempt to learn analytic expressions which fit data accurately and in a highly interpretable manner. Conventional SR suffers from two fundamental issues which we address here. First, these methods search the space stochastically (typically using genetic programming) and hence do not necessarily find the best function. Second, the criteria used to select the equation optimally balancing accuracy with simplicity have been variable and subjective. To address these issues we introduce Exhaustive Symbolic Regression (ESR), which systematically and efficiently considers all possible equations—made with a given basis set of operators and up to a specified maximum complexity—and is therefore guaranteed to find the true optimum (if parameters are perfectly optimised) and a complete function ranking subject to these constraints. We implement the minimum description length principle as a rigorous method for combining these preferences into a single objective. To illustrate the power of ESR we apply it to a catalogue of cosmic chronometers and the Pantheon+ sample of supernovae to learn the Hubble rate as a function of redshift, finding 40 functions (out of 5.2 million trial functions) that fit the data more economically than the Friedmann equation. These low-redshift data therefore do not uniquely prefer the expansion history of the standard model of cosmology. We make our code and full equation sets publicly available.
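As a schematic of the minimum-description-length selection used to rank functions, the objective is a codelength of roughly the two-part form below; the exact coding of operators and parameters is defined in the paper, and this is only the generic structure (with θ̂₁,…,θ̂ₖ the fitted parameters of a candidate expression):

  L(D) \;=\; \underbrace{-\log \mathcal{L}\!\left(D \mid \hat{\boldsymbol\theta}\right)}_{\text{fit to the data}}
  \;+\; \underbrace{L(\text{functional form}) + L(\hat\theta_1,\ldots,\hat\theta_k)}_{\text{complexity of the expression and its parameters}}.

The equation with the smallest total codelength is preferred, so a candidate can rank above the Friedmann equation even with a marginally worse likelihood if it is sufficiently simpler.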

