Data-efficient learning of exchange-correlation functionals with differentiable DFT
Mach. Learn.: Sci. Technol. 7 025001 (2026)
Abstract:
Machine learning (ML) density functional approximations (DFAs) have attracted considerable interest in recent years and are often touted as replacements for the well-established non-empirical DFAs that still dominate the field. Although highly accurate, ML-DFAs typically rely on large amounts of data, are computationally expensive, and fail to generalize beyond their training domain. In this work we show that differentiable DFT with Kohn–Sham regularization can be used to accurately capture the behavior of known local density approximations from small sets of synthetic data without using localized density information. At the same time, our analysis shows a strong dependence of the learning on both the amount and type of data, as well as on model initialization. By enabling accurate learning from sparse energy data, this approach paves the way towards the development of custom ML-DFAs trained directly on limited experimental or high-level quantum chemistry datasets.
Suppression of pair beam instabilities in a laboratory analogue of blazar pair cascades
Proceedings of the National Academy of Sciences National Academy of Sciences 122:45 (2025) e2513365122
Abstract:
The generation of dense electron–positron pair beams in the laboratory can enable direct tests of theoretical models of γ-ray bursts and active galactic nuclei. We have successfully achieved this using ultrarelativistic protons accelerated by the Super Proton Synchrotron at CERN. In the first application of this experimental platform, the stability of the pair beam is studied as it propagates through a meter-long plasma, analogous to TeV γ-ray-induced pair cascades in the intergalactic medium. It has been argued that pair beam instabilities disrupt the cascade, thus accounting for the observed lack of reprocessed GeV emission from TeV blazars. If true, this would remove the need for a moderate-strength intergalactic magnetic field to explain the observations. We find that the pair beam instability is suppressed if the beam is not perfectly collimated or monochromatic; hence the lower limit on the intergalactic magnetic field inferred from γ-ray observations of blazars is robust.
Modeling partially-ionized dense plasma using wavepacket molecular dynamics
(2025)
Learning heat transport kernels using a nonlocal heat transport theory-informed neural network
Physical Review Research American Physical Society (APS) 7:4 (2025) L042017