Time-embedded convolutional neural networks for modeling plasma heat transport
Physical Review E (American Physical Society) 113(3), 035303 (2026)
Abstract:
We introduce a time-embedded convolutional neural network (TCNN) for modeling spatiotemporal heat transport in plasmas, particularly under strongly nonlocal conditions. In our earlier work, the Luciani-Mora-Virmont (LMV) Informed Neural Network (LINN) (Luo et al.) combined prior knowledge from the LMV model with kinetic Particle-in-Cell (PIC) data to improve kernel-based heat-flux predictions. While effective under moderately nonlocal conditions, LINN produced physically inconsistent kernels in strongly time-dependent regimes due to its reliance on the quasistationary LMV formulation. To overcome this limitation, TCNN is designed to capture the coupled evolution of both the normalized heat flux and the characteristic nonlocality parameter using a unified neural architecture informed by underlying physical principles. Trained on fully kinetic PIC simulations, TCNN accurately reproduces nonlocal dynamics across a broad range of collisionalities. Our results demonstrate that the combination of time modulation, coupled prediction, and convolutional depth significantly enhances predictive performance, offering a data-driven yet physically consistent framework for multiscale plasma transport problems.
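The core ingredient named in the abstract, a convolution over a spatial profile whose response is modulated by a time embedding, can be sketched in toy form. This is an illustrative reconstruction, not the paper's TCNN: the sinusoidal embedding, the FiLM-style scalar gain, and all function names and weight shapes are assumptions for the sketch.

```python
import numpy as np

def sinusoidal_embedding(t, dim=8):
    """Sinusoidal time embedding, a common way to condition a network on time."""
    freqs = np.exp(-np.arange(dim // 2) * np.log(1000.0) / (dim // 2))
    angles = t * freqs
    return np.concatenate([np.sin(angles), np.cos(angles)])

def time_modulated_conv1d(profile, kernel, t, emb_to_scale):
    """1D convolution over a spatial profile, modulated by a time embedding.

    The embedding is mapped to a scalar gain (FiLM-style modulation) that
    scales the convolution output, so a single kernel can adapt its
    response as the plasma state evolves in time.
    """
    emb = sinusoidal_embedding(t)
    gain = 1.0 + float(emb_to_scale @ emb)        # learned in a real model
    padded = np.pad(profile, len(kernel) // 2, mode="edge")
    out = np.convolve(padded, kernel, mode="valid")
    return gain * out

rng = np.random.default_rng(0)
temperature = np.sin(np.linspace(0, np.pi, 64))   # toy temperature profile
kernel = rng.normal(size=5) * 0.1                 # stand-in for a learned kernel
w = rng.normal(size=8) * 0.01                     # stand-in modulation weights

flux_early = time_modulated_conv1d(temperature, kernel, t=0.1, emb_to_scale=w)
flux_late = time_modulated_conv1d(temperature, kernel, t=5.0, emb_to_scale=w)
print(flux_early.shape)  # prints (64,)
```

A trained model would learn the kernel, the embedding projection, and deeper convolutional stacks jointly; the sketch only shows how time conditioning changes the output of a fixed kernel.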
Data-efficient learning of exchange-correlation functionals with differentiable DFT
Mach. Learn.: Sci. Technol. 7 025001 (2026)
Abstract:
Machine learning (ML) density functional approximations (DFAs) have attracted considerable interest in recent years, and are often touted as replacements for well-established non-empirical DFAs, which still dominate the field. Although highly accurate, ML-DFAs typically rely on large amounts of data, are computationally expensive, and fail to generalize beyond their training domain. In this work, we show that differentiable DFT with Kohn–Sham regularization can be used to accurately capture the behavior of known local density approximations from small sets of synthetic data without using localized density information. At the same time, our analysis shows a strong dependence of the learning on both the amount and type of data as well as on model initialization. By enabling accurate learning from sparse energy data, this approach paves the way towards the development of custom ML-DFAs trained directly on limited experimental or high-level quantum chemistry datasets.
A statistical theory of electronic degrees of freedom in wave packet molecular dynamics
University of Oxford (2026)