Strong Constraints on Neutrino Nonstandard Interactions from TeV-Scale ν_{μ} Disappearance at IceCube.
Physical Review Letters 129:1 (2022) 011804
Abstract:
We report a search for nonstandard neutrino interactions (NSI) using eight years of TeV-scale atmospheric muon neutrino data from the IceCube Neutrino Observatory. By reconstructing incident energies and zenith angles for atmospheric neutrino events, this analysis presents unified confidence intervals for the NSI parameter ε_{μτ}. The best-fit value is consistent with no NSI at a p value of 25.2%. With a 90% confidence interval of -0.0041 ≤ ε_{μτ} ≤ 0.0031 along the real axis and similar strength in the complex plane, this result is the strongest constraint on any NSI parameter from any oscillation channel to date.
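As background (not stated in the abstract itself), ε_{μτ} is conventionally the off-diagonal μτ element of the nonstandard matter potential added to the standard oscillation Hamiltonian; in one common normalization (conventions differ, e.g. electron versus quark number density) the relevant term in the (ν_μ, ν_τ) subspace is

H_{\mathrm{NSI}} \supset \sqrt{2}\, G_F N_e
\begin{pmatrix}
  0 & \epsilon_{\mu\tau} \\
  \epsilon_{\mu\tau}^{*} & 0
\end{pmatrix},

so a nonzero real or imaginary part of ε_{μτ} distorts the matter-affected ν_μ disappearance probability at TeV energies, which is what the reconstructed energy and zenith-angle distributions constrain.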
The Higgs boson turns ten.
Nature 607:7917 (2022) 41-47
Abstract:
The discovery of the Higgs boson, ten years ago, was a milestone that opened the door to the study of a new sector of fundamental physical interactions. We review the role of the Higgs field in the Standard Model of particle physics and explain its impact on the world around us. We summarize the insights into Higgs physics revealed so far by ten years of work, discuss what remains to be determined and outline potential connections of the Higgs sector with unsolved mysteries of particle physics.
Anomalous Higgs boson couplings in weak boson fusion production at NNLO in QCD
(2022)
Leveraging universality of jet taggers through transfer learning
The European Physical Journal C 82 (2022) 564
Abstract:
A significant challenge in the tagging of boosted objects via machine-learning technology is the prohibitive computational cost associated with training sophisticated models. Nevertheless, the universality of QCD suggests that a large amount of the information learnt in the training is common to different physical signals and experimental setups. In this article, we explore the use of transfer learning techniques to develop fast and data-efficient jet taggers that leverage such universality. We consider the graph neural networks LundNet and ParticleNet, and introduce two prescriptions to transfer an existing tagger into a new signal based either on fine-tuning all the weights of a model or alternatively on freezing a fraction of them. In the case of W-boson and top-quark tagging, we find that one can obtain reliable taggers using an order of magnitude less data with a corresponding speed-up of the training process. Moreover, while keeping the size of the training data set fixed, we observe a speed-up of the training by up to a factor of three. This offers a promising avenue to facilitate the use of such tools in collider physics experiments.
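To illustrate the two prescriptions in code (a hedged sketch, not code from the paper: the model, layer split and hyperparameters below are placeholders rather than LundNet or ParticleNet), fine-tuning all weights versus freezing a fraction of a pretrained tagger can be set up along these lines:

# Hypothetical sketch of the two transfer-learning prescriptions described above:
# (a) fine-tune all weights of a pretrained tagger, or (b) freeze a fraction of them
# and retrain only the remaining layers on the new signal.
import torch
import torch.nn as nn

class ToyTagger(nn.Module):
    """Stand-in for a pretrained jet tagger (backbone + classification head)."""
    def __init__(self, n_features=16, n_hidden=64):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        self.head = nn.Linear(n_hidden, 2)  # signal vs QCD background

    def forward(self, x):
        return self.head(self.backbone(x))

def make_finetune_optimizer(model, lr=1e-4):
    """Prescription (a): every weight stays trainable and is updated on the new signal."""
    return torch.optim.Adam(model.parameters(), lr=lr)

def make_frozen_optimizer(model, lr=1e-3):
    """Prescription (b): freeze the backbone (a fraction of the weights), retrain only the head."""
    for p in model.backbone.parameters():
        p.requires_grad = False
    trainable = [p for p in model.parameters() if p.requires_grad]
    return torch.optim.Adam(trainable, lr=lr)

if __name__ == "__main__":
    # Pretend the tagger was pretrained on one signal (e.g. W tagging) and is now
    # transferred to another (e.g. top tagging) with a much smaller dataset.
    model = ToyTagger()
    opt = make_frozen_optimizer(model)
    x = torch.randn(32, 16)          # batch of per-jet feature vectors
    y = torch.randint(0, 2, (32,))   # signal/background labels
    loss = nn.functional.cross_entropy(model(x), y)
    loss.backward()
    opt.step()

In prescription (b) only the parameters left trainable are handed to the optimizer, which is what makes the transfer cheap when the new-signal dataset is an order of magnitude smaller.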