Even in Antarctica, scientists have to deal with a little dust in their detectors.
The IceCube Neutrino Observatory is an array of over 5,000 optical sensors embedded in a cubic kilometer of ice at the South Pole. Ice has ideal properties for detecting neutrinos, so IceCube physicists took advantage of Antarctica’s abundant supply to construct their state-of-the-art neutrino observatory.
But this comes with its own set of challenges. Most particle physics experiments are built with materials that can be characterized in the lab before installation. The naturally occurring ice in which IceCube is embedded, by contrast, contains impurities that reflect the climate history and atmospheric composition of the Earth. These impurities—all of which are referred to as “dust,” even though mineral dust is only one major component—affect how light travels through the IceCube detector and thus how neutrino interactions appear.
In a technical paper submitted to the Journal of Cosmology and Astroparticle Physics earlier this week, the IceCube Collaboration presents a new method, called the SnowStorm method, for understanding the optical properties of the ice. More specifically, the paper explores how SnowStorm can be applied to high-precision analyses, including upcoming sterile neutrino results.
As IceCube collects more data, it can produce increasingly precise measurements of the behavior of neutrinos. To make these measurements, though, researchers must better understand how the detector responds. Some of the major uncertainties in the high-precision study of neutrinos in IceCube involve the properties of the glacial ice at the South Pole.
One large source of systematic uncertainty is the distribution of dust in the ice. The effective dust concentration as a function of depth is difficult to estimate: constraining it directly would require simulating neutrino events for every reasonable ice model configuration, which is at present computationally infeasible. In addition, the dust concentration varies continuously with depth, and the uncertainty on this continuous function needs to be constrained and propagated without introducing an overwhelming number of nuisance parameters.
So IceCube collaborators, led by University of Texas at Arlington (UTA) assistant professor of physics Ben Jones, developed SnowStorm, a fundamentally new way of simulating the IceCube experiment. In conventional Monte Carlo simulations, scientists simulate events with different neutrino energies and angles but fixed detector properties; Jones and his collaborators instead randomized the detector’s optical properties in every simulated event. They then developed a mathematical framework to use those events to understand how the uncertain ice properties would affect measurements of neutrino properties.
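The core idea can be illustrated with a toy sketch. Everything below is hypothetical (a one-parameter stand-in for the ice model and an invented detector response, not the collaboration’s simulation code): each simulated event gets its own random draw of the nuisance parameter, and the effect of that parameter on a binned measurement is then recovered by linear regression over the whole event sample.

```python
import numpy as np

rng = np.random.default_rng(42)

# Per-event randomization: instead of one simulation set per ice model,
# draw a random nuisance value (e.g. a dust-scattering scale) for every
# simulated event. Prior width and response model are assumptions.
n_events = 100_000
sigma_prior = 0.1                                # assumed prior width
delta = rng.normal(0.0, sigma_prior, n_events)   # per-event nuisance draw

# Hypothetical detector response: the reconstructed observable shifts
# slightly with the nuisance value.
true_energy = rng.exponential(1.0, n_events)
observed = true_energy * (1.0 + 0.5 * delta)

# Bin the observable and estimate d<N_b>/d(delta) for each bin b by
# regressing bin membership on the sampled nuisance values.
edges = np.linspace(0.0, 5.0, 11)
idx = np.digitize(observed, edges) - 1
n_bins = len(edges) - 1
grad = np.zeros(n_bins)
for b in range(n_bins):
    in_bin = (idx == b).astype(float)
    # slope of P(event in bin b) vs. delta, scaled to a bin-count gradient
    grad[b] = n_events * np.cov(in_bin, delta)[0, 1] / sigma_prior**2

# Propagate: covariance of the binned expectation induced by the nuisance,
# under the linear-response approximation.
cov = sigma_prior**2 * np.outer(grad, grad)
```

A single randomized sample thus replaces many fixed-parameter simulation sets: the gradients come for free from the same events used for the analysis, which is what makes the approach computationally tractable.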
“This is a new technique for particle physics experiments, with applicability beyond IceCube, and so we decided to publish it so that other experimental physics collaborations can put its power to use,” says Jones.
The SnowStorm method satisfied all the checks to which it was subjected. It has been adopted as the ice uncertainty model for the upcoming search for sterile neutrinos, in which hundreds of thousands of detected neutrinos will be used to test for the existence of hypothetical new neutrinos that have been suggested by other experiments, like the Liquid Scintillator Neutrino Detector (LSND) and MiniBooNE.
The IceCube group at UTA is now working to extend the use of these powerful techniques to other analyses in IceCube, including the high-energy astrophysical neutrino measurements. They also hope to extend their applicability to other sources of systematic uncertainty, such as detector efficiencies and neutrino flux properties.
“Efficient propagation of systematic uncertainties from calibration to analysis with the SnowStorm method in IceCube,” The IceCube Collaboration: M. G. Aartsen et al. Submitted to the Journal of Cosmology and Astroparticle Physics. arxiv.org/abs/1909.01530