The XENON experiment is a 3500 kg liquid xenon detector built to search for elusive dark matter. Have a look at the description of our detection principle, our recent publications, some pictures, or materials for press contacts. Feel free to contact us with your questions.
Five members of the University of Zurich group participated at the 2019 Swiss-Austrian Physical Society Meeting in Zurich, Switzerland.
Adam Brown contributed a poster on the XENONnT upgrades and status, and Ricardo Peres one on the software for the supernova early warning system.
Giovanni Volta, Michelle Galloway, and Chiara Capelli contributed talks on the general XENON1T results, the ongoing search for dark absorption, and the analysis of high-energy events, respectively. The full talks are linked, and a key slide from each is shown below: the spin-independent elastic WIMP-nucleon scattering limits at 90% CL are still the most sensitive limits on WIMP dark matter; the motivation for light dark matter searches is becoming ever more pressing; and our reconstruction of single-site and multiple-site interactions for the neutrinoless double beta decay search significantly improves our capability to contribute to this exciting science channel.
Featuring several kilometers of cables, dozens of analog electronics modules, crates of purpose-built specialty computers, and backed by a small server farm, the XENON1T data acquisition system (DAQ) was designed to put our data onto disks. The XENON Collaboration recently published a technical paper on our DAQ in JINST, of course also available on arXiv.
The XENON1T detector measures light, which creates analog electrical signals in 248 independent photosensors. The DAQ is responsible for converting these analog signals to a digital, storage-ready format, deciding which types of aggregate signal indicate the presence of a physical interaction in the detector, and recording all the interesting data to disk for later storage and analysis.
There are a couple of novel aspects to this system. The first is that data is streamed constantly from the readout electronics onto short-term storage, recording all signals above a single photoelectron with high (>93%) efficiency. This differs from a conventional data acquisition system, which usually requires certain hardware conditions to be met before acquisition begins, a mechanism called a trigger. We defer our trigger to the software stage, giving us a very low energy threshold.
The software trigger itself was implemented as a database query, which is another novel aspect of the system. Pre-trigger data was stored in a MongoDB NoSQL database and the trigger logic scanned the database looking for signals consistent with S1s (light) and S2s (charge). If the algorithm found a matching signal, it would retrieve all the nearby data from the database and write it to storage. Because of the speed of NoSQL databases, this worked the same in both dark matter search mode, where we record just a few counts per second, and calibration modes, where we could record hundreds of counts per second.
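The scan-and-retrieve logic described above can be illustrated with a minimal sketch. This is not the actual XENON1T trigger code (which queries a live MongoDB instance); it is a pure-Python toy operating on an in-memory list of photoelectron hits, and all thresholds and window sizes are invented for illustration.

```python
# Illustrative sketch of a software trigger (NOT the actual XENON1T code):
# scan time-sorted single-photoelectron pulses for clusters dense enough to
# look like an S2, then keep all pulses within a window around each match.
# All thresholds and window sizes below are hypothetical.

def software_trigger(pulses, s2_threshold=10, cluster_window=2_000,
                     save_window=1_000_000):
    """pulses: list of (time_ns, channel) hits, sorted by time.
    Returns the trigger times and the pulses saved around each trigger."""
    triggers = []
    i = 0
    while i < len(pulses):
        # count hits falling inside the clustering window starting at pulse i
        j = i
        while j < len(pulses) and pulses[j][0] - pulses[i][0] <= cluster_window:
            j += 1
        if j - i >= s2_threshold:   # many hits close together: S2-like
            triggers.append(pulses[i][0])
            i = j                   # skip past this cluster
        else:
            i += 1
    # retrieve everything near each trigger (the "query nearby data" step)
    saved = [p for p in pulses
             if any(abs(p[0] - t) <= save_window for t in triggers)]
    return triggers, saved
```

In the real system the "retrieve nearby data" step is a range query against the pre-trigger database rather than a list comprehension, but the logic is the same.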
To complete the high-tech upgrade of our system, we also ran the user interface as a web service. This means the system could be controlled from laptops, smartphones, or tablets anywhere with a 4G connection, contributing to the high uptime of the detector.
The DAQ is currently being updated to double its capacity to read out the XENONnT detector, so stay tuned.
A talk on the measurement of the double electron capture half-life of xenon-124 with the XENON1T experiment was given at the Lepton Photon 2019 conference in Toronto in August 2019. Ethan Brown from Rensselaer Polytechnic Institute presented this exciting result, demonstrating the power of the ultra-low background in XENON1T. This yielded a half-life of 1.8×10^22 years, the slowest process ever directly observed, a trillion times longer than the age of the Universe.
Some of the dark matter search results were also presented in this talk, advertising the incredible success of the XENON program and the science reach of the XENON1T experiment in rare event detection.
XENON1T recently released a preprint with new world-leading constraints on light dark matter particles.
The challenge of light dark matter
The XENON1T detector aims to find the signals of dark matter bouncing off xenon atoms.
If such a collision happens, it produces two signals: a small light flash (S1), and a cloud of free electrons that can be drifted up and extracted out of the detector (S2).
Figure: How dark matter would make S1 and S2 signals in the XENON1T detector.
However, dark matter lighter than about six proton masses (6 GeV/c^2) cannot push the heavy xenon atoms (131 GeV/c^2) hard enough to make efficiently detectable S1s. XENON1T needs both S1 and S2 to accurately reconstruct where in the detector an event happened: the time between the S1 and S2 signals reveals the depth of the event. Events at the top and bottom edges of the detector are common due to radioactive backgrounds, and if we cannot reject them, dark matter searches lose sensitivity. Thus, most strong constraints on light dark matter have, until now, come from different detectors, mostly using ultra-low temperature crystals made of germanium, silicon, or calcium tungstate.
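The depth reconstruction mentioned above is a simple conversion: electrons drift upward at a roughly constant speed, so the S1-S2 delay maps linearly to depth. A minimal sketch, using an assumed round-number drift velocity rather than the exact XENON1T value:

```python
# Sketch of depth reconstruction from the S1-S2 time difference.
# The drift velocity below is an assumed illustrative value (electrons in
# liquid xenon drift at roughly this speed), not the exact XENON1T number.

DRIFT_VELOCITY_MM_PER_US = 1.3  # assumed electron drift speed, mm per microsecond

def depth_from_drift_time(drift_time_us):
    """Depth below the liquid surface, in mm, for a given S1-S2 delay."""
    return drift_time_us * DRIFT_VELOCITY_MM_PER_US
```

An event with a 100 microsecond delay between S1 and S2 would then sit about 130 mm below the liquid surface; without an S1, this handle on depth is lost.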
The S2-only technique
XENON1T’s new preprint uses an “S2-only analysis”, in which events without S1s are still considered. Advances in detector construction and analysis techniques led to a background level a thousand times lower than previously achieved in S2-only searches.
For example, the S2 electron cloud becomes broader as it drifts upward, like a drop of ink spreading out in water. The deeper the event, the broader the cloud, and the longer the S2 signal lasts. Thus XENON1T could reject most of the events at the top and bottom, even without the S1, by rejecting very short and very long S2 signals.
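The diffusion argument above can be sketched in a few lines: the electron cloud spreads like the square root of the drift time, so the S2 width alone carries depth information. The diffusion constant, drift speed, and cut boundaries below are made-up illustrative numbers, not the XENON1T analysis values.

```python
import math

# Sketch of a width-based depth cut (illustrative numbers throughout):
# the S2 spread grows like sqrt(drift time) due to diffusion, so events
# without an S1 can still be cut on S2 width alone.

D_L = 25.0      # assumed longitudinal diffusion constant, cm^2/s
V_DRIFT = 0.13  # assumed electron drift speed, cm per microsecond

def expected_s2_width_us(drift_time_us):
    """Expected S2 spread (one sigma, in microseconds) from diffusion."""
    t_s = drift_time_us * 1e-6                # drift time in seconds
    sigma_cm = math.sqrt(2 * D_L * t_s)       # diffusion spread in cm
    return sigma_cm / V_DRIFT                 # convert to time via drift speed

def passes_width_cut(width_us, lo=0.3, hi=2.5):
    """Reject S2s that are too short (top of detector) or too long
    (bottom of detector); the accepted band is illustrative."""
    return lo <= width_us <= hi
```

The key property is monotonicity: deeper events always produce broader S2s, so cutting on very short and very long widths removes the top and bottom of the detector even without an S1.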
Most theorists predict that dark matter would collide with the heavy xenon nuclei and produce “nuclear recoils”. For these, the S2-only technique is sensitive to 2-3x lower energies than traditional analyses. Thus, we get improved constraints on light dark matter:
Figure: New XENON1T limits (black lines) on light dark matter. The colored lines show previous results, including other results from XENON1T in blue.
In some models, dark matter collides with electrons around the nucleus, and produces “electronic recoils”. These make much larger S2 signals than nuclear recoils of the same S1 size. S2-only searches thus improve the energy threshold for these models by as much as a factor of ten. Combined with the lower background, XENON1T’s S2-only results thus improve the constraints on such models by several orders of magnitude:
Figure: New XENON1T limits on scattering of dark matter on electrons. (The dashed line is the same analysis repeated with more conservative assumptions.)
For more information, please see our arXiv preprint at https://arxiv.org/abs/1907.11485.
Since the first release of dark matter search results based on the 1 tonne-year exposure of the XENON1T experiment, the collaboration has published more WIMP signal searches based on the same dataset. Those articles are usually brief and focus on communicating the scientific results.
To give more detail on the XENON1T dark matter analysis, we previously published a paper focusing on the signal and background models and the statistical inference using this data. It is now complemented by a new article that details the challenges of detector characterization and data preparation, before the data is ready to be used for model building and statistical inference to make statements on dark matter.
The XENON1T experiment performed two science runs between October 2016 and February 2018, reaching a total data livetime of 279 days. During that time the detector had to be operated in a very stable mode to ensure undistorted signals. Conditions that do change over time have to be modeled so that they can be taken into account during data analysis and included in the models. One example is the behavior of the photosensors. Each sensor has an individual amplification factor, i.e. gain, that is a function of the applied high voltage. A few sensors developed malfunctions during the science runs, because of which their amplification factor decreased over time, or their voltage had to be reduced, resulting in a sudden decrease of the amplification. These variations are shown in red and black for two sensors as a function of time in the following figure, while green, blue, and magenta show stable sensors, representative of the majority of the XENON1T light detectors.
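The gain tracking described above boils down to a per-sensor lookup table: each recorded pulse area is divided by the gain that was valid at the time of the event. A minimal sketch, with invented sensor names and gain values:

```python
import bisect

# Sketch of time-dependent gain correction (all names and numbers invented):
# each photosensor's gain is calibrated at intervals, and raw pulse areas
# are converted to photoelectrons using the gain valid at the event time.

# (timestamp_day, gain) calibration points per sensor
GAIN_HISTORY = {
    "pmt_042": [(0, 2.0e6), (100, 1.8e6), (200, 1.5e6)],  # degrading sensor
    "pmt_007": [(0, 3.0e6), (100, 3.0e6), (200, 3.0e6)],  # stable sensor
}

def gain_at(sensor, day):
    """Most recent calibrated gain at or before `day` (step interpolation)."""
    history = GAIN_HISTORY[sensor]
    idx = bisect.bisect_right([t for t, _ in history], day) - 1
    return history[max(idx, 0)][1]

def area_in_photoelectrons(sensor, day, raw_area):
    """Convert a raw integrated pulse area to photoelectrons using the
    gain valid at the time of the event."""
    return raw_area / gain_at(sensor, day)
```

With a correction like this, the same physical light signal reconstructs to the same number of photoelectrons whether it was recorded before or after a sensor's gain drifted.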
As soon as the detector operating conditions are modeled, the data is put through selection criteria that reduce the number of background-like signatures and therefore enhance the signal-to-background ratio. The criteria are grouped into four general types.
Modelling what an electronic or nuclear recoil looks like in the detector is crucial both to know the shape of a WIMP signal and to model the backgrounds well. XENON1T uses a comprehensive fit to multiple calibration sources to constrain the distributions of backgrounds and signals in the analysis space: S1, S2, and the radius from the central axis of the detector. Some background components are harder to model directly and are estimated using sidebands or other data samples. In the XENON1T analysis, coincidences between unrelated, lone S1 and S2 events were modeled this way, in addition to the surface background: events occurring close to or at the detector wall.
The models of each background and the signal, for two separate science runs, are put together in a likelihood, which is a mathematical function of the WIMP signal strength as well as nuisance parameters. These are unknowns that could change the analysis, such as the true expectation value for each background component. The likelihood also contains multiple terms representing measurements of nuisance parameters, which constrain them when the likelihood is fitted to the data collected by XENON1T. The value of the likelihood evaluated at a specific signal strength has a random distribution, which is estimated using simulated realizations of the experimental outcome. The final statistical limits are computed by comparing the likelihood computed on the actual data with the distributions found from the simulations.
XENON was on the agenda at the European Physical Society Conference on High Energy Physics 2019 (EPS-HEP2019), which was held in Ghent, Belgium in the middle of July. The talk, presented by Adam Brown from the University of Zurich group, concentrated on results from XENON1T and also provided an overview of the work which is well underway to build the next generation detector, XENONnT.
Among the results shown were our searches for elastic WIMP scattering and the recently published observation of double electron capture in 124Xe. The slides can be downloaded here. While the XENONnT upgrade currently in progress at Gran Sasso features many improvements over the XENON1T detector, Adam summarized four major ones in one colorful slide.
A talk on the XENON project was given at the 15th Patras Workshop on Axions, WIMPs and WISPs, which was held in Freiburg (Germany) in the first week of June. Andrea Molinario from the Gran Sasso Science Institute and Laboratori Nazionali del Gran Sasso presented the most recent results from the data analysis of XENON1T, in particular the search for WIMP-nucleon spin-dependent and spin-independent interactions. The sensitivity of this search will be much improved by the upcoming XENONnT phase of the experiment.
The first observation of 124Xe double electron capture and the measurement of the half-life of the process were also shown (this topic had a dedicated talk by Sebastian Lindemann). In the second part of his talk, Andrea gave an update on the status of XENONnT. The presentation is available here.
Our latest XENON1T paper on the details of our analysis was presented at Low Radioactivity Techniques, a conference focused on low-background experiments. The talk (which you can find here) described the response model of the detector, the challenges of background modeling, and the techniques used. In a low-background experiment it is often hard to assess the expected distribution of events due to limited statistics and many subtle effects. The talk described a novel technique for introducing a well-motivated systematic uncertainty to the background model based on a calibration sample, which can be relevant to other low-background experiments.
The XENONnT dual-phase xenon TPC requires two regions with different electric fields to drift, extract, and accelerate the small number of ionization electrons that are created by a possible dark matter interaction with xenon nuclei. These fields will be created with a total of five electrodes that are biased at constant electric potentials from the top to the bottom of the TPC. The challenges of building these large electrodes, almost 1.5 meters in diameter, with very thin wires include stringent requirements on optical transparency, wire sagging, field uniformity, and high-voltage stability.
Such a challenging project is carried out as a collaborative effort drawing on many areas of expertise within the XENON collaboration. The design and production of the electrodes are led by Dr. Carla Macolino and realized by researchers from the Laboratoire de l’Accélérateur Linéaire, Rice University, University of California San Diego, and University of Coimbra, with further technical design and electric field simulation support from the University of Chicago and Freiburg University. A special instrument was designed and built by the University of Münster to measure the tension of every individual wire. Finally, strict cleanliness requirements are met thanks to the expertise at MPI for Nuclear Physics and technical support from Nikhef.