Description
One of the challenges facing the system-level design of the ATLAS ITk Strip Detector is understanding the TID-induced leakage current in the chosen 130 nm CMOS technology. The effect of ionizing radiation on the current increase of the ABC130 readout ASIC has been studied at various dose rates and temperatures using X-ray tubes and Co-60 sources. In addition, the efficacy of pre-irradiating chips and the variation across wafers and batches have been studied. The results shown here allow a better understanding of the effect of TID on final detector systems and inform the associated system design.
Summary
Central to the design of any detector system is a detailed understanding of the current and power dissipation, owing to their implications for power-supply design, thermal management and mechanical stability. It is well documented that certain 130 nm CMOS technologies exhibit an increase in leakage current when exposed to ionizing radiation. Such a 130 nm technology is employed in the readout ASICs of the ATLAS ITk Strip Tracker Upgrade. Using the so-called “ABC130” prototype chipset, measurements will be presented which allow us to parameterise the increase in current as a function of dose rate and temperature in the region of phase space most applicable to HL-LHC conditions. Studies investigating the batch-by-batch, chip-by-chip and wafer-by-wafer variation of the total current increase will also be presented; these demonstrate a significant batch-by-batch variation alongside non-negligible variations within wafers. Furthermore, studies will be shown investigating the long-term annealing of irradiated chips (storing chips at 80 °C for up to 4 months) and the feasibility of pre-irradiating chips using a Co-60 source.
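To illustrate what a dose-rate and temperature parameterisation of this kind can look like, the sketch below fits a simple empirical model to synthetic data. The functional form (a power law in dose rate with an Arrhenius-like temperature factor), the parameter values, and the data are all hypothetical assumptions for demonstration, not the collaboration's published model or results.

```python
import numpy as np

KB = 8.617e-5  # Boltzmann constant [eV/K]

def delta_i(dose_rate, temp_k, amp, alpha, ea):
    """Illustrative (assumed) model for the peak TID-induced current
    increase: dI = A * D**alpha * exp(Ea / (kB * T)).
    dose_rate D in krad/h, temperature T in K."""
    return amp * dose_rate**alpha * np.exp(ea / (KB * temp_k))

# Synthetic "measurements" on a small dose-rate/temperature grid,
# with hypothetical parameter values and 2% multiplicative noise.
rng = np.random.default_rng(0)
d = np.repeat([0.5, 2.0, 10.0, 50.0], 3)   # dose rates [krad/h]
t = np.tile([253.0, 273.0, 293.0], 4)      # temperatures [K]
true = dict(amp=1.0e-3, alpha=-0.3, ea=0.2)
di = delta_i(d, t, **true) * rng.normal(1.0, 0.02, d.size)

# Log-linearise:  ln dI = ln A + alpha * ln D + Ea / (kB * T),
# then solve the linear system by least squares.
design = np.column_stack([np.ones_like(d), np.log(d), 1.0 / (KB * t)])
coef, *_ = np.linalg.lstsq(design, np.log(di), rcond=None)
amp_fit, alpha_fit, ea_fit = np.exp(coef[0]), coef[1], coef[2]
print(f"alpha = {alpha_fit:.3f}, Ea = {ea_fit:.3f} eV")
```

Log-linearising keeps the fit a plain linear least-squares problem; in practice one would instead fit the measured current-vs-dose curves directly, with uncertainties, using a nonlinear optimiser.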