Abstract

In the present article, we detail the method used to experimentally determine the power of the CROCUS zero-power reactor and to subsequently calibrate its ex-core monitor fission chambers. Knowledge of the reactor power is mandatory for safe operation. Furthermore, most experimental research programs rely on absolute fission rates in design and interpretation – for instance, tally normalization of reaction rate studies in dosimetry, or normalization of power spectral density in neutron noise measurements. The associated uncertainties can only be minimized by an accurate power determination method. The main experiment consists of irradiating, and thereby activating, several axially distributed Au-197 foils along the central axis of the core; their activities are measured with a High-Purity Germanium (HPGe) gamma spectrometer. The effective cross sections are determined by MCNP and Serpent Monte Carlo simulations. We quantify the reaction rate of each gold foil and derive the corresponding fission rate in the reactor. The variance-weighted average over the distributed foils then provides a calibration factor for the count rates measured in the fission chambers during the irradiation. We detail the calibration process with minimization of the uncertainties arising from each sub-step, from power control after reactivity insertion to the calibration of the HPGe gamma spectrometer. Biases arising from different nuclear data choices are also discussed.

Highlights

  • The CROCUS reactor is a two-zone, uranium-fuelled, light-water-moderated facility operated by the Laboratory for Reactor Physics and Systems Behaviour (LRS) at the Swiss Federal Institute of Technology Lausanne (EPFL)

  • Most experimental research programs rely on absolute fission rates for design and interpretation [20] – for instance, tally normalization of reaction rate studies in dosimetry, or normalization of power spectral density in neutron noise measurements

  • The experimentally determined reaction rates were averaged by weighting in order to calculate a single calibration factor for each monitor and irradiation
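
The inverse-variance weighting mentioned above can be sketched as follows. This is a minimal illustration, not the authors' code; all numerical values (foil reaction rates, uncertainties, monitor count rate) are hypothetical placeholders.

```python
import numpy as np

# Hypothetical measured reaction rates (reactions/s) for axially
# distributed Au-197 foils, with their 1-sigma uncertainties.
rates = np.array([1.02e5, 0.98e5, 1.01e5, 0.99e5])
sigmas = np.array([2.0e3, 1.8e3, 2.2e3, 1.9e3])

# Inverse-variance weights: each foil contributes in proportion
# to the precision of its measurement.
weights = 1.0 / sigmas**2
mean_rate = np.sum(weights * rates) / np.sum(weights)

# Standard uncertainty of the weighted mean (smaller than any
# individual foil uncertainty).
sigma_mean = 1.0 / np.sqrt(np.sum(weights))

# Calibration factor relating the monitor fission chamber's count
# rate (counts/s, hypothetical) to the weighted-mean reaction rate.
monitor_count_rate = 3.5e4
calibration_factor = monitor_count_rate / mean_rate
```

The weighted mean down-weights the less precise foils, so the combined estimate carries a lower uncertainty than any single measurement.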


Summary

Introduction

With a maximum power of 100 W, CROCUS is a zero-power reactor used for teaching and research purposes, most recently for studies on intrinsic and induced neutron noise, highly-localized measurements, and nuclear data [1]–[19]. Most experimental research programs rely on absolute fission rates for design and interpretation [20] – for instance, tally normalization of reaction rate studies in dosimetry, or normalization of power spectral density in neutron noise measurements. We present hereafter the method used to determine the reactor power and to subsequently calibrate the ex-core monitor fission chambers [21]–[23]. We detail the calibration process with minimization of the uncertainties arising in each sub-step, from power control after reactivity insertion to the calibration of the HPGe gamma spectrometer. Biases arising from different nuclear data choices are discussed.
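The step from a measured foil activity to a reaction rate follows standard activation analysis: the HPGe-measured activity is decay-corrected back to the end of irradiation, then divided by the saturation factor. A minimal sketch, assuming constant power during irradiation; the irradiation time, cooling time, and measured activity are hypothetical values, and the Au-198 half-life (≈2.695 d) is taken from standard nuclear data.

```python
import math

# Au-198 decay constant from its half-life (~2.695 days).
half_life_s = 2.695 * 24 * 3600
lam = math.log(2) / half_life_s          # decay constant (1/s)

t_irr = 2 * 3600                          # irradiation time (s), hypothetical
t_cool = 12 * 3600                        # cooling time before counting (s), hypothetical
A_measured = 5.0e3                        # HPGe-measured activity (Bq), hypothetical

# Decay-correct the activity back to the end of irradiation.
A_eoi = A_measured * math.exp(lam * t_cool)

# Saturation factor: fraction of the asymptotic activity reached
# after irradiating for t_irr at constant power.
saturation = 1.0 - math.exp(-lam * t_irr)

# Au-197(n,gamma) reaction rate in the foil (reactions/s).
reaction_rate = A_eoi / saturation
```

Dividing this reaction rate by the effective cross section and foil atom density (obtained here from the MCNP and Serpent simulations mentioned above) yields the flux, and hence the core fission rate.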

