Abstract

Due to the complexity of nuclear reaction models, current nuclear data evaluations must rely on experimental observations to constrain models and provide the accuracy needed for applications. For criticality applications, the accuracy of nuclear data needed is higher than what is currently achievable from differential experiments alone, and integral measurements are often used for data adjustment within the uncertainties of the differential experiments. This approach does not necessarily result in physically correct cross sections or other adjusted quantities, because compensation between different materials is hard to avoid. One of the objectives of the recent CIELO project [M. Chadwick et al., Nucl. Data Sheets 118, 1 (2014)] was the simultaneous evaluation of important materials in an attempt to minimize the effects of compensation. Improving the evaluation process depends on obtaining new experimental data with higher accuracy and lower uncertainty that help constrain the evaluations for certain important reactions. This can be achieved both by careful design of conventional experiments and by devising new, innovative ones. New and unconventional experiments do not necessarily provide differential data, but they provide nuclear data that evaluators can use to constrain the evaluation and reduce its uncertainty; this also makes close information exchange and collaboration between experimentalists and evaluators important. For conventional experiments such as neutron transmission or capture measurements, it is important to understand the sources of uncertainty and address them in the experiment design. Such a process can also lead to the design of innovative methods: for example, the filtered-beam method minimizes uncertainties due to background, and the quasi-differential neutron scattering method simplifies the experiment and data analysis, resulting in lower experimental uncertainty.
A review of the sources of uncertainty in various experiments is presented, with examples of experimental techniques that reduce experimental and evaluation uncertainty and increase accuracy.

Highlights

  • Accurate nuclear data is required for accurate calculation of nuclear reactors, criticality safety, shielding, and other applications

  • The term conventional refers to established methods; for example, measurements of neutron transmission and capture documented in reference [4]

  • The transmission can be calculated as the ratio of the background-corrected sample-in to sample-out count rates for each TOF bin
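The transmission calculation in the highlight above can be sketched numerically. The counts and the areal density below are illustrative placeholders, not data from the paper; the relation T = exp(-n·σ) used to back out a total cross section is the standard transmission formula:

```python
import numpy as np

# Hypothetical counts per time-of-flight (TOF) bin (illustrative only)
sample_in  = np.array([1200.0,  950.0,  400.0])   # sample-in counts
bkg_in     = np.array([ 100.0,   90.0,   80.0])   # sample-in background
sample_out = np.array([2000.0, 1800.0, 1600.0])   # sample-out (open beam) counts
bkg_out    = np.array([ 110.0,   95.0,   85.0])   # sample-out background

# Transmission per TOF bin: ratio of background-corrected count rates
transmission = (sample_in - bkg_in) / (sample_out - bkg_out)

# Total cross section per bin from T = exp(-n * sigma), for a
# hypothetical sample areal density n = 0.05 atoms/barn
n = 0.05
sigma_tot = -np.log(transmission) / n
```

In a real measurement the sample-in and sample-out runs would also be normalized to a beam monitor before taking the ratio; that step is omitted here for brevity.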



Introduction

Accurate nuclear data is required for accurate calculation of nuclear reactors, criticality safety, shielding, and other applications. Nuclear physics models can calculate some quantities, such as average cross sections, but to obtain the accuracy required for these applications, experimental data must be used to constrain the models. For example, resonance-region evaluations are based on shape fitting of measured resonance data, and accurate and precise experiments are required to reduce uncertainties. Current evaluated nuclear data libraries are well established and are a topic of strong collaboration between evaluation groups across the world (for example, the CIELO project [1] and the WPEC collaboration [2]). Past experiments are documented in the EXFOR database [3], which is extensively used by evaluators. When new experiments are considered, the need should be well justified, including a target accuracy.
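The shape-fitting idea mentioned above can be sketched on synthetic data. The resonance energy, peak cross section, and width below are hypothetical, the single-level Breit-Wigner form is a deliberate simplification, and SciPy's general-purpose curve_fit stands in for the dedicated R-matrix analysis codes actually used in evaluations:

```python
import numpy as np
from scipy.optimize import curve_fit

# Simplified single-level Breit-Wigner resonance shape (illustrative;
# real evaluations use full R-matrix formalisms)
def slbw(E, sigma0, E0, Gamma):
    return sigma0 * (Gamma ** 2 / 4) / ((E - E0) ** 2 + Gamma ** 2 / 4)

# Synthetic "measured" cross sections around a hypothetical resonance
rng = np.random.default_rng(42)
E = np.linspace(5.0, 8.5, 80)                 # energy grid (eV)
sigma_meas = slbw(E, 7000.0, 6.67, 0.5) + rng.normal(0.0, 50.0, E.size)

# Shape fit: recover resonance parameters from the noisy data
popt, pcov = curve_fit(slbw, E, sigma_meas, p0=[5000.0, 6.5, 0.3])
sigma0_fit, E0_fit, Gamma_fit = popt
```

The fitted parameters (and their covariance, pcov) are the quantities an evaluation would carry forward, which is why the accuracy and precision of the underlying measurement matter directly.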

Innovative experiments
Background reduction
Transmission normalization
Neutron scattering
Conclusions