Abstract

This paper presents the initial results of a new project at Oak Ridge National Laboratory, supported by an internal seed funding program, to develop a novel computational capability for model validation: MAPPER. MAPPER will eliminate the need for empirical criteria such as the similarity indices often employed to identify applicable experiments for given application conditions. To achieve this, MAPPER uses an information-theoretic approach based on the Kullback-Leibler (KL) divergence principle to combine responses of available or planned experiments with application responses of interest. This is accomplished with a training set of samples generated by randomized execution of high-fidelity analysis models of the experiments and the application. These samples are condensed using reduced order modeling techniques into a joint probability distribution function (PDF) connecting each application response of interest with a new effective experimental response. MAPPER's initial objective is to support confirmation of criticality safety analyses of storage facilities, which require known keff biases for safe operation. This paper reports some of the initial results obtained with MAPPER as applied to a set of critical experiments for which existing similarity-based methods have been shown to provide inaccurate estimates of the biases.
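The bias-mapping idea described above can be illustrated with a small numerical sketch. The code below is not taken from MAPPER or SCALE; it is a minimal stand-in that assumes a linear, Gaussian joint PDF fitted to paired samples of an experimental keff and an application keff (all variable names and numbers are hypothetical), and uses Gaussian conditioning to propagate a measured experimental bias to the application response:

```python
# Hedged sketch: a bivariate-Gaussian stand-in for MAPPER's joint PDF.
# All names, sample sizes, and uncertainty magnitudes are illustrative.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training set: paired samples of an experimental response and an
# application response (e.g., keff values from randomized high-fidelity runs)
# that share a common nuclear-data uncertainty component.
n = 5000
shared = rng.normal(0.0, 0.008, n)                      # common uncertainty
k_exp = 1.000 + shared + rng.normal(0.0, 0.002, n)      # experiment keff
k_app = 0.950 + 0.8 * shared + rng.normal(0.0, 0.003, n)  # application keff

# Condense the samples into a (Gaussian) joint PDF: mean vector + covariance.
mu = np.array([k_exp.mean(), k_app.mean()])
cov = np.cov(k_exp, k_app)

def map_bias(measured_k_exp):
    """Conditional mean of the application response given the measured
    experimental response, under the fitted Gaussian joint PDF."""
    slope = cov[0, 1] / cov[0, 0]   # regression of application on experiment
    return mu[1] + slope * (measured_k_exp - mu[0])

# A measured experimental keff below the calculated mean implies a negative
# bias, scaled by the regression slope when mapped to the application.
bias_app = map_bias(0.996) - mu[1]
```

In this toy setting the joint PDF is fully specified by a mean and a covariance; MAPPER's reduced order modeling plays the analogous role of condensing the high-fidelity samples into a tractable joint distribution without the linear/Gaussian restriction.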

Highlights

  • Validation of computation method or model is a key requirement for all engineering applications, requiring the analyst to provide proof that models can accurately predict real behavior based on the available body of experiments and computer analysis results

  • The calculated keff values were used in training the MAPPER algorithm

  • This paper discusses the initial results for a prototypic implementation of the MAPPER sequence, a new tool within SCALE which was developed to support model validation by addressing the challenges of existing similarity-based and calibration-based techniques


Summary

INTRODUCTION

Validation of a computational method or model is a key requirement for all engineering applications: the analyst must provide evidence that models can accurately predict real behavior based on the available body of experiments and computer analysis results. It has been observed that the bias can change dramatically upon adding or removing experiments with low relevance, and that the bias can be overconfidently estimated with small uncertainties, making the results sensitive to the experiments used. An explanation for this complex behavior has not been recorded in the nuclear engineering literature, which has diminished the value of calibration-based techniques. MAPPER instead constructs a joint PDF connecting the experimental and application responses. This PDF provides a natural approach to mapping biases between the experimental and application domains by marginalizing biases over the PDF describing the measured response. The similarity index, a single number, can be derived from this PDF only after several simplifying assumptions, such as linearity, Gaussian uncertainties, and marginalization of the PDF over the common sources of uncertainty only. The degree of correlation between an experiment and the application can be quantitatively measured using the concept of mutual information, which provides a principled approach to optimizing and selecting the experiments that are best correlated with the application of interest.
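The mutual-information criterion mentioned above can be sketched numerically. The example below is illustrative, not MAPPER's implementation: it assumes the linear/Gaussian special case, in which the mutual information between two responses reduces to the closed form -0.5*ln(1 - rho^2), with rho the correlation coefficient (the quantity that similarity indices such as c_k approximate). The experiment names and uncertainty magnitudes are hypothetical:

```python
# Hedged sketch: ranking candidate experiments by mutual information with an
# application response, under linearity and Gaussian uncertainties.
import numpy as np

rng = np.random.default_rng(1)
n = 20_000
shared = rng.normal(size=n)   # common uncertainty source (e.g., nuclear data)

# Application keff samples and two hypothetical candidate experiments, one
# strongly and one weakly coupled to the shared uncertainty.
k_app = 1.0 + 0.010 * shared + 0.004 * rng.normal(size=n)
experiments = {
    "exp_A": 1.0 + 0.009 * shared + 0.002 * rng.normal(size=n),  # well correlated
    "exp_B": 1.0 + 0.002 * shared + 0.008 * rng.normal(size=n),  # weakly correlated
}

def gaussian_mi(x, y):
    """Mutual information (in nats) for jointly Gaussian responses."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

# Select experiments in order of decreasing shared information with the
# application; here exp_A should rank first.
ranking = sorted(experiments,
                 key=lambda name: gaussian_mi(k_app, experiments[name]),
                 reverse=True)
```

Outside the Gaussian case the same ranking idea applies, but the mutual information must be estimated from the sampled joint PDF rather than from a single correlation coefficient, which is the regime MAPPER targets.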

DETAILS OF IMPLEMENTATION
APPLICATION
Critical Experiments
RESULTS
CONCLUSIONS

