Abstract

An evaluation of a model's overall performance in simulating multiple fields is fundamental to model intercomparison and development. A multivariable integrated evaluation (MVIE) method was proposed previously based on a vector field evaluation (VFE) diagram, which provides a quantitative and comprehensive evaluation of multiple fields. In this study, we improve this method in the following respects. (1) We take area weighting into account in the definition of the statistics in the VFE diagram and the MVIE method, which is particularly important for global evaluation. (2) We extend the MVIE method to evaluate a combination of multiple scalar and vector fields, rather than multiple scalar fields alone as in the previous method. (3) A multivariable integrated skill score (MISS) is proposed as a flexible index to measure a model's ability to simulate multiple fields. Compared with the multivariable integrated evaluation index (MIEI) proposed in the previous study, MISS is a normalized index that can adjust the relative importance of different aspects of model performance. (4) A simple and straightforward tool, the Multivariable Integrated Evaluation Tool (MVIETool version 1.0), is developed to facilitate intercomparison of the performance of various models. Users can run the tool, coded in either the open-source NCAR Command Language (NCL) or Python3, to calculate the MVIE statistics and produce the corresponding plots. With the support of this tool, one can easily evaluate model performance in terms of each individual variable and/or multiple variables.
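As a minimal illustration of point (1), the sketch below shows how cosine-of-latitude area weights can enter basic field statistics (area-weighted mean, standard deviation, and centered pattern correlation). The function names and the NumPy-based layout are assumptions for illustration; this is not the MVIETool code itself.

    import numpy as np

    def area_weights(lat_deg, nlon):
        """Per-grid-cell weights proportional to cos(latitude), normalized to sum to 1."""
        w = np.repeat(np.cos(np.deg2rad(lat_deg))[:, None], nlon, axis=1)
        return w / w.sum()

    def weighted_stats(model, obs, w):
        """Area-weighted means, standard deviations, and centered pattern correlation."""
        mm, mo = np.sum(w * model), np.sum(w * obs)
        sm = np.sqrt(np.sum(w * (model - mm) ** 2))
        so = np.sqrt(np.sum(w * (obs - mo) ** 2))
        corr = np.sum(w * (model - mm) * (obs - mo)) / (sm * so)
        return mm, mo, sm, so, corr

    # Toy 3 x 4 latitude-longitude grid with a slightly perturbed "model"
    lat = np.array([-60.0, 0.0, 60.0])
    w = area_weights(lat, nlon=4)
    obs = np.random.default_rng(0).normal(size=(3, 4))
    model = obs + 0.1
    print(weighted_stats(model, obs, w))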

Highlights

  • An increasing number of model intercomparison projects (MIPs) have been carried out over the past decade (Eyring et al., 2016; Simpkins, 2017)

  • Because vector mean error (VME) is a function of the difference in the vector magnitude and direction, we provide two additional statistical metrics – the mean error of vector magnitude (MEVM) and the mean error of vector direction (MEVD) – to separate the magnitude error from the directional error (a minimal sketch follows this list)

  • To summarize and rank the overall performance of a model in simulating multiple fields, the multivariable integrated skill score (MISS) is provided in both centered and uncentered modes in the metrics table and is expected to provide a more accurate evaluation than the multivariable integrated evaluation index (MIEI)
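The following sketch gives one plausible reading of how a vector error can be split into magnitude and direction parts. The specific definitions used here (area-weighted mean length of the difference vectors, mean difference of vector magnitudes, mean angle between vectors) and the function name are assumptions and may differ from the exact VME/MEVM/MEVD formulas in the paper.

    import numpy as np

    def vector_error_stats(u_m, v_m, u_o, v_o, w):
        """Area-weighted overall, magnitude, and directional errors of a 2-D vector
        field (illustrative definitions, not necessarily those used in the paper)."""
        mag_m, mag_o = np.hypot(u_m, v_m), np.hypot(u_o, v_o)
        vme = np.sum(w * np.hypot(u_m - u_o, v_m - v_o))    # mean length of the difference vectors
        mevm = np.sum(w * (mag_m - mag_o))                   # mean magnitude error
        cos_ang = np.clip((u_m * u_o + v_m * v_o) / (mag_m * mag_o), -1.0, 1.0)
        mevd = np.sum(w * np.arccos(cos_ang))                # mean directional error in radians
        return vme, mevm, mevd

    # Toy example: four grid points with equal area weights
    w = np.full(4, 0.25)
    u_o, v_o = np.array([1.0, 2.0, -1.0, 0.5]), np.array([0.0, 1.0, 1.0, -0.5])
    u_m, v_m = 1.1 * u_o, v_o + 0.2
    print(vector_error_stats(u_m, v_m, u_o, v_o, w))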

Summary

Introduction

An increasing number of model intercomparison projects (MIPs) have been carried out over the past decade (Eyring et al., 2016; Simpkins, 2017). A multivariable integrated evaluation (MVIE) method was previously proposed, based on a vector field evaluation (VFE) diagram, to evaluate model performance in terms of multiple fields by grouping various normalized scalar fields into an integrated vector field. The MVIE method defined a multivariable integrated evaluation index (MIEI) to summarize a model's overall performance in simulating multiple fields. Although area weighting was considered in many previous statistical metrics, e.g., the correlation coefficient and standard deviation, these metrics were used to evaluate scalar fields rather than vector fields (e.g., Watterson, 1996; Boer and Lambert, 2001; Masson and Knutti, 2011). We develop a Multivariable Integrated Evaluation Tool (MVIETool version 1.0) to facilitate multimodel intercomparison. These efforts are expected to improve the accuracy and flexibility of the VFE and MVIE methods.
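As a minimal sketch of the grouping idea behind MVIE, the snippet below normalizes each scalar field by the area-weighted standard deviation of the corresponding reference field (an assumed choice of normalization), stacks the normalized fields into one vector field, and reduces the model-minus-reference vectors to a single area-weighted root-mean-square deviation. It illustrates the concept only and is not the MVIETool implementation or the exact MIEI formula.

    import numpy as np

    def integrated_vector_deviation(model_fields, ref_fields, w):
        """Normalize each scalar field by the reference's area-weighted standard
        deviation (an assumed normalization), stack the fields into one vector
        field, and return the area-weighted RMS length of the difference vectors."""
        diffs = []
        for m, r in zip(model_fields, ref_fields):
            r_mean = np.sum(w * r)
            r_std = np.sqrt(np.sum(w * (r - r_mean) ** 2))
            diffs.append((m - r) / r_std)        # one normalized component per field
        diff = np.stack(diffs)                    # shape: (n_fields, n_points)
        return np.sqrt(np.sum(w * np.sum(diff ** 2, axis=0)))

    # Toy example: two scalar fields on five grid points with equal area weights
    w = np.full(5, 0.2)
    rng = np.random.default_rng(1)
    ref = [rng.normal(size=5), rng.normal(size=5)]
    mod = [f + 0.2 for f in ref]
    print(integrated_vector_deviation(mod, ref, w))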

Statistical metrics
Uncentered statistics
Centered statistics
MVIE with a combination of multiple scalar and vector fields
Brief overview
Preparing the input data
Usage and workflow of the MVIETool
Varname
Inputdatadir
Application of the tool
Metrics table
VFE diagram
Summary