Abstract

Building simulation models that are valid and credible is an enduring challenge in the Australian Defence Organisation (ADO) context. Validity and credibility can be established through the rigorous use of appropriate Verification, Validation, and Accreditation (VVA) processes. Such processes are well known in modeling and simulation (M&S) practice; however, they are generally not applied within the ADO, typically because of resourcing constraints and a lack of authoritative guidance. Where VVA is applied, security concerns and commercial considerations mean that applications of M&S within the ADO are generally not published on open-access platforms. Depending on where in the M&S life-cycle VVA begins, it may also serve a secondary aim of risk reduction by assisting in the early discovery of problems or errors. This research reviews current VVA practices from the academic literature and recommends processes that are appropriate for application to combat simulation tools within the ADO context. A scoping review was conducted to gather insight into current VVA practice in the M&S community, and its results are presented by charting relevant characteristics of the selected references. The scoping review shows that executable validation of simulation results against referent data sourced from physical experiments is the most prevalent form of VVA, with referent data from comparative models being a common alternative. Furthermore, there is evident reliance on graphical comparison of data; this could be enhanced with objective data comparators, such as aggregate error measures or statistical techniques. Finally, there is a clear gap in VVA references from Australia, which could be addressed by propagating and reporting prevalent VVA practices within the ADO context.
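As an illustrative aside (not drawn from the paper), the objective data comparators recommended above could take a form like the following Python sketch, which compares simulation output with referent data using aggregate error measures (RMSE and MAE) and a two-sample Kolmogorov-Smirnov test; the function and variable names are hypothetical.

```python
"""Sketch of objective comparison of simulation output against referent data,
as an alternative to purely graphical comparison. Illustrative only."""
import numpy as np
from scipy import stats


def compare_to_referent(simulated, referent):
    """Return aggregate error measures and a distributional test result.

    `simulated` and `referent` are assumed to be 1-D arrays of the same
    response variable (e.g. attrition per run), paired run-for-run.
    """
    simulated = np.asarray(simulated, dtype=float)
    referent = np.asarray(referent, dtype=float)

    # Aggregate error measures (require paired samples of equal length).
    rmse = float(np.sqrt(np.mean((simulated - referent) ** 2)))
    mae = float(np.mean(np.abs(simulated - referent)))

    # Two-sample Kolmogorov-Smirnov test: assesses whether the two samples
    # plausibly come from the same underlying distribution.
    ks_stat, p_value = stats.ks_2samp(simulated, referent)

    return {"rmse": rmse, "mae": mae, "ks_stat": ks_stat, "p_value": p_value}


# Example usage with synthetic data standing in for simulation and referent runs.
rng = np.random.default_rng(0)
sim_runs = rng.normal(loc=10.0, scale=2.0, size=50)
ref_runs = rng.normal(loc=10.5, scale=2.0, size=50)
print(compare_to_referent(sim_runs, ref_runs))
```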
