Abstract

Test and evaluation (T&E) is the process of determining whether a product or system satisfies its performance specifications across its entire operating regime. The operating regime is typically defined by factors such as terrain type, sea state, altitude, weather conditions, and operating speed, and involves multiple performance metrics. With each test expensive to conduct and multiple factors and performance metrics under consideration, the design of a test and evaluation schedule is far from trivial. Design of experiments (DOE) remains the most prevalent approach for deriving test plans, although there is significant opportunity to improve on this practice through optimization. In this paper, we introduce a surrogate-assisted optimization approach that uncovers the performance envelope with a small number of tests. The approach relies on principles of decomposition to deal with multiple performance metrics and employs a bi-directional search along each reference vector to identify the best and worst performance simultaneously. To limit the number of tests, the search is guided by multiple surrogate models. At every iteration the approach delivers a test plan involving at most \(K_T\) tests, and the information acquired is used to generate future test plans. To evaluate the performance of the proposed approach, we introduce a set of scalable test functions with various Pareto front characteristics and objective-space bias. The performance of the approach is quantitatively assessed and compared with two popular DOE strategies, namely Latin Hypercube Sampling (LHS) and Full Factorial Design (FFD). We further demonstrate its practical use on a simulated catapult system.
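For readers unfamiliar with the two DOE baselines named above: LHS partitions each factor's range into as many equal-width strata as there are tests and draws one point per stratum, while FFD tests every combination of a discrete set of factor levels. The following sketch contrasts the two test plans for two normalized factors; it is an illustration only, not the paper's implementation, and the function names and factor setup are our own.

```python
import random
from itertools import product

def latin_hypercube(n_samples, n_factors, seed=0):
    """LHS plan: one sample per stratum of each factor, strata randomly paired."""
    rng = random.Random(seed)
    cols = []
    for _ in range(n_factors):
        # One draw inside each of n_samples equal-width strata of [0, 1).
        col = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(col)  # random pairing across factors
        cols.append(col)
    return [tuple(c[i] for c in cols) for i in range(n_samples)]

def full_factorial(levels_per_factor):
    """FFD plan: every combination of the supplied factor levels."""
    return list(product(*levels_per_factor))

# An 8-test LHS plan over two normalized factors (e.g. speed, sea state).
lhs_plan = latin_hypercube(8, 2)

# A 3x3 full factorial plan over the same factors at low/mid/high levels: 9 tests.
ffd_plan = full_factorial([[0.0, 0.5, 1.0], [0.0, 0.5, 1.0]])
```

Note the cost contrast that motivates the paper: an FFD plan grows exponentially with the number of factors (here \(3^2 = 9\) tests), whereas LHS fixes the test budget up front while still covering every stratum of every factor.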
