Abstract

A recent paper in the journal by Jones and Johnson1 describes the use of designed experiments with computer models for product design and development activities. In recent years we have seen a dramatic increase in the use of computer models for engineering design activities. Typical applications include computational fluid dynamics models to study turbulent flow, finite element analysis models to investigate the distribution of stresses in systems, and computer design tools for electrical circuits and components. Computer models are also widely used in weather forecasting and the study of hurricanes. These applications continue to grow, and the application environments are expanding. Researchers can use these models as substitutes for the physical systems that they represent because, if the models are properly calibrated and validated, experiments on the model can provide essentially the same information as experiments on the real physical system at a fraction of the time and cost. Many of these models are deterministic; that is, they are systems of complex equations without any random components; hence, many of the traditional principles of designing experiments, such as randomization, replication, and blocking, do not directly apply. In many cases the traditional low-order polynomial models that we typically associate with designed experiments are not appropriate either, because the underlying system is very complex. The Gaussian process model, a spatial correlation model, is often used to fit the data from a computer experiment. Space-filling designs such as the Latin hypercube are used instead of the familiar factorial and response surface designs. This is a dynamic area of research and development in the experimental design field; interaction between engineers, scientists, and experimental design experts will produce new methods and techniques. There is another class of computer models not discussed in the Jones and Johnson paper: stochastic simulations, that is, computer simulations driven by streams of random numbers that produce output responses that can be viewed as random variables. Discrete-event simulations, typical of this class, have been widely used since the 1960s to study factory scheduling problems, hospital operating room activities, and supply chain operations, to name just a few applications. They are also being used increasingly to study many types of transactional and service operations. These computer models also present challenges to the experimental designer. Often they are very high-fidelity models with many input factors; hence, traditional experimental approaches to factor screening and optimization may not be directly applicable. The output may be a time series or some type of functional response; hence, traditional low-order polynomials may not be appropriate. Sometimes these models include both controllable factors and uncontrollable or ‘noise’ factors; hence, some type of robust design or process robustness analysis may be required. Designing experiments for both types of computer model presents significant challenges. Quality and Reliability Engineering International welcomes papers, including applications and case studies, on all aspects of these problems, illustrating how computer models and designed experiments can be effectively deployed to solve a significant practical problem, as well as methodological papers on aspects of design.
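As a rough illustration of the workflow described above, the sketch below (an assumed example, not taken from the Jones and Johnson paper) generates a space-filling Latin hypercube design with SciPy, evaluates a toy two-factor deterministic "computer model" at each design point, and fits a Gaussian process surrogate with scikit-learn. The toy model, the number of runs, and the kernel settings are purely illustrative choices.

# Minimal sketch (assumed example): Latin hypercube design + Gaussian process surrogate.
import numpy as np
from scipy.stats import qmc
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def computer_model(x):
    # Stand-in for an expensive deterministic simulation (e.g. a CFD or FEA code).
    return np.sin(2.0 * np.pi * x[:, 0]) + 0.5 * x[:, 1] ** 2

# 20-run Latin hypercube design in two factors, on the unit cube [0, 1]^2.
design = qmc.LatinHypercube(d=2, seed=1).random(n=20)
response = computer_model(design)   # one run per design point; no randomization or replication

# Gaussian process (spatial correlation) model fitted to the simulation output.
kernel = ConstantKernel(1.0) * RBF(length_scale=[0.2, 0.2])
surrogate = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
surrogate.fit(design, response)

# The surrogate predicts the response, with uncertainty, at untried input settings.
x_new = np.array([[0.35, 0.70]])
mean, std = surrogate.predict(x_new, return_std=True)
print(f"prediction at {x_new[0]}: {mean[0]:.3f} +/- {std[0]:.3f}")

Because the simulator is deterministic, each design point is run only once; the space-filling design, rather than randomization or replication, does the work of covering the input region.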
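For the stochastic case, a correspondingly minimal sketch (again an assumed example) simulates the mean waiting time in a single-server M/M/1 queue using the Lindley recursion. Here the output is a random variable driven by the random number stream, so independent replications at each setting of the controllable service-rate factor are meaningful, in contrast to the deterministic model above.

# Minimal sketch (assumed example): a stochastic simulation where replication matters.
import numpy as np

def mean_wait_mm1(arrival_rate, service_rate, n_customers, rng):
    """Average time in queue for one replication of the simulation."""
    interarrival = rng.exponential(1.0 / arrival_rate, n_customers)
    service = rng.exponential(1.0 / service_rate, n_customers)
    waits = np.empty(n_customers)
    w = 0.0
    for i in range(n_customers):
        waits[i] = w
        # Lindley recursion: waiting time of the next customer in a single-server queue.
        w = max(0.0, w + service[i] - interarrival[i])
    return waits.mean()

# Controllable factor: service rate. Each setting is replicated 10 times with
# independent draws from the random number stream, so the response varies across replicates.
rng = np.random.default_rng(2024)
for mu in (1.2, 1.5, 2.0):
    reps = [mean_wait_mm1(1.0, mu, 5000, rng) for _ in range(10)]
    print(f"service rate {mu}: mean wait {np.mean(reps):.3f} "
          f"(replicate std {np.std(reps, ddof=1):.3f})")

With many such controllable and noise factors, the factor screening and robust design questions raised above arise naturally.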
