Abstract

A recent article in the journal by Ilzarbe et al. [1] provided a nice survey and summary of papers employing designed experiments in engineering and science between 2001 and 2005. In total, 77 case studies were presented from a variety of engineering fields that the authors classified as materials, mechanical engineering, industrial engineering, electrical/electronic engineering, energy, and other. The authors surveyed several core journals to find their case studies: The Journal of Applied Statistics, The Journal of Quality Technology, Quality and Reliability Engineering International, Quality Engineering, and Technometrics. A search was also made using the Web of Knowledge. The authors reported that about 58% of these papers were concerned with finding the important factors influencing the system, what we refer to as screening or characterization; 33% dealt with optimization and response surface methodology; and 9% involved robust parameter design. This is very consistent with my own experience as a consultant. Most engineers have little exposure to design of experiments (DOX) in their undergraduate academic training. It is typically part of another course that deals with several broad topics in engineering statistics, and DOX is only one item on the menu. About all that can be done is to introduce the student to the factorial concept and (maybe) two-level fractional factorials. Not all engineers are exposed to statistics (let alone DOX) at the undergraduate level, and this leads to problems when these tools are required in practice: either they don't know what type of experimental strategy their problem requires and select something inappropriate; or they select the correct strategy and apply it incorrectly; or they select the wrong strategy, in which case it probably doesn't matter whether they use it correctly or not; or they resort to the time-honored engineering approach of varying one factor at a time. 
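The factorial and fractional-factorial ideas mentioned above can be made concrete with a short sketch. This is my own illustration, not part of the original article (the function names are hypothetical): it builds the full 2^k design in the usual coded −1/+1 units, and a half-fraction via the defining relation I = +AB⋯K, i.e. keeping only the runs whose factor levels multiply to +1.

```python
import math
from itertools import product

def full_factorial(k):
    """All 2**k runs of a two-level full factorial in coded -1/+1 units."""
    return [list(run) for run in product([-1, 1], repeat=k)]

def half_fraction(k):
    """The 2**(k-1) half-fraction with defining relation I = +AB...K:
    keep only runs whose factor levels multiply to +1."""
    return [run for run in full_factorial(k) if math.prod(run) == 1]
```

For k = 3 this yields the familiar 8-run design and a 4-run half-fraction in which each main effect is aliased with a two-factor interaction.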
My experience says that industrial, electrical, and chemical engineers have the best statistical knowledge, while mechanical and many civil engineers have the worst. The mechanicals and civils often just don't get the idea of variability. All too often they think and live in a deterministic world, despite the evidence that surrounds them. Now I have to be careful here, because many civil engineers working in the areas of transportation and hydrology (for example) are quite knowledgeable about statistics, and use statistical modeling techniques very successfully. At the graduate level, things may be getting better. There is evidence that engineers in graduate programs get some exposure to DOX. For example, at Arizona State University our first-year graduate course in DOX has about 175 on-campus students enrolled (about 100 in the fall term, slightly fewer in the spring), and about 150 enrolled in the online version of the course (spread over three terms: fall, spring, and summer). These students come from electrical engineering, industrial engineering, materials science and engineering, bioengineering, civil engineering, and mechanical engineering, with the proportion of students from each field in approximately that order. These are pretty impressive numbers. My academic colleagues at other universities tell me that their DOX course enrollments are growing. However, because this is a pretty basic course appealing to a wide audience with uneven preparation in statistics, our course focuses on screening and characterization, mostly factorials and fractionals, and gives only overviews of response surfaces and robust design. Students who want to know more about those topics must take another course, and some do. Ilzarbe et al. noted that the number of applications papers increased steadily from 2001 to 2005. Better university education may be responsible for part of this. 
Another component of this is industrial education in DOX sponsored by the employer. This can take the form of an in-house course just for company engineers and scientists, or a public course taught by a university, a consultant, or a professional society. The useful knowledge imparted to participants in these courses varies considerably, depending on the instructor, the approach taken to the subject, how the material is taught, and the course/reference material provided. The increasing emphasis on six sigma has been a big factor in industrial education about DOX, because designed experiments play a very important role in the improve step of DMAIC and in design for six sigma. Many feel that DOX is the most important of the six sigma tools. Even though it is recognized as an important capability, many six-sigma-trained black belts and master black belts have a very poor understanding of DOX principles. I have always found Quality Engineering to be an excellent source of case studies and applications papers on DOX, and I'm surprised that Ilzarbe et al. didn't find more papers there. For example, the recent volume of Quality Engineering has three papers that I would characterize as DOX case studies: Weaver and Hamada [2], Das and Lee [3], and Kovach and Cho [4]. In general, why don't we find more DOX case studies in the core journals mentioned above? Generally, it's because to be published in a core journal a case study must have some new or novel feature; it can't just be a standard or routine application of statistical science. Such papers should appear in a discipline-oriented science or engineering journal. For example, the Weaver and Hamada paper dealt with analyzing a designed experiment with Bayesian methods, and the model fit was a logistic regression model. This was a very instructive paper, even for readers familiar with DOX and logistic regression.

Another very important area of DOX that was not discussed in much detail by Ilzarbe et al. is computer experiments. The use of computers in engineering design and analysis is expanding rapidly. Finite element models, computational fluid dynamics models, and electrical circuit design models are examples of an important class of deterministic computer models; that is, models for which the observed output response is the same if the values of the input parameters are held constant on repeated runs. This is an entirely different class of problem than the typical physical experiment, where the output response is a random variable. Traditional factorial and response surface designs are not always appropriate for these types of experiments, and in many cases low-order polynomial models do not adequately describe the response surface. There are also stochastic computer models, such as discrete-event simulation models used for the study of factory operations, queueing networks, and transactional operations. While the outputs of these models are stochastic, they are often not just a single number or even a vector of responses; they are often time series or functional data. There is much work to be done on developing methods for both the design and analysis of computer experiments. This is an area where applications of DOX in science and engineering will grow dramatically in the 21st century. Quality and Reliability Engineering International has papers forthcoming on these topics.
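One widely used alternative to the traditional factorial and response surface designs mentioned above, when the experiment is a deterministic computer model, is a space-filling design such as a Latin hypercube. The sketch below is my own illustration, not from the article (the function name is hypothetical): it places n points in the unit cube so that each input dimension is divided into n equal strata with exactly one point per stratum.

```python
import random

def latin_hypercube(n, k, seed=0):
    """n points in [0, 1)^k: each dimension is split into n equal
    strata, one point falls in each stratum, and the stratum order
    is independently permuted per dimension."""
    rng = random.Random(seed)
    columns = []
    for _ in range(k):
        strata = list(range(n))
        rng.shuffle(strata)                      # random stratum order
        columns.append([(s + rng.random()) / n   # jitter within stratum
                        for s in strata])
    return [list(point) for point in zip(*columns)]
```

Unlike a two-level factorial, whose run count grows as 2^k, this gives the experimenter direct control over the number of model runs while still spreading points evenly across each input's range.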
