Abstract

This paper reports the results of an analysis of wind tunnel data acquired in support of the Facility Analysis Verification and Operational Reliability (FAVOR) project. The analysis uses methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE). These methods quantify the total variance in a sample of wind tunnel data and partition it into explained and unexplained components. The unexplained component is further partitioned into random and systematic components. This analysis was performed on data acquired in similar wind tunnel tests executed in four different U.S. transonic facilities. The measurement environment of each facility was quantified and compared.

I. Introduction

This paper presents an analysis of data acquired in support of the Facility Analysis Verification and Operational Reliability (FAVOR) project, in which similarities and differences among four U.S. transonic wind tunnels were studied by executing nominally similar test matrices in each facility on the same test article, balance, and sting. The participating tunnels were the National Transonic Facility at Langley Research Center (LaRC), the 11-Ft Unitary Plan wind tunnel at Ames Research Center (ARC), the 16T wind tunnel at the Arnold Engineering Development Center (AEDC), and the 8x6-Foot Supersonic Wind Tunnel at Glenn Research Center (GRC). The test article was the AEDC 16T check standard model, a 5% scale model of an F-111. The stated objective of the FAVOR project was to compare test methods, techniques, and procedures, as well as data reduction methods, flow quality, and aerodynamic data acquired across the four facilities in nominally identical wind tunnel tests.

In support of this objective, the NASA Aeronautics Test Program Office requested an independent analysis of the FAVOR data featuring techniques that are commonly employed in formal experiment design applications. The specific request was for an analysis of the data that utilizes methods referred to collectively at Langley Research Center as the Modern Design of Experiments (MDOE) [1-4]. While the word "design" features prominently in the name of this experimental methodology, it actually consists of unified experiment design, execution, and analysis processes. The FAVOR tests were not designed or executed according to MDOE principles, but aspects of the MDOE analysis method can still be applied to the data.

The objective nature of MDOE analytical methods is especially attractive when the analysis could be influenced to some degree by subjective a priori expectations of the result. It is more difficult for such expectations to influence an analysis based on prescribed computations and quantitative inference rules, as an MDOE analysis is, than to influence a conventional analysis that may be more open to subjective interpretation. All parties applying the specific MDOE methods employed in the analysis reported in this document to the same sample of data will produce identical results.

MDOE partitions multiple factor effects through an analysis of variance (ANOVA), which will be demonstrated in this report using the FAVOR data. While FAVOR was executed as a conventional One Factor At a Time (OFAT) test, the ANOVA method of partitioning factor effects can still be illustrated using replicated polars, for which time serves as a hidden second variable, as sketched below.
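The following sketch illustrates the basic variance partition described above on synthetic data; it is not the authors' code, and the replicate schedule, polynomial degree, and noise levels are illustrative assumptions only. Two replicate polars of the same angle-of-attack schedule are fitted with a model in angle of attack alone, and the total sum of squares is split into the component explained by that model and the unexplained residual, which contains both random scatter and any systematic shift associated with the hidden time (replicate) variable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two replicate polars over the same angle-of-attack schedule (deg), acquired at
# different times; "cl" stands in for any force or moment response (synthetic data).
alpha = np.tile(np.arange(-4.0, 12.1, 2.0), 2)      # set (explained) variable
block = np.repeat([0, 1], alpha.size // 2)          # hidden variable: time / replicate
cl = 0.10 * alpha + 0.002 * block + rng.normal(0.0, 0.001, alpha.size)

# Explanatory model in alpha only (quadratic chosen purely for illustration).
coef = np.polyfit(alpha, cl, deg=2)
fit = np.polyval(coef, alpha)

ss_total = np.sum((cl - cl.mean()) ** 2)        # total variance, as a sum of squares
ss_explained = np.sum((fit - cl.mean()) ** 2)   # component captured by the model
ss_unexplained = np.sum((cl - fit) ** 2)        # residual: random + systematic parts

print(f"explained fraction of total SS:   {ss_explained / ss_total:.4f}")
print(f"unexplained fraction of total SS: {ss_unexplained / ss_total:.4f}")
```

Because the fit is an ordinary least-squares model with an intercept, the explained and unexplained sums of squares add up to the total, so the two printed fractions sum to one.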
Response changes that occur with time are especially relevant quality considerations and represent a key to improving facility performance, as will be discussed presently. Such changes can be responsible for a systematic component of the unexplained variance in a wind tunnel test that can dominate the more widely recognized random component. This systematic unexplained variance is attributable to factors that do not always reproduce precisely from facility to facility.
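A minimal sketch of this further partition is shown below, continuing the assumptions of the previous example; the function name and grouping are hypothetical and only illustrate the idea. The unexplained (residual) variance is split, as in a one-way ANOVA on the residuals, into a between-polar component that reflects systematic, time-dependent shifts of whole polars and a within-polar component that reflects ordinary random scatter.

```python
import numpy as np

def partition_unexplained(residuals, block):
    """Split the residual sum of squares into between-polar (systematic) and
    within-polar (random) components, grouping residuals by replicate polar."""
    residuals = np.asarray(residuals, dtype=float)
    block = np.asarray(block)
    grand_mean = residuals.mean()
    ss_between = 0.0    # whole-polar shifts relative to one another (systematic)
    ss_within = 0.0     # scatter about each polar's own mean level (random)
    for b in np.unique(block):
        r = residuals[block == b]
        ss_between += r.size * (r.mean() - grand_mean) ** 2
        ss_within += np.sum((r - r.mean()) ** 2)
    return ss_between, ss_within

# Continuing the previous sketch, where resid = cl - fit:
# ss_systematic, ss_random = partition_unexplained(cl - fit, block)
```

If the systematic component is comparable to or larger than the random component, within-polar scatter alone understates the true run-to-run (and facility-to-facility) variability.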
