Abstract

Since April 1989, when I became the director of the Ada Joint Program Office (AJPO), many vendors have expressed their concerns to me regarding how their compilers are evaluated and how to stabilize the ACVC (Ada Compiler Validation Capability) test suite. While I have some strong opinions about the stability of the ACVC test suite, especially with the onset of Ada 9X, I will limit this position statement to evaluation issues. Besides the DoD, other federal agencies, such as NASA and the FAA, have become increasingly interested in compiler evaluation technology.

Several issues share the forefront today, but the one that seems most controversial is whether the DoD will mandate compiler evaluation as it does conformance testing. As most of you know, in order to be validated by the Ada Joint Program Office, all Ada compilers must pass the ACVC test suite, which is designed to ensure that they conform to the Ada Language Reference Manual (MIL-STD-1815A). Will the DoD set up the same kind of system for evaluating compilers? Or will it continue to leave the decision to evaluate compilers to the individual program offices? Assuming some form of evaluation testing becomes either mandatory or at least more widely used throughout the community, what format and mechanism will be used for conducting the tests?

The AJPO's current policy-related choices are as follows:

I. a centralized, government-run and government-controlled facility for evaluation testing, somewhat similar to conformance testing with the ACVC; or

II. a system whereby vendors have "public access" to an evaluation tool and conduct their own evaluations, reporting the results to the respective government program offices. This system would allow government representatives to randomly check vendors' results.

Even though Ada was designed with portability and uniformity in mind, numerous application-dependent criteria have made the scope of Ada compilation systems very wide indeed, ranging from compilation systems available for less than one thousand dollars to those costing well over one hundred thousand dollars. While the costs of a thorough compiler evaluation can be considerable, the costs of an inadequate evaluation can, by comparison, be astronomical. The program office incurs costs both in the disastrous effects on the system's development and in the ultimate risk that the inadequate compiler will be used in production versions of the weapon system. Furthermore, an inefficient runtime system can be particularly difficult to work with and imposes unnecessarily high support costs. Ultimately, the test suite should be broad enough to address not only portability features, as most evaluation tools do now, but also an Ada compiler's real-time performance.

For most embedded weapon systems, program developers must adapt target-resident software modules and provide access to an on-board timer and serial controller to support downloading and the appropriate host-target I/O. This usually requires test software, and possibly analysis software, in order to glean the metrics necessary for an effective evaluation (a sketch of one such timing test appears at the end of this statement).

Regardless of which mechanism for conducting evaluation testing is chosen, I or II above, a program office must invest a considerable amount of time and staff in planning how to conduct an evaluation. One of the thorniest decisions is when in the system's life cycle to conduct an evaluation. Evaluating the compiler too early could yield useless results if the program office later changes the weapon system design. Conversely, waiting too long to complete an evaluation limits the office's flexibility: if it is considering competing compilers, it will obviously want the most complete and appropriate compiler evaluation data before making its selection.

In conclusion, the decision to choose alternative I or II will depend on a number of non-technical factors:

- the timeliness with which evaluations can be completed,
- the expected cost of evaluations under each scheme,
- the quality control achievable under either alternative, and
- the qualifications of the testers and their familiarity with the end application.
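To make the real-time measurements discussed above concrete, here is a minimal sketch of the dual-loop timing technique used by Ada benchmark suites such as PIWG: the cost of an empty control loop is subtracted from the cost of the same loop performing one task rendezvous per iteration. The program name, the iteration count, and the choice of rendezvous as the measured feature are illustrative assumptions on my part, not part of any mandated evaluation suite.

    with Ada.Text_IO;
    with Ada.Calendar; use Ada.Calendar;

    procedure Rendezvous_Benchmark is

       Iterations : constant := 10_000;  --  assumed count; tune per target

       task Server is
          entry Ping;
       end Server;

       task body Server is
       begin
          loop
             select
                accept Ping;   --  the feature under measurement
             or
                terminate;     --  exit when the main program completes
             end select;
          end loop;
       end Server;

       Start_T, Stop_T          : Time;
       Loop_Overhead, With_Call : Duration;

    begin
       --  Control loop: measures loop overhead alone.  (An optimizing
       --  compiler may eliminate an empty loop; production suites such
       --  as PIWG take steps to defeat this.)
       Start_T := Clock;
       for I in 1 .. Iterations loop
          null;
       end loop;
       Stop_T := Clock;
       Loop_Overhead := Stop_T - Start_T;

       --  Same loop with one rendezvous per iteration.
       Start_T := Clock;
       for I in 1 .. Iterations loop
          Server.Ping;
       end loop;
       Stop_T := Clock;
       With_Call := Stop_T - Start_T;

       --  Report the per-rendezvous cost, net of loop overhead.
       Ada.Text_IO.Put_Line
         ("Mean rendezvous time:"
          & Duration'Image ((With_Call - Loop_Overhead) / Iterations)
          & " sec");
    end Rendezvous_Benchmark;

On a bare embedded target, Clock would typically be backed by the on-board timer mentioned above, and the printed result would instead be shipped over the serial link to host-side analysis software for collection of the evaluation metrics.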
