The IEEE High Level Design Validation and Test (HLDVT) Workshop was originally established to act as a focal point for research in verification and simulation. This special section on simulation-based design validation is based on the best papers in the Proceedings of HLDVT 2005. The focus areas of HLDVT are well represented by the papers in this special section, which span formal verification and simulation-based functional verification research.

Several papers in this special section focus on functional test generation, the process of generating test sequences for use during simulation. Boolean satisfiability (SAT) is a fundamental solving technique applied to a wide range of design automation problems, including automatic test generation and model checking. An extension to SAT-solving theory is presented in “B-Cubing: New Possibilities for Efficient SAT-Solving” by Domagoj Babić, Jesse Bingham, and Alan J. Hu, which generalizes the supercubing framework presented in earlier work. The paper proves the correctness of the proposed B-cubing technique using a new construct created for that purpose, the obligation-certification tree.

“A New Simulation-Based Property Checking Algorithm Based on Partitioned Alternative Search Space Traversal” by Qingwei Wu and Michael S. Hsiao describes a test generation tool that searches for violations of safety properties. A genetic algorithm generates test sequences, working together with a Boolean constraint propagation engine that reduces the search space.

“Simulation-Based Functional Test Generation for Embedded Processors” by Charles H.-P. Wen, Li-C. Wang, and Kwang-Ting Cheng presents a functional test generation technique that uses simulation results to reduce the complexity of the search process. When test generation is performed for a component under test, the justification process must involve many additional surrounding components that participate in fault activation and propagation. The technique presented in this paper simulates the surrounding components and uses machine learning to generate a “functional mapping” that represents the inverse function of the component and is used during justification. The functional mapping is approximate, but it enables an efficient solution to the justification problem.

In the paper “Harnessing Machine Learning to Improve the Success Rate of Stimuli Generation” by Shai Fine, Ari Freund, Itai Jaeger, Yishay Mansour, Yehuda Naveh, and Avi Ziv, the authors apply learning theory to establish a relationship between the start state of the search process and the success rate of test generation. The choice of start state is important because it has a strong impact on search complexity.

Once a design error has been revealed by simulation, there remains the task of identifying the root cause of the error by examining the error trace. In the paper “An Optimum Algorithm for Compacting Error Traces for Efficient Design Error Debugging” by Chian-Chih Yen and Jing-Yang Jou, a technique is presented to reduce the length of an existing error trace in order to make it easier to understand and examine. Two heuristics are presented that modify the detecting input sequence so that the same error-detecting state is reached with fewer state machine transitions.

The functional verification process often depends on coverage metrics to evaluate the completeness of the verification effort. Coverage metrics define a set of coverage events whose occurrence during testing is required to reveal design errors.
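To make the notion of cross-product coverage events concrete before turning to the paper that studies them, the short Python sketch below builds the event space as the cross-product of a few hypothetical attribute domains and records which events a simulation has hit. The attribute names and the record_event helper are assumptions made for illustration, not part of any of the papers' tooling.

    from itertools import product

    # Hypothetical attribute domains; a real coverage model would derive
    # these from the design and the test plan.
    opcodes = ["add", "load", "store", "branch"]
    stages = ["fetch", "decode", "execute"]
    exceptions = [False, True]

    # Every combination of attribute values is one cross-product coverage event.
    all_events = set(product(opcodes, stages, exceptions))
    covered = set()

    def record_event(opcode, stage, exception):
        # Called from the simulation monitor whenever an event is observed.
        covered.add((opcode, stage, exception))

    # A few observations gathered during simulated runs (illustrative only).
    record_event("add", "execute", False)
    record_event("load", "decode", True)

    holes = all_events - covered
    print("coverage: %d/%d events, %d holes" % (len(covered), len(all_events), len(holes)))

Even this toy model shows why cross-product spaces grow quickly and why analysis techniques for finding uncovered “holes” are valuable.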
The paper “Advanced Analysis Techniques for Cross-Product Coverage” by Hezi Azatchi, Laurent Fournier, Eitan Marcus, Shmuel Ur, Avi Ziv, and Keren Zohar evaluates the use of cross-product coverage metrics, which monitor the occurrence of events defined by the cross-product of two or more independent sets of coverage events. The paper explores the relationship between a variety of cross-product coverage techniques and the quality of verification.

Two of the papers in the special section address the verification of essential but nontraditional aspects of design correctness. “Multilevel Design Validation in a Secure Embedded System” by Patrick Schaumont, David Hwang, Shenglin Yang, and Ingrid Verbauwhede describes a simulation-based methodology to validate the security properties of an embedded system. The approach seeks to identify leaks of critical information across multiple abstraction levels, from the software algorithms down to the microarchitecture. A cycle-accurate cosimulation platform is presented which supports the validation process.

“Validating Families of Latency Insensitive Protocols” by Syed Suhaib, Deepak Mathaikutty, David Berner, and Sandeep Shukla presents both simulation-based and formal techniques to validate system correctness with respect to latency. The difficulty of transmitting a low-skew clock across an entire chip has motivated the development of latency insensitive protocols, which enable asynchronous communication. This paper presents techniques to verify that a latency insensitive system is “latency equivalent” to a completely synchronous version of the system.
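As a rough illustration of what “latency equivalent” means, and not of the authors' formal framework, the following sketch compares an output trace from a latency insensitive implementation against a synchronous reference by discarding the stall (“bubble”) tokens that such protocols insert. The TAU marker and the trace values are assumptions made for this example.

    # Stand-in for the stall/bubble token inserted by the protocol (an
    # assumption for this sketch; real protocols tag channels with a
    # valid/stall signal rather than a special value).
    TAU = object()

    def informative(trace):
        # Keep only the informative values, discarding stall tokens.
        return [v for v in trace if v is not TAU]

    def latency_equivalent(li_trace, sync_trace):
        # Two traces are latency equivalent if they carry the same informative
        # values in the same order, regardless of when those values arrive.
        return informative(li_trace) == informative(sync_trace)

    # Illustrative traces: the latency insensitive channel interleaves stalls.
    sync_trace = [1, 4, 9, 16]
    li_trace = [1, TAU, 4, TAU, TAU, 9, 16]
    assert latency_equivalent(li_trace, sync_trace)

The point of the comparison is that only the order of informative values matters, not the cycles on which they appear, which is exactly the freedom a latency insensitive protocol exploits.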