Abstract

Testing FPGA-based soft processor cores requires a methodology completely different from that used for standard processors. The stuck-at fault model is insufficient because the logic is implemented by lookup tables (LUTs) in the FPGA, and this SRAM-based LUT memory is vulnerable to single-event upsets (SEUs) caused mainly by cosmic radiation. Consequently, in this paper, we used combined SEU-induced and stuck-at fault models to simulate every possible fault. The test program, written in assembly language, was based on the bijective property. Furthermore, the fault detection matrix, which describes the detectability of every fault by every test vector, was determined. The major novelty of this paper is the optimal reduction in the number of required test vectors such that fault coverage is not reduced. This paper also studied the optimal selection of test vectors when a fault coverage of 95% of the maximum is acceptable; in that case, only three test vectors are required. Finally, local and global test vector selection are also described.
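The test vector reduction described above is an instance of the set cover problem: each test vector detects a subset of faults (one row of the fault detection matrix), and the goal is to choose the fewest vectors that together cover all, or 95% of, the detectable faults. The paper's own selection procedure is not reproduced here; the sketch below is a minimal greedy illustration, assuming a boolean matrix D in which D[v][f] is true when vector v detects fault f (all names are hypothetical).

```python
import math

def select_vectors(D, target_coverage=1.0):
    """Greedy set-cover sketch: pick test vectors until the requested
    fraction of detectable faults is covered. Not the paper's algorithm."""
    num_vectors, num_faults = len(D), len(D[0])
    detectable = {f for f in range(num_faults)
                  if any(D[v][f] for v in range(num_vectors))}
    if not detectable:
        return [], 1.0
    required = math.ceil(target_coverage * len(detectable))

    covered, selected = set(), []
    while len(covered) < required:
        # Choose the vector that detects the most still-uncovered faults.
        best = max(range(num_vectors),
                   key=lambda v: sum(D[v][f] for f in detectable - covered))
        gain = {f for f in detectable - covered if D[best][f]}
        if not gain:
            break  # no vector detects the remaining faults
        selected.append(best)
        covered |= gain
    return selected, len(covered) / len(detectable)

# Toy example: 4 candidate vectors, 5 faults.
D = [
    [True,  True,  False, False, False],
    [False, True,  True,  False, False],
    [False, False, False, True,  True ],
    [True,  False, False, False, True ],
]
print(select_vectors(D, target_coverage=0.95))  # -> ([0, 2, 1], 1.0)
```

The greedy heuristic does not guarantee the minimum number of vectors (set cover is NP-hard), but it gives a logarithmic approximation and is a common baseline for this kind of reduction.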

Highlights

  • FPGA-based applications usually include a large number of processor cores, implemented as real-time microcontrollers or as application processors

  • The most important novelty introduced here is a different model of injected faults. This model differs significantly from the conventional stuck-at models widely used for testing processors/microcontrollers implemented on ASIC/embedded platforms: a single-event upset (SEU)-induced fault affects the configuration of logic elements implemented by lookup tables (LUTs), so the implemented logic function is arbitrarily changed, as described in detail in Section 2.2 on fault modeling and injection (a minimal illustrative sketch follows this list)

  • Testing combined with readback or triple modular redundancy (TMR) might still be a good solution
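To make the difference between the two fault models concrete, the following minimal sketch (illustrative only, not the paper's injection mechanism) models a 4-input LUT as a 16-bit truth table: a stuck-at fault forces one input line to a constant, while an SEU flips one SRAM configuration bit and thereby replaces the intended logic function with a different, arbitrary one.

```python
def lut_eval(truth_table, inputs):
    """Evaluate a 4-input LUT: truth_table is a 16-bit integer,
    inputs is a tuple of four 0/1 values (LSB first)."""
    index = inputs[0] | (inputs[1] << 1) | (inputs[2] << 2) | (inputs[3] << 3)
    return (truth_table >> index) & 1

AND4 = 0x8000  # 4-input AND: only input index 15 (all ones) yields 1

def with_stuck_at(inputs, line, value):
    """Stuck-at fault: an input line is forced to a constant value."""
    faulty = list(inputs)
    faulty[line] = value
    return tuple(faulty)

def with_seu(truth_table, bit):
    """SEU-induced fault: one SRAM configuration bit of the LUT flips,
    changing the implemented logic function itself."""
    return truth_table ^ (1 << bit)

all_ones = (1, 1, 1, 1)
print(lut_eval(AND4, all_ones))                        # fault-free AND: 1
print(lut_eval(AND4, with_stuck_at(all_ones, 0, 0)))   # stuck-at-0 on line 0: 0
print(lut_eval(with_seu(AND4, 15), all_ones))          # SEU flips bit 15: 0
```

Under this model, a single SEU changes the LUT output for exactly one input combination, so a test vector can detect it only if it drives that specific combination onto the affected LUT.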

Summary

Introduction

FPGA-based applications usually include a large number of processor cores, implemented as real-time microcontrollers or as application processors. To test such cores, random-based testing methodologies are mainly proposed; these require a huge number of test vectors, advanced optimization algorithms, and FPGA resources for their implementation. For this reason, methods of test vector compression have been developed to save the memory resources needed to store them [1]. The pseudo-random test-pattern generators proposed in the literature are often realized as linear feedback shift registers (LFSRs). Test methods based on coverage analysis are called coverage-driven verification (CDV). The application of genetic algorithm (GA)-based automated generation of stimuli from the source code of a specific software application was presented in [9]. In this approach, only the processor hardware resources utilized by the verified application are taken into consideration.
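As context for the generator mentioned above, the following is a minimal Fibonacci LFSR sketch; the 8-bit width and the tap polynomial x^8 + x^6 + x^5 + x^4 + 1 are illustrative choices, not taken from the cited works.

```python
def lfsr_patterns(seed=0x5A, count=8):
    """Minimal 8-bit Fibonacci LFSR as a pseudo-random test-pattern
    generator. Taps at bits 7, 5, 4, 3 implement the maximal-length
    polynomial x^8 + x^6 + x^5 + x^4 + 1 (illustrative choice)."""
    state = seed & 0xFF
    for _ in range(count):
        yield state
        # XOR of the tap bits becomes the new low bit on each shift.
        feedback = ((state >> 7) ^ (state >> 5) ^ (state >> 4) ^ (state >> 3)) & 1
        state = ((state << 1) | feedback) & 0xFF

for pattern in lfsr_patterns():
    print(f"{pattern:08b}")
```

A hardware LFSR of this kind costs only a few flip-flops and XOR gates, which is why it is attractive as an on-chip pattern generator despite the large number of vectors it produces.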

Bijective Test Program
Refinements to Achieve Full Bijectivity
Comparison of Results
Fault Modeling
Optimal Reduction in Test Vectors
Further Reduction in Test Vector Number
Local Test Vectors
Configuration Readback
Conclusions