Abstract
When designing a system at the behavioral level, one of the most important steps is verifying its functionality before it is released to the logic/physical design phase. In industry, behavioral models often serve as oracles against which the final chip is tested once it is produced. In this work, we use branch coverage as a measure of the quality of verifying/testing behavioral models. The proposed stopping rule minimizes the effort required to reach a given quality level: it guides the process to switch to a different testing strategy, e.g., a different type of pattern (random vs. functional) or a different set of parameters for generating patterns/test cases, whenever the current strategy is not expected to increase coverage. We demonstrate the stopping rule on two complex behavioral-level VHDL models tested for branch coverage across four testing phases. We compare the number of applied test patterns and the quality of testing with and without the stopping rule, and show that switching phases at the points indicated by the rule yields the same or even better coverage with fewer test patterns.
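The abstract does not give the exact form of the stopping rule, so the sketch below is only a minimal, hypothetical illustration of one way such a rule could be realized: stop the current testing phase once a window of consecutive patterns yields no new branch coverage. The names `run_phase`, `simulate`, and `window` are assumptions introduced here for illustration, not identifiers from the paper.

```python
# Hypothetical sketch of a coverage-plateau stopping rule (not necessarily the
# paper's actual criterion): stop the current phase when the last `window`
# patterns have produced no new branch coverage, then switch strategy.

from typing import Callable, Iterable, Set


def run_phase(pattern_stream: Iterable[object],
              covered: Set[str],
              simulate: Callable[[object], Set[str]],
              window: int = 200) -> int:
    """Apply patterns from one phase until coverage stops growing.

    pattern_stream: generator of test patterns for this phase (random,
                    functional, or re-parameterized) -- assumed input.
    covered:        set of branch IDs hit so far (updated in place).
    simulate:       assumed helper that runs one pattern against the
                    behavioral model and returns the branch IDs it exercised.
    window:         consecutive non-improving patterns tolerated before
                    the rule fires and the phase is stopped.

    Returns the number of patterns applied in this phase.
    """
    applied = 0
    stale = 0  # patterns applied since the last new branch was covered
    for pattern in pattern_stream:
        applied += 1
        new_branches = simulate(pattern) - covered
        if new_branches:
            covered |= new_branches
            stale = 0
        else:
            stale += 1
        if stale >= window:
            break  # coverage has plateaued; switch to the next phase
    return applied
```

A driver loop would call `run_phase` once per phase (e.g., random patterns first, then functional ones, then variants with different generation parameters), accumulating the `covered` set across phases so that total branch coverage and total pattern count can be compared against running each phase to exhaustion.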