Abstract

Various methods of measuring the dielectric integrity of gate oxide have been developed over time to qualify wafer fabrication foundries. Test durations range from seconds to hours depending on the conditions applied, and there is a view that the longer the test duration, the more accurate the reliability prediction. The cost and delays associated with the lengthy package-level TDDB (time-dependent dielectric breakdown) test have led to it being applied less often and replaced by wafer-level tests. This paper evaluates two wafer-level test methods: ramped Q_BD testing applying a fixed initial current (as generally used in the production environment), and ramped Q_BD testing applying a fixed initial current density (more commonly reported in the literature). These tests are compared to the package-level constant-voltage TDDB test. The paper provides a thorough investigation of the oxide area dependency of both the Q_BD and TDDB tests, and also investigates the potential correlation between the two. Finally, these correlations are used to implement on-line control equations that manufacturing can use to quickly compare Q_BD results against required product lifetimes.
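The oxide area dependency studied here is conventionally described by weakest-link (Weibull) statistics, under which the characteristic charge-to-breakdown shifts with tested area. The sketch below illustrates that standard scaling relation only; it is not the paper's on-line control equations, which are not reproduced in the abstract, and the function name and parameter values are assumptions for illustration.

```python
def scale_characteristic_qbd(eta_ref, beta, area_ref, area_new):
    """Weakest-link (Weibull) area scaling of the characteristic
    charge-to-breakdown eta (63.2% point).

    eta_ref  : characteristic Q_BD measured on a structure of area_ref
    beta     : Weibull shape parameter (slope) of the Q_BD distribution
    area_new : area of the structure we want to predict for

    Under weakest-link statistics, eta_new = eta_ref * (area_ref/area_new)**(1/beta),
    so larger areas give a lower characteristic Q_BD.
    """
    return eta_ref * (area_ref / area_new) ** (1.0 / beta)


# Example: a 10x larger capacitor with beta = 2 halves eta by sqrt(10) ~ 3.16
eta_large = scale_characteristic_qbd(eta_ref=10.0, beta=2.0,
                                     area_ref=1.0, area_new=10.0)
```

The same weakest-link form is commonly applied to TDDB times-to-failure, which is one reason area dependency must be characterised before Q_BD and TDDB results measured on different structure sizes can be correlated.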
