Abstract

In lithography, optical proximity effects and process bias need to be corrected to achieve the best wafer print. Efforts to correct for these effects started with a simple bias, adding hammerheads to line ends to prevent line-end shortening. This first-generation correction was called rule-based optical proximity correction (OPC). Then, as chip feature sizes continued to shrink, OPC became more complicated and evolved into a model-based approach. Extra patterns were also added to masks to improve the wafer process window, a measure of resilience to manufacturing variation. Around this time, the concept of inverse lithography technology (ILT), a mathematically rigorous inverse approach that determines the mask shapes that will produce the desired on-wafer results, was introduced. ILT has been explored and developed over the last three decades as the next generation of OPC, promising a solution to several challenges of advanced-node lithography, whether optical or extreme ultraviolet (EUV). Today, both OPC and ILT are part of an arsenal of lithography technologies called resolution enhancement technologies. Since OPC and ILT both involve computation, they are also considered part of computational lithography. We explore the background and history of ILT and detail the significant milestones that have taken full-chip ILT from an academic concept to a practical production reality.
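In its most common formulation, ILT poses mask synthesis as an optimization problem: given a forward lithography model that predicts the wafer image produced by a candidate mask, find the mask whose simulated print best matches the target layout. A minimal sketch of this inverse formulation, with $\mathcal{F}$ (forward model), $m$ (mask function), and $z^{*}$ (target layout) as generic placeholders rather than notation from this paper:

$$\hat{m} \;=\; \arg\min_{m}\; \big\lVert \mathcal{F}(m) - z^{*} \big\rVert^{2}$$

In practice, such objectives are typically minimized with gradient-based methods, with regularization terms added so that the resulting curvilinear mask shapes remain manufacturable.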
