Abstract
Learning-based presimulation (i.e., layout-to-fabrication) models have been proposed to predict the fabrication-induced shape deformation from an IC layout to its fabricated circuit. Such models are usually driven by pairwise learning, involving a training set of layout patterns and their reference shape images after fabrication. However, it is expensive and time-consuming to collect the reference shape images of all layout clips for model training and updating. To address this problem, we propose a deep-learning-based layout novelty detection scheme to identify novel (unseen) layout patterns that cannot be well predicted by a pretrained presimulation model. We devise a global–local novelty scoring mechanism to assess the potential novelty of a layout by exploiting two subnetworks: 1) an autoencoder and 2) a pretrained presimulation model. The former characterizes the global structural dissimilarity between a given layout and the training samples, whereas the latter extracts a latent code representing the fabrication-induced local deformation. By integrating the global dissimilarity with the local deformation, boosted by a self-attention mechanism, our model can accurately detect novelties without the ground-truth circuit shapes of test samples. Based on the detected novelties, we further propose two active-learning strategies to sample a reduced number of representative layouts that are most worth fabricating to acquire their ground-truth circuit shapes. Experimental results demonstrate: 1) the effectiveness of our layout novelty detection algorithm and 2) the ability of our active-learning strategies to select representative novel layouts for keeping a learning-based presimulation model updated.
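To make the global–local scoring idea concrete, the following PyTorch-style sketch shows one plausible way to combine an autoencoder's reconstruction error (global structural dissimilarity) with an attention-weighted summary of a presimulation model's latent code (local deformation). It is not the authors' implementation: all module and function names (`LayoutAutoencoder`, `score_novelty`, `presim_encoder`), the network sizes, and the simple additive fusion of the two terms are assumptions made for illustration.

```python
# Minimal sketch of global-local novelty scoring, assuming small toy networks.
import torch
import torch.nn as nn


class LayoutAutoencoder(nn.Module):
    """Convolutional autoencoder trained on known layout clips; its
    reconstruction error acts as the global structural-dissimilarity signal."""

    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(16, 1, 4, stride=2, padding=1), nn.Sigmoid(),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


def score_novelty(layout, autoencoder, presim_encoder, attention):
    """Combine global reconstruction error with an attention-weighted
    summary of the presimulation model's latent code (local deformation)."""
    with torch.no_grad():
        # Global term: how poorly the autoencoder, fit to the training
        # distribution, reconstructs this layout clip.
        recon = autoencoder(layout)
        global_score = torch.mean((recon - layout) ** 2, dim=(1, 2, 3))

        # Local term: latent code from the (frozen) pretrained presimulation
        # model, re-weighted by self-attention and reduced to a scalar.
        z = presim_encoder(layout)                      # (B, C, H, W)
        tokens = z.flatten(2).transpose(1, 2)           # (B, H*W, C)
        attended, _ = attention(tokens, tokens, tokens)
        local_score = attended.abs().mean(dim=(1, 2))

    # How the two terms are weighted/combined is an assumption here.
    return global_score + local_score


if __name__ == "__main__":
    layout = torch.rand(4, 1, 64, 64)                   # toy layout clips
    ae = LayoutAutoencoder()
    presim_encoder = nn.Sequential(                      # stand-in for the
        nn.Conv2d(1, 32, 3, stride=2, padding=1),        # pretrained model's
        nn.ReLU(),                                       # encoder stage
    )
    attn = nn.MultiheadAttention(embed_dim=32, num_heads=4, batch_first=True)
    print(score_novelty(layout, ae, presim_encoder, attn))
```

In such a scheme, layouts whose combined score exceeds a chosen threshold would be flagged as novel, and an active-learning step would then pick a representative subset of those flagged layouts for fabrication and labeling.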