Abstract

Fabrication process variations are a major source of yield degradation in the nanoscale design of integrated circuits (ICs), microelectromechanical systems (MEMS), and photonic circuits. Stochastic spectral methods are a promising technique for quantifying the uncertainties caused by process variations. Despite their superior efficiency over Monte Carlo in many design cases, stochastic spectral methods suffer from the curse of dimensionality, i.e., their computational cost grows rapidly as the number of random parameters increases. To address this challenge, this paper presents a high-dimensional uncertainty quantification algorithm from a big data perspective. Specifically, we show that the huge number of simulation samples required by standard stochastic collocation (e.g., $1.5 \times 10^{27}$) can be reduced to a very small number (e.g., 500) by exploiting hidden structures of a high-dimensional data array. This idea is formulated as a tensor recovery problem with sparse and low-rank constraints, and it is solved with an alternating minimization approach. The numerical results show that our approach can efficiently simulate some IC, MEMS, and photonic problems with over 50 independent random parameters, whereas traditional algorithms can handle only a small number of random parameters.
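To give a flavor of the tensor recovery step, the following is a minimal sketch, assuming a plain CP (canonical polyadic) low-rank model fitted to a small set of observed entries by alternating least squares. The function `recover_cp_tensor`, its parameters, and the regularization are illustrative only and are not taken from the paper, whose formulation additionally imposes sparsity constraints on the recovered coefficients.

```python
import numpy as np

def recover_cp_tensor(obs_idx, obs_vals, shape, rank=3, n_iters=50, reg=1e-3, seed=0):
    """Recover a low-rank (CP) tensor from a few observed entries by
    alternating least squares.
    obs_idx : (n_obs, d) integer array of multi-indices of observed entries
    obs_vals: (n_obs,)   observed values (e.g., a few simulation samples)
    shape   : tuple of mode sizes of the full tensor
    Returns the list of factor matrices U[k] of size (shape[k], rank)."""
    rng = np.random.default_rng(seed)
    d = len(shape)
    # Random initialization of the d factor matrices.
    U = [rng.standard_normal((n, rank)) for n in shape]

    for _ in range(n_iters):
        for k in range(d):            # alternating minimization: one factor at a time
            # For each observed entry, the "design row" is the elementwise
            # product of the other factors' rows at that multi-index.
            A = np.ones((len(obs_vals), rank))
            for j in range(d):
                if j != k:
                    A *= U[j][obs_idx[:, j], :]
            # Solve a small regularized least-squares problem per slice of mode k.
            for i in range(shape[k]):
                rows = obs_idx[:, k] == i
                if not rows.any():
                    continue
                Ai, yi = A[rows], obs_vals[rows]
                U[k][i] = np.linalg.solve(Ai.T @ Ai + reg * np.eye(rank), Ai.T @ yi)
    return U
```

The appeal of alternating minimization here is that each subproblem is a small linear least-squares solve, so the cost scales with the number of observed samples and the chosen rank rather than with the exponentially large number of tensor entries.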
