Abstract

Machine learning-assisted global optimization methods for speeding up analog integrated circuit sizing are attracting much attention. However, most relevant research considers only a few typical analog integrated circuit design specifications. When the complete set of specifications is considered, two main challenges remain: 1) the prediction error for some performances may be large, and errors accumulate across many performances; this may mislead the optimization and cause the sizing to fail, especially when the specifications are stringent; and 2) the machine learning cost can be high given the number of specifications, considerably canceling out the time saved. A new method, called the efficient surrogate model-assisted sizing method for high-performance analog building blocks (ESSAB), is proposed in this article to address the above challenges. The key innovations include a new candidate design ranking method and a new artificial neural network (ANN) model construction method for analog circuit performance. Experiments on two amplifiers and a comparator with a complete set of stringent design specifications demonstrate the advantages of ESSAB.
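To give a flavor of the general setting, the sketch below shows a generic surrogate model-assisted sizing loop in which an ANN regressor is trained on simulated designs and used to rank candidate design points. This is an illustrative assumption-based example only, not the ESSAB algorithm: the toy objective, the single figure of merit, the design-space dimensions, and the simple best-prediction ranking are all placeholders standing in for ESSAB's actual ranking and model construction methods.

```python
# Illustrative sketch of surrogate model-assisted sizing (NOT the ESSAB
# algorithm). The toy "simulate" function, variable names, and dimensions
# are assumptions made purely for illustration.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

def simulate(x):
    """Stand-in for a circuit simulation: maps design variables
    (e.g., transistor widths/lengths) to a single figure of merit."""
    return -np.sum((x - 0.3) ** 2, axis=-1)  # toy objective, higher is better

# Initial designs sampled from a normalized design space (6 hypothetical variables)
X = rng.uniform(0.0, 1.0, size=(40, 6))
y = simulate(X)

for it in range(10):
    # Train an ANN surrogate on all simulated designs gathered so far
    surrogate = MLPRegressor(hidden_layer_sizes=(64, 64),
                             max_iter=2000, random_state=0).fit(X, y)

    # Generate candidate designs and rank them by predicted performance.
    # ESSAB uses its own ranking rule over a full set of specifications;
    # here we simply pick the candidate with the best single prediction.
    candidates = rng.uniform(0.0, 1.0, size=(500, 6))
    best = candidates[np.argmax(surrogate.predict(candidates))]

    # Simulate only the top-ranked candidate and add it to the training data
    X = np.vstack([X, best])
    y = np.append(y, simulate(best[None, :]))

print("Best simulated figure of merit:", y.max())
```

The point of such a loop is that expensive simulations are spent only on candidates the surrogate ranks highly; the challenges the abstract raises (accumulated prediction error across many specifications and the cost of model training) arise precisely in this ranking and model-construction stage.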
