Abstract

In this paper, a new look-up table (LUT) method is proposed to reduce the simulation time and the run-time memory requirement of large logic and mixed-signal simulations. In the proposed method, for the first time, a circuit containing multiple devices is replaced by a single LUT model, called a circuit LUT. This replacement significantly reduces the run-time memory requirement. It also reduces the number of interpolation steps performed at every Newton–Raphson iteration during the simulation, which significantly reduces the simulation time. With the proposed method, the simulation speed is improved by a factor of two over conventional device-level LUT models, and a 25% reduction in the run-time memory requirement is achieved.
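To make the mechanism concrete, the sketch below shows how a single circuit-level LUT lookup might be used inside a Newton–Raphson iteration: the table returns the subcircuit's terminal current and its slope in one interpolation step, in place of one interpolation per device. This is a minimal illustration, not the paper's implementation; the table grid, the stand-in I–V data, and the names `lut_eval`, `v_grid`, and `i_grid` are all assumptions made for the example.

```python
import numpy as np

# Hypothetical circuit-level LUT: a precomputed I-V table for a whole
# subcircuit rather than one table per device. The grid and values are
# illustrative stand-ins, not data from the paper.
v_grid = np.linspace(0.0, 1.2, 25)               # terminal voltage samples
i_grid = 1e-3 * (np.exp(v_grid / 0.3) - 1.0)     # stand-in subcircuit I(V)

def lut_eval(v):
    """Linearly interpolate current and slope from the circuit LUT.

    One lookup per Newton-Raphson iteration replaces a lookup for every
    device in the subcircuit, which is the source of the claimed speedup.
    """
    k = int(np.clip(np.searchsorted(v_grid, v) - 1, 0, len(v_grid) - 2))
    slope = (i_grid[k + 1] - i_grid[k]) / (v_grid[k + 1] - v_grid[k])
    return i_grid[k] + slope * (v - v_grid[k]), slope

# Newton-Raphson solve of (Vdd - V)/R = I_lut(V) for the node voltage V.
Vdd, R = 1.2, 1e3
v = 0.6                                           # initial guess
for _ in range(20):
    i, di_dv = lut_eval(v)                        # single table lookup
    f = (Vdd - v) / R - i                         # KCL residual at the node
    df = -1.0 / R - di_dv                         # residual derivative
    step = -f / df
    v += step
    if abs(step) < 1e-9:                          # converged
        break

print(f"operating point: V = {v:.4f} V, I = {lut_eval(v)[0] * 1e3:.3f} mA")
```

In a device-level LUT flow, the loop body would instead perform one interpolation per transistor in the subcircuit and sum the results; collapsing those lookups into one table access is what reduces both the per-iteration work and the number of tables held in memory.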
