Abstract

This paper presents a novel substrate modeling technique for the simulation of substrate noise in mixed-signal VLSI systems. The model merges easily with the SPICE simulation netlist for complete pre- and post-layout estimation of substrate noise effects in large mixed-signal VLSI chips. Compared with numerous previous efforts in substrate noise modeling, ranging from finite element methods (FEM) to boundary element methods (BEM), this model, based on a finite sheet-resistance slicing scheme, also incorporates the effect of supply-rail bounce due to bond-wire inductances and provides realistic estimates of substrate noise effects with a high degree of computational efficiency. Substrate noise simulations were carried out in a 0.18 μm TSMC CMOS process using typical process parameters. A differential switched-capacitor sample-and-hold circuit and a linear differential transconductor stage were used to evaluate the performance of the model. Simulation results indicate a typical increase in total harmonic distortion (THD) of at least 6 dB due to substrate noise, which corresponds to a performance loss of about 1 bit of precision. The substrate noise effects are also found to be proportional to the oversampling ratio (i.e., the digital clocking rate relative to the input signal frequency) and to the net number of logic transitions at each register transfer instant in the mixed-signal chip.
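To make the sheet-resistance slicing idea concrete, the sketch below generates a SPICE netlist fragment for a substrate sliced into a uniform grid of square sheets, with adjacent node centers joined by resistors and a bond-wire inductance tying the substrate contact to off-chip ground. The grid dimensions, node naming, sheet resistance, and inductance value are illustrative assumptions, not parameters from the paper.

```python
def substrate_mesh_netlist(rows, cols, r_sheet):
    """Slice the substrate into rows x cols square sheets and return a
    SPICE netlist fragment connecting adjacent sheet centers with resistors.

    For square slices the resistance between adjacent node centers equals
    the sheet resistance (R = r_sheet * L/W with L == W).
    """
    lines = []
    idx = 0
    for r in range(rows):
        for c in range(cols):
            node = f"nsub_{r}_{c}"
            if c + 1 < cols:  # horizontal resistor to the east neighbor
                lines.append(f"Rh{idx} {node} nsub_{r}_{c + 1} {r_sheet}")
                idx += 1
            if r + 1 < rows:  # vertical resistor to the south neighbor
                lines.append(f"Rv{idx} {node} nsub_{r + 1}_{c} {r_sheet}")
                idx += 1
    # Bond-wire inductance from the substrate ground contact to off-chip
    # ground, modeling supply-rail bounce (2 nH is an assumed value).
    lines.append("Lbond nsub_0_0 gnd_ext 2n")
    return "\n".join(lines)

# Example: a 3x3 mesh with an assumed 50-ohm sheet resistance.
print(substrate_mesh_netlist(3, 3, 50))
```

Because the output is plain SPICE element lines, the fragment can be appended to a circuit netlist directly, which is the ease-of-merger property the abstract highlights.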
