Abstract

The scaling of on-chip interconnect dimensions and high operating frequencies produce transient crosstalk between coupled interconnect lines, so the estimation of propagation delay and crosstalk noise becomes a critical issue. In this paper, an accurate analytical model is developed using the Finite-Difference Time-Domain (FDTD) method for CMOS-gate-driven coupled RLC interconnect lines. The model is compared against HSPICE simulations; the transient waveforms match closely, and the average error in crosstalk and delay estimation is within 7%. Traditionally, in crosstalk noise modeling the CMOS driver is modeled as a linear resistor. However, it is observed that during the input transition the transistor operates in the saturation region as well as the linear region, spending about 50% of the transition time in saturation. Assuming that the transistor operates only in the linear region during the input transition therefore leads to severe errors in noise modeling. Kaushik and Sarkar [B. K. Kaushik and S. Sarkar, IEEE Trans. 27, 1150–1154, 2008] proposed a crosstalk-analysis model that accounts for the non-linear behavior of the CMOS driver, but that model is limited to two coupled lines.
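The abstract does not spell out the FDTD discretization itself. As a rough illustration of the kind of update scheme involved, the following is a minimal single-line sketch of a leapfrog FDTD solution of the lossy telegrapher's equations, using assumed per-unit-length R, L, C values and, for simplicity, the linear-resistor driver model that the paper argues is inaccurate; the paper's actual model covers coupled lines with a non-linear CMOS driver.

```python
import math

def fdtd_rlc_line(R=3.0e4, L=1.0e-6, C=2.0e-10, length=2.0e-3,
                  Rs=100.0, CL=10e-15, Vdd=1.0, nz=50, nsteps=2000):
    """Leapfrog FDTD solution of the lossy telegrapher's equations
    for a single driver-line-load system (illustrative sketch only).

    R, L, C : assumed per-unit-length line parasitics (ohm/m, H/m, F/m)
    Rs      : linear driver resistance (the simplification the paper
              argues against; used here only as a baseline)
    CL      : lumped load capacitance at the far end
    Returns the far-end voltage after nsteps time steps.
    """
    dz = length / nz
    dt = 0.9 * dz * math.sqrt(L * C)      # Courant-limited time step
    V = [0.0] * (nz + 1)                  # voltages at full-grid nodes
    I = [0.0] * nz                        # currents at half-grid nodes
    b1 = L / dt + R / 2.0                 # semi-implicit series-loss terms
    b2 = L / dt - R / 2.0
    c_end = C * dz / 2.0                  # half-cell capacitance at ends
    for _ in range(nsteps):
        # current update: L di/dt + R i = -dV/dx, loss averaged in time
        for k in range(nz):
            I[k] = (b2 * I[k] - (V[k + 1] - V[k]) / dz) / b1
        # interior voltage update: C dV/dt = -di/dx
        for k in range(1, nz):
            V[k] -= dt / (C * dz) * (I[k] - I[k - 1])
        # near end: KCL at the node fed by a step source through Rs,
        # with the resistor term treated implicitly for stability
        V[0] = (V[0] + dt / c_end * (Vdd / Rs - I[0])) \
               / (1.0 + dt / (Rs * c_end))
        # far end: half-cell capacitance in parallel with the load CL
        V[nz] += dt / (c_end + CL) * I[nz - 1]
    return V[nz]
```

With no DC load current, the far-end voltage should settle toward the supply value once the reflections damp out, which gives a simple sanity check on the scheme.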
