The necessity and consequences of modeling driver and load nonlinearity in on-chip global interconnect noise verification
Abstract
The verification of noise in on-chip global interconnect is performed through simulation of an electrical circuit consisting of a network of coupled transmission lines terminated by appropriate models of the drivers (transmitters) and loads (receivers). The current methodology uses linearized models of the terminations, thus requiring only linear circuit simulations. In this study, we show that while a linear noise analysis methodology that relies on linearizing the termination models is very efficient and convenient, it may result in a significant loss of accuracy and/or in excessively conservative designs. We identify the situations in which modeling the nonlinearity of the terminations becomes a determining factor in the accuracy of the analysis. We also study the implications of adopting a fully nonlinear analysis methodology and propose a practical compromise.