The maximum entropy noise under a lag p autocorrelation constraint is known, by Burg's theorem, to be the pth-order Gauss-Markov process satisfying these constraints. The question is: what is the worst additive noise for a communication channel under these constraints? Is it the maximum entropy noise? The problem becomes one of extremizing the mutual information over all noise processes whose covariances satisfy the correlation constraints R_0, ..., R_p. For high signal powers, the worst additive noise is Gauss-Markov of order p, as expected. But for low powers, the worst additive noise is Gaussian with a covariance matrix in a convex set that depends on the signal power.
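
To make the extremization concrete, the following is a hedged sketch, in LaTeX, of the minimax problem alluded to above; the blocklength n, the per-symbol signal power P, and the independence of signal and noise are standard assumptions of this setting rather than details quoted from the text.

% Sketch under assumed notation: the noise Z^n must have autocorrelations R_0, ..., R_p,
% and the signal X^n, independent of Z^n, is limited to average power P per symbol.
\[
  \min_{\substack{Z^n:\ \mathbb{E}[Z_i Z_{i+k}] = R_k,\ k = 0,\dots,p}}
  \;\max_{\substack{X^n \perp Z^n:\ \frac{1}{n}\,\mathbb{E}\|X^n\|^2 \le P}}
  \;\frac{1}{n}\, I\bigl(X^n;\, X^n + Z^n\bigr)
\]

Read this way, the abstract's claim is that the inner maximization is achieved by a Gaussian input, while the minimizing noise is pth-order Gauss-Markov at high P but, at low P, is a Gaussian whose covariance lies in a power-dependent convex set.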