This paper is concerned with the optimal transmission of a non-Gaussian signal (a non-Gaussian message) through a channel with Gaussian white noise. The coding is assumed to be linear in the signal but nonlinear in the observation (the channel output), and the signal is assumed to be a square-integrable process independent of the noise. It is shown that the transmission maximizing the mutual information between the signal and the observation is achieved by a coding scheme that sends the least-squares estimation error of the signal multiplied by a deterministic coefficient. Specifically, for a fixed deterministic coefficient, the estimation-error process carries the same mutual information as the original signal, and when it is sent, the mean power of the encoded signal is minimized over all codings with the same mutual information. Conversely, the mutual information is maximized by choosing the coefficient so that the mean power of the encoded signal attains the maximum admissible value. It follows that the well-known result for Gaussian signals remains valid for square-integrable non-Gaussian signals.
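The scheme described above can be sketched in standard continuous-time notation (the symbols below are illustrative assumptions, not the paper's own): the channel output is driven by the scaled estimation error, and the deterministic coefficient is chosen to saturate the power constraint.

```latex
% Illustrative sketch (assumed notation): the encoded input is the
% least-squares estimation error scaled by a deterministic a(t).
\[
  dY_t = a(t)\bigl(X_t - \hat X_t\bigr)\,dt + dW_t,
  \qquad
  \hat X_t = \mathbb{E}\bigl[X_t \mid Y_s,\ s \le t\bigr],
\]
% where W is a standard Brownian motion (Gaussian white noise) and
% a(t) is deterministic. Under the mean-power constraint
\[
  a(t)^2\,\mathbb{E}\bigl[(X_t - \hat X_t)^2\bigr] \le P,
\]
% choosing a(t) so that equality holds maximizes the mutual information:
\[
  I(X;Y)
  = \tfrac12 \int_0^T a(t)^2\,
    \mathbb{E}\bigl[(X_t - \hat X_t)^2\bigr]\,dt
  = \tfrac{PT}{2},
\]
% the same value attained in the classical Gaussian-signal case.
```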