Searches for continuous-wave gravitational radiation in data collected by modern long-baseline interferometers, such as the Laser Interferometer Gravitational-wave Observatory (LIGO), the Virgo interferometer, and the Kamioka Gravitational Wave Detector (KAGRA), can be memory intensive. A digitization scheme is described that reduces the 64-bit interferometer output to a one- or two-bit data stream while minimizing distortion and achieving a considerable reduction in storage and input/output cost. For the representative example of the coherent, maximum-likelihood matched filter known as the F-statistic, it is found using Monte Carlo simulations that the injected signal needs to be only ≈24% stronger (for one-bit data) and ≈6.4% stronger (for two-bit data with optimal thresholds) than a 64-bit signal in order to be detected with 90% probability in Gaussian noise. The foregoing percentages do not change significantly when the signal frequency decreases secularly, or when the noise statistics are not Gaussian, as verified with LIGO Science Run 6 data.
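
The digitization step itself is simple to illustrate. The following is a minimal sketch, not the paper's implementation: one-bit digitization retains only the sign of each sample, while two-bit digitization additionally records whether the sample magnitude exceeds a threshold, here expressed in units of the noise standard deviation. The threshold value and the output levels in the sketch are illustrative placeholders, not the optimal values referred to in the abstract.

```python
import numpy as np

def digitize_one_bit(x):
    """One-bit digitization: keep only the sign of each sample."""
    return np.where(x >= 0.0, 1.0, -1.0)

def digitize_two_bit(x, threshold=1.0, low=1.0, high=3.0):
    """Two-bit digitization with symmetric thresholds at +/- threshold.

    Samples with |x| < threshold map to +/- low; samples with
    |x| >= threshold map to +/- high. The threshold and the levels
    (low, high) are illustrative assumptions, not the paper's
    optimal thresholds.
    """
    magnitude = np.where(np.abs(x) < threshold, low, high)
    return np.sign(x) * magnitude

# Usage sketch: digitize unit-variance Gaussian noise as a stand-in
# for whitened 64-bit detector output.
rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)
one_bit = digitize_one_bit(x)
two_bit = digitize_two_bit(x, threshold=1.0)  # threshold in units of the noise sigma
```

Because each one-bit sample can be packed into a single bit (and each two-bit sample into two), the storage and input/output cost drops by roughly a factor of 64 or 32 relative to the original 64-bit stream, which is the reduction the abstract refers to.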