The full precision Cramér-Rao lower bound (CRLB), which assumes no quantization, is often employed to evaluate and compare distributed estimation performance, even though the sensor observations are quantized before any further processing. However, because it completely disregards quantization and often does not exist when the sensor observation noise is bounded, the full precision CRLB is frequently too optimistic or not applicable. In this work, we determine the performance limit of a distributed estimation system with identical one-bit quantizers, using the minimax CRLB as the performance metric. We derive the performance limit that a distributed estimation scheme with identical quantizers can achieve, together with the set of optimal noise distribution functions and quantizers. Compared with the full precision CRLB, this performance limit is shown to be a much tighter bound when the parameter range is relatively large, and it reveals the important role of the quantization system.
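
As an illustrative sketch (not taken from the paper) of why the full precision CRLB is optimistic under one-bit quantization, consider a single-sensor model in which the observation $x = \theta + w$ is quantized to the bit $b = \mathbf{1}\{x > \tau\}$; here the noise pdf $f$, cdf $F$, threshold $\tau$, variance $\sigma^2$, and sensor count $N$ are assumed symbols introduced only for this example.
\[
  p(\theta) = \Pr(b = 1) = 1 - F(\tau - \theta), \qquad
  I_q(\theta) = \frac{\bigl[f(\tau - \theta)\bigr]^2}{F(\tau - \theta)\bigl(1 - F(\tau - \theta)\bigr)}, \qquad
  \mathrm{CRLB}_q(\theta) = \frac{1}{N\, I_q(\theta)} .
\]
% Assumed example: for zero-mean Gaussian noise with variance \sigma^2 and the
% best-case threshold \tau = \theta, one obtains I_q = 2/(\pi\sigma^2), so that
% CRLB_q = \pi\sigma^2/(2N), whereas the full precision CRLB is \sigma^2/N.
Under these assumptions the quantized bound exceeds the full precision bound by a factor of $\pi/2$ even at the best threshold, and the gap widens rapidly as $|\tau - \theta|$ grows, which is why disregarding quantization over a large parameter range is misleading.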