The one-bit quanta image sensor (QIS) is a photon-counting device that produces binary measurements, where each bit indicates the presence or absence of a photon. The sensor quantizes the analog voltage into binary bits using a threshold value <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">${q}$ </tex-math></inline-formula>. The average number of ones in the bitstream is known as the bit density, which is a sufficient statistic for signal estimation. An intriguing phenomenon is observed when the quanta exposure is at unity and the threshold is <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">${q} = {0.5}$ </tex-math></inline-formula>. The bit density is insensitive to read noise as long as the read-noise level does not exceed a certain limit; in other words, the bit density remains constant, independent of the amount of read noise. This article provides a mathematical explanation of the phenomenon by deriving the conditions under which it occurs. The insensitivity is found to hold when certain symmetries of the underlying Poisson–Gaussian distribution are satisfied.
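As an illustration of the phenomenon described above (my own sketch, not code from the article), the following computes the bit density of a one-bit QIS under a Poisson–Gaussian model: a photon count Y ~ Poisson(λ) plus Gaussian read noise of standard deviation σ is thresholded at q. At quanta exposure λ = 1 the Poisson masses P(Y = 0) and P(Y = 1) are equal (both e⁻¹), so probability leaking upward across q = 0.5 from Y = 0 cancels the probability leaking downward from Y = 1, and the bit density stays pinned at 1 − e⁻¹ for moderate σ. The function name and parameterization are mine.

```python
import math

def bit_density(lam, q, sigma, kmax=30):
    """P(Y + N > q) with Y ~ Poisson(lam) and N ~ Normal(0, sigma^2).

    Sums over photon counts k, weighting each Poisson mass by the
    Gaussian upper-tail probability Q((q - k) / sigma).
    """
    total = 0.0
    for k in range(kmax + 1):
        p_k = math.exp(-lam) * lam**k / math.factorial(k)
        # Q(x) = 0.5 * erfc(x / sqrt(2)) is the standard normal upper tail.
        total += p_k * 0.5 * math.erfc((q - k) / (sigma * math.sqrt(2.0)))
    return total

# At lam = 1, q = 0.5 the bit density barely moves as read noise grows:
ideal = 1.0 - math.exp(-1.0)  # noiseless bit density at unity exposure
for sigma in (0.05, 0.10, 0.20, 0.30):
    d = bit_density(lam=1.0, q=0.5, sigma=sigma)
    print(f"sigma={sigma:.2f}  bit density={d:.6f}  |delta|={abs(d - ideal):.2e}")
```

The cancellation breaks once the symmetry does: at λ = 2, for example, P(Y = 0) ≠ P(Y = 1), and the computed bit density drifts with σ.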