We report a theoretical study of phase diffusion in a gain-switched single-mode semiconductor laser. We analyze the phase statistics using stochastic rate equations for the electric field, whose use avoids the instabilities that appear when rate equations for the photon number and optical phase are integrated at small photon numbers. However, we show that a new problem arises when the field equations are integrated: the variance of the optical phase diverges. This divergence cannot be observed with the numerical integration of the commonly used equations for the photon number and optical phase because of the aforementioned instabilities. Divergence of the phase variance means that this quantity does not converge to a fixed value as the integration time step is decreased; instead, we find that the phase variance keeps increasing as the time step decreases, with no sign of saturation even for very small steps. We explain the divergence by drawing an analogy between our problem and two-dimensional Brownian motion. Its appearance is not surprising: in 1940 Paul L\'evy demonstrated that the variance of the polar angle of a two-dimensional Brownian motion is a divergent quantity. Our results show that stochastic rate equations for the photon number and phase are not appropriate for describing the phase statistics when the photon number is small. Simulations of the stochastic rate equations for the electric field are consistent with L\'evy's result, but they are unphysical because an infinite value is obtained for a quantity that can be measured.
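As an illustration of the underlying mechanism, the following minimal Python sketch (illustrative only, and not the laser rate-equation model studied here) estimates the variance of the unwrapped polar angle of a two-dimensional Brownian motion for decreasing time steps; the function name \texttt{winding\_variance} and all numerical parameters are assumptions made for this example. Under a simple Euler discretization, the estimated variance keeps growing as the step is refined, consistent with L\'evy's result.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

def winding_variance(dt, T=1.0, n_paths=10000, x0=1.0):
    """Variance of the unwrapped polar angle of a 2-D Brownian
    motion started at (x0, 0), using Euler steps of size dt.
    Illustrative sketch; the parameters are arbitrary choices."""
    n_steps = int(T / dt)
    x = np.full(n_paths, x0)
    y = np.zeros(n_paths)
    theta = np.zeros(n_paths)      # continuously tracked angle
    prev = np.arctan2(y, x)
    s = np.sqrt(dt)
    for _ in range(n_steps):
        x += s * rng.standard_normal(n_paths)
        y += s * rng.standard_normal(n_paths)
        cur = np.arctan2(y, x)
        # map each angle increment into [-pi, pi) before summing,
        # so theta accumulates the full winding of the path
        dtheta = (cur - prev + np.pi) % (2.0 * np.pi) - np.pi
        theta += dtheta
        prev = cur
    return theta.var()

for dt in (1e-2, 1e-3, 1e-4):
    print(f"dt = {dt:.0e}: Var[theta(T)] = {winding_variance(dt):.3f}")
\end{verbatim}

The variance printed for successively smaller \texttt{dt} does not settle to a fixed value, mirroring the behavior described above for the numerical integration of the field equations.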