We present a detailed study of an estimator of the HI column density, based on a combination of HI 21cm absorption and HI 21cm emission spectroscopy. This "isothermal" estimate is given by $N_{\rm HI,ISO} = 1.823 \times 10^{18} \int \left[ \tau_{\rm tot} \times {\rm T_B} \right] / \left[ 1 - e^{-\tau_{\rm tot}} \right] {\rm dV}$, where $\tau_{\rm tot}$ is the total HI 21cm optical depth along the sightline and ${\rm T_B}$ is the measured brightness temperature. We have used a Monte Carlo simulation to quantify the accuracy of the isothermal estimate by comparing the derived $N_{\rm HI,ISO}$ with the true HI column density $N_{\rm HI}$. The simulation covers a wide range of sightlines, with gas in different temperature phases placed at random locations along each sightline. We find that the results are statistically insensitive to both the assumed gas temperature distribution and the positions of the different phases along the line of sight. The median value of the ratio of the true HI column density to the isothermal estimate, $N_{\rm HI}/{N_{\rm HI, ISO}}$, lies within a factor of 2 of unity, while the 68.2% confidence intervals lie within a factor of $\approx 3$ of unity, out to high HI column densities, $\le 5 \times 10^{23}$\,cm$^{-2}$ per 1 km s$^{-1}$ channel, and high total optical depths, $\le 1000$. The isothermal estimator thus provides a significantly better measure of the HI column density than other methods, remaining within a factor of a few of the true value even at the highest columns, and should allow us to directly probe the existence of high HI column density gas in the Milky Way.
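The integral above reduces, for discretely sampled spectra, to a channel-by-channel sum of ${\rm T_B}$ weighted by the opacity correction factor $\tau_{\rm tot}/(1 - e^{-\tau_{\rm tot}})$, which tends to unity in the optically thin limit. The following is a minimal sketch of that computation; the function name, array inputs, and channel width `dv` are illustrative assumptions, not part of the paper.

```python
import numpy as np

def nhi_isothermal(tau_tot, t_b, dv):
    """Isothermal HI column density estimate, in cm^-2 (illustrative sketch).

    tau_tot : per-channel total HI 21cm optical depth
    t_b     : per-channel measured brightness temperature, in K
    dv      : channel width, in km/s
    """
    tau = np.asarray(tau_tot, dtype=float)
    tb = np.asarray(t_b, dtype=float)
    # Opacity correction tau / (1 - e^-tau); set to 1 for very small tau
    # to avoid 0/0 in the optically thin limit.
    with np.errstate(divide="ignore", invalid="ignore"):
        corr = np.where(tau > 1e-8, tau / (1.0 - np.exp(-tau)), 1.0)
    # Discrete form of N_HI,ISO = 1.823e18 * Integral[ tau T_B / (1-e^-tau) dV ]
    return 1.823e18 * np.sum(corr * tb) * dv
```

In the optically thin regime the correction factor is unity and the expression reduces to the standard emission-only column density; at $\tau_{\rm tot} = 1$ each channel's contribution is boosted by $1/(1 - e^{-1}) \approx 1.58$.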