Satellite altimetry is currently one of the most widely used techniques for monitoring lake water levels. Raw altimetry measurements are generally delivered in an ellipsoid height system, whereas lake heights are ultimately required in a local level height system so that the lake surface is strictly flat under the physical constraints of both gravity and non-gravity factors. A height system transformation of the altimetry data is therefore always needed. Many previous studies have used orthometric heights only approximately in altimetric data processing; geoid errors, together with other error sources, can introduce significant biases into water level estimates. In this paper, we propose a spatio-temporal regression method to calculate orthometric-height-to-level-height (O2L) differences, which can be used to derive accurate lake levels from satellite altimetry data. In our experiments, we used ICESat-2 data to train the O2L bias correction model and evaluated the algorithm on two large lakes in Central Asia (Lake Issyk-Kul and Lake Baikal) and one lake in North America (Lake Michigan). The results indicate that the spatio-temporal modeling method outperforms the traditional spatial modeling method in regression accuracy, and the water levels derived from the new method agree much more closely with in-situ gauge measurements. O2L bias often introduces a jitter in lake-level time series, so the results of different satellites are not mutually consistent with one another. A visual analysis reveals that the spatio-temporal modeling method can greatly reduce this high-frequency bias.
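The paper's exact regression model is not specified in this abstract; as a minimal illustrative sketch only, a spatio-temporal O2L regression could combine planar spatial terms in longitude/latitude with an annual harmonic in time, fitted by least squares. All function names, the choice of basis terms, and the synthetic coordinates below are assumptions for illustration, not the authors' actual formulation:

```python
import numpy as np

def o2l_design_matrix(lon, lat, t):
    """Design matrix for a hypothetical spatio-temporal O2L model:
    constant + planar terms in lon/lat + annual sine/cosine terms
    (t in fractional years)."""
    return np.column_stack([
        np.ones_like(lon),            # constant offset
        lon, lat,                     # planar spatial trend
        np.sin(2 * np.pi * t),        # annual harmonic (sine)
        np.cos(2 * np.pi * t),        # annual harmonic (cosine)
    ])

def fit_o2l_model(lon, lat, t, o2l):
    """Fit model coefficients to observed O2L differences by least squares."""
    A = o2l_design_matrix(lon, lat, t)
    coef, *_ = np.linalg.lstsq(A, o2l, rcond=None)
    return coef

def predict_o2l(coef, lon, lat, t):
    """Predict O2L differences at new locations/epochs."""
    return o2l_design_matrix(lon, lat, t) @ coef

# Illustrative usage with synthetic points (coordinates roughly near Issyk-Kul):
rng = np.random.default_rng(0)
lon = rng.uniform(76.0, 78.0, 200)
lat = rng.uniform(42.0, 43.0, 200)
t = rng.uniform(0.0, 3.0, 200)
true_coef = np.array([0.5, 0.02, -0.03, 0.01, 0.005])
o2l_obs = predict_o2l(true_coef, lon, lat, t)
coef = fit_o2l_model(lon, lat, t, o2l_obs)
```

A purely spatial model would drop the harmonic terms; the temporal terms are what allow the correction to track the time-varying component of the bias described in the abstract.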