As the number of satellite networks increases, the radio spectrum is becoming more congested, prompting the need to explore higher frequencies. However, operating at higher frequencies is more difficult due to severe impairments caused by varying atmospheric conditions. Hence, radio channel forecasting is crucial for operators to adjust and maintain link quality. This paper presents a practical approach to Q/V-band channel modeling for low Earth orbit (LEO) satellites based on tools from machine learning and statistical modeling. The developed Q/V-band LEO satellite channel model is composed of: (i) a forecasting method using model-based deep learning, intended for real-time operation of satellite terminals, and (ii) a statistical channel simulator that generates a time-series path-loss random process, intended for system design and research. Both approaches capitalize on real measurements obtained from AlphaSat’s Q/V-band transmitter at different geographic latitudes. The results show that model-based deep learning can outperform simple statistical and deep learning methods by at least 50%. Moreover, the model is capable of incorporating varying rain and elevation-angle profiles.