Land surface temperature (LST) is an important parameter for characterizing the water–heat balance at the Earth's surface. Remote sensing provides a unique means of monitoring LST over large spatial areas. Thermal infrared (TIR)-derived LST has high spatial resolution and accuracy, but missing values caused by clouds hinder its applications. Passive microwave radiation can penetrate clouds, but the resulting data have lower spatial resolution and accuracy. The Bayesian maximum entropy (BME) method was used to blend Moderate Resolution Imaging Spectroradiometer (MODIS) LST and Advanced Microwave Scanning Radiometer-Earth Observing System (AMSR-E) LST to produce a spatially complete, accurate, and high-spatial-resolution LST over the Tibetan Plateau (TP) and the Heihe River Basin (HRB). The accuracy of the BME method was validated against adjusted soil temperatures collected in two verification regions. The root-mean-square errors (RMSEs) are less than 3.54 K and 4.89 K over the relatively flat verification region during nighttime and daytime, respectively. Over the other verification region, which has rugged terrain, the RMSE is less than 3 K during nighttime and ranges from 4.2 to 8.29 K during daytime. The RMSE of the blended LST is significantly lower than that reported in a recent study for AMSR-E LST retrieved at nighttime, and comparable to that for AMSR-E LST retrieved at daytime. The BME method was then applied to the MODIS LST and AMSR-E LST acquired on November 30, 2010, over the TP and HRB. The spatial completeness of the blended LST reached 100%, and its spatial pattern was generally consistent with those of the MODIS LST and AMSR-E LST. This paper demonstrates the utility of the BME method for generating all-weather regional LST from TIR and microwave LST products.
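To make the blending idea concrete, the following is a minimal, hypothetical sketch rather than the implementation used in the paper: in the Gaussian soft-data case, the BME posterior mean takes a kriging-type form with heteroscedastic observation noise, so clear-sky TIR pixels can enter as near-exact "hard" data and coarse microwave footprints as noisier "soft" data. The exponential covariance model, the noise standard deviations (0.5 K for MODIS, 2 K for AMSR-E), the constant mean, and the synthetic field below are all illustrative assumptions, not values from the paper.

```python
import numpy as np

def exp_cov(xy_a, xy_b, sill=4.0, corr_len=50.0):
    """Exponential covariance C(h) = sill * exp(-h / corr_len), h in km."""
    h = np.linalg.norm(xy_a[:, None, :] - xy_b[None, :, :], axis=-1)
    return sill * np.exp(-h / corr_len)

def bme_gaussian_blend(xy_obs, lst_obs, noise_var, xy_est, mean=270.0):
    """Posterior mean and variance of LST at the estimation points xy_est.

    noise_var holds one variance per observation: near zero for TIR
    "hard" data, larger for microwave "soft" data (the Gaussian
    soft-data case, in which the BME estimate reduces to this
    kriging-type closed form).
    """
    c_oo = exp_cov(xy_obs, xy_obs) + np.diag(noise_var)
    c_eo = exp_cov(xy_est, xy_obs)
    post_mean = mean + c_eo @ np.linalg.solve(c_oo, lst_obs - mean)
    post_var = (exp_cov(xy_est, xy_est).diagonal()
                - np.einsum("ij,ji->i", c_eo, np.linalg.solve(c_oo, c_eo.T)))
    return post_mean, post_var

# Toy example: clear-sky MODIS pixels (hard data) and AMSR-E footprints
# (soft data) fill a cloud-contaminated location at (25, 25) km.
rs = np.random.default_rng(0)
xy_modis = rs.uniform(0.0, 100.0, size=(40, 2))   # clear-sky TIR pixels
xy_amsre = rs.uniform(0.0, 100.0, size=(10, 2))   # coarse microwave footprints

def truth(xy):
    return 268.0 + 0.05 * xy[:, 0]                # synthetic LST field (K)

xy_obs = np.vstack([xy_modis, xy_amsre])
lst_obs = np.concatenate([truth(xy_modis) + rs.normal(0.0, 0.5, 40),
                          truth(xy_amsre) + rs.normal(0.0, 2.0, 10)])
noise_var = np.concatenate([np.full(40, 0.5**2), np.full(10, 2.0**2)])
m, v = bme_gaussian_blend(xy_obs, lst_obs, noise_var, np.array([[25.0, 25.0]]))
print(f"blended LST: {m[0]:.2f} K (+/- {np.sqrt(v[0]):.2f} K)")
```

In a full BME workflow the soft data could instead be intervals or arbitrary probability densities and the covariance model would be fitted to the observations; the Gaussian case is shown here only because it admits a closed-form estimator.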