Landsat thermal data are employed to derive lake and sea surface temperatures. The limitations of this approach are obvious, since the calculation of surface temperatures based solely on image data requires at least two thermal bands to compensate for the atmospheric influence, which is mainly caused by water vapour absorption. However, the 1 km spatial resolution of currently available multi-band thermal satellite sensors (NOAA-AVHRR, MODIS) is often not adequate for lake and coastal-zone applications. It is therefore worthwhile to investigate the accuracy that can be obtained with single-band thermal data using radiosonde measurements of the atmospheric water vapour column from meteorological stations in the study area. In addition, standard atmospheres from the MODTRAN code, which are based on seasonal climatological values of water vapour (e.g. mid-latitude summer, mid-latitude winter), were considered. The study area comprises various lakes and coastal zones of the Baltic Sea in NE Germany. Landsat-7 ETM+ imagery from nine acquisition dates between February and November 2000 was selected. The derived lake and sea surface temperatures were compared with in situ measurements and with an empirical model of the Deutscher Wetterdienst (Germany's National Meteorological Service, DWD). RMS deviations of 1.4 K were obtained for the satellite-derived lake surface temperatures with respect to in situ measurements and 2.2 K with respect to the empirical DWD model; RMS deviations of 1.6 K were obtained with respect to in situ bulk temperatures in coastal zones of the Baltic Sea. This level of agreement can be considered satisfactory given the principal constraints of the approach. Better accuracy can only be achieved with high spatial resolution (<100 m) multi-band thermal instruments delivering imagery on an operational basis.
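To make the single-band approach concrete, the sketch below illustrates, under stated assumptions, how an at-sensor ETM+ band 6 radiance could be converted to a water surface temperature and compared against in situ data: the single-channel radiative transfer equation is inverted using an atmospheric transmittance and up-/downwelling path radiances (which in a study such as this would come from MODTRAN runs driven by radiosonde water vapour profiles or a seasonal standard atmosphere), and the Planck function is inverted with the published ETM+ band 6 constants K1/K2. The function names, emissivity, atmospheric terms, and sample radiances are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# Landsat-7 ETM+ band 6 thermal calibration constants (published values)
K1 = 666.09   # W / (m^2 sr um)
K2 = 1282.71  # K


def surface_temperature(L_sensor, tau, L_up, L_down, emissivity=0.99):
    """Single-band surface temperature from at-sensor thermal radiance.

    Inverts the radiative transfer equation
        L_sensor = tau * (eps * B(Ts) + (1 - eps) * L_down) + L_up
    for the surface-leaving Planck radiance B(Ts), then inverts the Planck
    function using the K1/K2 approximation. tau, L_up and L_down would be
    supplied by MODTRAN (radiosonde profile or standard atmosphere).
    """
    B_surface = (L_sensor - L_up - tau * (1.0 - emissivity) * L_down) / (tau * emissivity)
    return K2 / np.log(K1 / B_surface + 1.0)


def rms_deviation(satellite_T, insitu_T):
    """RMS deviation between satellite-derived and in situ temperatures (K)."""
    satellite_T = np.asarray(satellite_T, dtype=float)
    insitu_T = np.asarray(insitu_T, dtype=float)
    return float(np.sqrt(np.mean((satellite_T - insitu_T) ** 2)))


if __name__ == "__main__":
    # Hypothetical at-sensor radiances over water pixels and assumed
    # atmospheric terms for a mid-latitude summer atmosphere.
    L_sensor = np.array([7.9, 8.1, 8.3])        # W / (m^2 sr um)
    tau, L_up, L_down = 0.80, 1.6, 2.6          # assumed MODTRAN output

    T_sat = surface_temperature(L_sensor, tau, L_up, L_down)
    T_insitu = np.array([289.5, 290.2, 293.0])  # hypothetical bulk temperatures (K)

    print("Satellite-derived T (K):", np.round(T_sat, 2))
    print("RMS deviation (K):", round(rms_deviation(T_sat, T_insitu), 2))
```

The `rms_deviation` helper mirrors the kind of comparison reported in the abstract (satellite-derived surface temperatures versus in situ bulk temperatures); the numerical values above are placeholders rather than results from the study.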