Abstract
Data from geosynchronous Earth-orbiting (GEO) satellites equipped with visible (VIS) and infrared (IR) scanners are commonly used in rain retrieval algorithms. These algorithms benefit from the high spatial and temporal resolution of GEO observations, either in stand-alone mode or in combination with higher-quality but less frequent microwave observations from low Earth-orbiting (LEO) satellites. In this paper, a neural network–based framework is presented to evaluate the utility of multispectral information in improving rain/no-rain (R/NR) detection. The algorithm uses the powerful classification features of the self-organizing feature map (SOFM), along with probability matching techniques, to map single- or multispectral input space into R/NR maps. The framework was tested and validated using the 31 possible combinations of the five Geostationary Operational Environmental Satellite 12 (GOES-12) channels. An algorithm training and validation study was conducted over the conterminous United States during June–August 2006. The results indicate that during daytime, the visible channel (0.65 μm) can yield significant improvements in R/NR detection capabilities, especially when combined with any of the other four GOES-12 channels. Similarly, for nighttime detection the combination of two IR channels—particularly channels 3 (6.5 μm) and 4 (10.7 μm)—resulted in significant performance gain over any single IR channel. In both cases, however, using more than two channels resulted only in marginal improvements over two-channel combinations. Detailed examination of event-based images indicates that the proposed algorithm is capable of extracting information useful to screen out no-rain pixels associated with cold, thin clouds and to identify rain areas under warm but rainy clouds. Both cases have been problematic for IR-only algorithms.
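As a rough illustration of the approach described in the abstract, the sketch below trains a small one-dimensional self-organizing feature map on synthetic two-channel "spectral" vectors and then attaches a rain probability to each map node by counting the rainy training pixels it wins (a crude stand-in for the probability matching step). All data, map sizes, and parameters here are invented for illustration; the actual algorithm operates on GOES-12 channels with radar-based training labels.

```python
import math
import random

random.seed(0)

def bmu(nodes, x):
    """Index of the best-matching unit (nearest node) for input x."""
    return min(range(len(nodes)),
               key=lambda i: sum((nodes[i][d] - x[d]) ** 2 for d in range(len(x))))

def train_sofm(samples, n_nodes=8, epochs=30, lr0=0.5, radius0=2.0):
    """Train a 1-D self-organizing feature map on 2-channel inputs."""
    nodes = [[random.random(), random.random()] for _ in range(n_nodes)]
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)
        radius = max(radius0 * (1.0 - epoch / epochs), 0.5)
        for x in samples:
            b = bmu(nodes, x)
            for i in range(n_nodes):
                # Gaussian neighborhood pulls nodes near the winner toward x
                h = math.exp(-((i - b) ** 2) / (2.0 * radius ** 2))
                for d in range(2):
                    nodes[i][d] += lr * h * (x[d] - nodes[i][d])
    return nodes

def node_rain_prob(nodes, samples, labels):
    """Fraction of rainy training pixels mapped to each node."""
    counts = [[0, 0] for _ in nodes]  # [rain hits, total hits] per node
    for x, y in zip(samples, labels):
        b = bmu(nodes, x)
        counts[b][0] += y
        counts[b][1] += 1
    return [r / t if t else 0.5 for r, t in counts]

# Synthetic "spectral" pixels: a rainy cluster near (0.2, 0.2) and a
# no-rain cluster near (0.8, 0.8); labels are 1 = rain, 0 = no rain.
pixels = ([[0.2 + random.gauss(0, 0.03), 0.2 + random.gauss(0, 0.03)] for _ in range(100)]
          + [[0.8 + random.gauss(0, 0.03), 0.8 + random.gauss(0, 0.03)] for _ in range(100)])
labels = [1] * 100 + [0] * 100

som = train_sofm(pixels)
probs = node_rain_prob(som, pixels, labels)

def classify(x, threshold=0.5):
    """R/NR decision: threshold the rain probability of the winning node."""
    return probs[bmu(som, x)] >= threshold

print(classify([0.2, 0.2]), classify([0.8, 0.8]))
```

In the full framework the same idea extends to up to five channels per pixel and a two-dimensional map; the node-level probabilities are what allow a calibrated R/NR threshold rather than a hard cluster assignment.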
Highlights
Significant advances in rainfall estimation from satellite observations have been achieved in recent years
Both geosynchronous Earth-orbiting (GEO) satellites equipped with visible (VIS) and infrared (IR) scanners and low Earth-orbiting (LEO) satellites equipped with passive microwave (PMW) sensors provide observations that are commonly used for rainfall retrieval
We present an algorithm that uses multiple spectral channels to delineate rain/no-rain (R/NR) areas
Summary
Significant advances in rainfall estimation from satellite observations have been achieved in recent years. With improvements in retrieval algorithms and processing power, satellite-based precipitation estimates are moving toward increasingly finer spatial and temporal resolutions. While this provides unprecedented opportunities for new hydrological and meteorological applications, it also brings the additional challenge of satisfying the demand for high accuracy at the scales relevant to such applications.

Next to the thermal IR (~11 μm) channel, the VIS channel, which provides an indirect measure of cloud thickness, is the second most commonly used band in GEO-based precipitation retrieval algorithms. Techniques that use both infrared and visible images to delineate rain and no-rain areas go back to the 1970s. Kurino (1997) reported that image pixels where BTD(11 μm, 12 μm) is greater than or equal to 3 K correspond to cirrus clouds with no rain, while areas where BTD(11 μm, 6.7 μm) is less than or equal to 0 K correspond to deep convective clouds with heavy rain. Using these three channels (11, 12, and 6.7 μm) along with composite digital radar data, he calculated three-dimensional (3D) lookup tables of probability of rain and mean rain rate to estimate both "deep" and "shallow" precipitation rates. A later study selected four parameters, namely the radiance ratio of the 0.6 and 1.6 μm channels, BTD(11 μm, 12 μm), BTD(3.8 μm, 11 μm), and Tb11, and suggested a number of thresholds for delineating rain areas, demonstrating the superiority of using multispectral information.
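The Kurino (1997) brightness temperature difference (BTD) thresholds above can be sketched as a simple per-pixel screen. The function name and the example brightness temperatures below are illustrative; only the 3 K and 0 K thresholds come from the text, and the full scheme falls back to 3D lookup tables for pixels the thresholds leave undetermined.

```python
def classify_pixel(tb11, tb12, tb67):
    """Kurino-style rain/no-rain screen from brightness temperatures (K).

    BTD(11 um, 12 um)  >= 3 K -> cirrus cloud, no rain
    BTD(11 um, 6.7 um) <= 0 K -> deep convection, heavy rain
    Otherwise the pixel is left undetermined (handled by lookup
    tables of rain probability in the full scheme).
    """
    btd_11_12 = tb11 - tb12
    btd_11_67 = tb11 - tb67
    if btd_11_12 >= 3.0:
        return "no-rain (cirrus)"
    if btd_11_67 <= 0.0:
        return "heavy rain (deep convection)"
    return "undetermined"

# Illustrative brightness temperatures (K), not real GOES data:
print(classify_pixel(220.0, 215.0, 230.0))  # BTD(11,12) = 5 K
print(classify_pixel(200.0, 199.0, 205.0))  # BTD(11,6.7) = -5 K
```

Note that the cirrus test is applied first: a thin cold cirrus deck can satisfy both differences, and screening it out before the convection test is what lets the two-channel difference reject cold no-rain clouds that a single IR threshold would misclassify as raining.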