Abstract

The mean annual flash density, thunderstorm duration, and flash rates were calculated using 121.7 million cloud-to-ground lightning flashes in the continental United States for the period 1989–96. Florida had flash densities over 11 flashes km⁻² yr⁻¹, while the Midwest, Oklahoma, Texas, and the Gulf Coast had densities greater than 7 flashes km⁻² yr⁻¹. There was a relative minimum in flash density (3 flashes km⁻² yr⁻¹) in the Appalachian Mountains and Missouri. Thunderstorm duration values exceeded 120 h yr⁻¹ in Florida and 105 h yr⁻¹ in New Mexico, Arizona, and the Gulf Coast. The maximum annual flash rates exceeded 45 flashes h⁻¹ in the Midwest, along the Florida coasts, and along the mid-Atlantic coast, with the minimum flash rates, 15 flashes h⁻¹, over the Appalachian and Rocky Mountains. The relationship between thunderstorm duration and flash density is Flash_Density = 0.024(Flash_Hours)^1.29, producing expected flash densities that are within 30% of the measured densities for over 70% of ...
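As a rough illustration of the power-law relationship quoted above, the sketch below evaluates the expected flash density for a thunderstorm duration of about 120 h yr⁻¹, the value the abstract reports for Florida. The function name and script are illustrative, not part of the paper.

```python
import math


def expected_flash_density(flash_hours: float) -> float:
    """Expected flash density (flashes km^-2 yr^-1) from annual
    thunderstorm duration (h yr^-1), using the abstract's relation
    Flash_Density = 0.024 * Flash_Hours ** 1.29."""
    return 0.024 * flash_hours ** 1.29


if __name__ == "__main__":
    # Florida: thunderstorm durations exceeding roughly 120 h yr^-1.
    density = expected_flash_density(120.0)
    print(f"Expected flash density: {density:.1f} flashes km^-2 yr^-1")
    # Prints about 11.5, consistent with the >11 flashes km^-2 yr^-1
    # reported for Florida in the abstract.
```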
