Summary

Tropospheric attenuation can be significant in the millimeter wave (mmWave) frequency bands; hence, accurate modeling of tropospheric attenuation is important for reliable mmWave communication. Several prediction models have been established by the International Telecommunication Union (ITU), yet their estimation accuracy is limited by the large spatial scales of the model input parameters. In this paper, we address this limitation by applying local precipitation data to analyze tropospheric attenuation statistics and comparing the results with those obtained using the ITU's regional rain data. Specifically, tropospheric attenuation is predicted via simulations using the ITU method at 30, 60, and 90 GHz for four geographically distinct locations with different climate types. From these simulations, we gather statistics for annual average rain attenuation, worst-month rain attenuation, and rain attenuation per decade. Our results indicate that, for a 1 km link distance, using local measured rain data increases mean rain-event attenuation from 0.5 to 2 dB. Local rain data yield larger attenuations at essentially all percentages of time not exceeded (i.e., at essentially all probability values). For example, for 0.1% of time not exceeded in Columbia, SC, the 30 GHz rain attenuation increases to 9 dB with local rain data, compared to 5 dB with the ITU's regional data, corresponding to rain rates of 38.2 and 17.5 mm/h, respectively; at the same probability and location, the 90 GHz attenuation increases by 10 dB, from 10 to 20 dB, when local rain data are used. Fog attenuation is also appreciable, reaching 8 dB at 90 GHz. Moreover, for the example locations, peak rain attenuations have increased at a rate of approximately 2 dB/decade over the past 50 years. Our results indicate that actual tropospheric attenuations may be substantially larger than those predicted by the ITU model with regional rain rate data.
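As background for the comparison above, the ITU rain attenuation method is built on a power-law relation between rain rate and specific attenuation (ITU-R P.838): gamma = k * R^alpha, in dB/km, where k and alpha depend on frequency and polarization. The sketch below illustrates this relation for the two rain rates quoted above; the coefficient values are rough illustrative assumptions, not the official ITU-R P.838 table values, and the full ITU path method also applies an effective-path-length (distance) factor that is omitted here.

```python
def specific_attenuation(rain_rate_mm_h: float, k: float, alpha: float) -> float:
    """ITU-R P.838 power-law specific attenuation, gamma = k * R^alpha (dB/km).

    k and alpha are frequency- and polarization-dependent regression
    coefficients tabulated in ITU-R P.838.
    """
    return k * rain_rate_mm_h ** alpha

def path_attenuation(rain_rate_mm_h: float, k: float, alpha: float, link_km: float) -> float:
    # Simplified sketch: specific attenuation times physical path length.
    # The full ITU method (ITU-R P.530) multiplies by an effective path
    # length / distance factor instead; that step is omitted here.
    return specific_attenuation(rain_rate_mm_h, k, alpha) * link_km

# Illustrative coefficients for ~30 GHz (assumed values, NOT the official table):
k, alpha = 0.24, 0.95
for rain_rate in (17.5, 38.2):  # mm/h: the regional vs. local 0.1% rain rates above
    att = path_attenuation(rain_rate, k, alpha, link_km=1.0)
    print(f"R = {rain_rate:5.1f} mm/h -> {att:.1f} dB over 1 km")
```

The doubling of the 0.1% rain rate from 17.5 to 38.2 mm/h roughly doubles the predicted attenuation under this power law (alpha close to 1), which is consistent with the 5 dB vs. 9 dB comparison reported above.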