Abstract

To fulfill the evolving observational needs of the National Weather Service (NWS), future weather radar systems will have to meet demanding requirements. Designing such systems will likely involve trade-offs between system cost and operational performance. A potential cost driver for future weather radars, and one with significant data-quality implications for forecasters, is the required angular resolution and sidelobe performance, both of which are mainly dictated by the antenna radiation pattern. Typical antenna radiation patterns are characterized by the width of the main lobe and the sidelobe levels, traditionally measured across the azimuthal and elevation dimensions. In this work, we study the impact of increasing sidelobe levels on NWS forecasters' data interpretation during warning operations. The resulting impact model can be used by decision-makers to better understand the cost–benefit trade-offs inherent in any radar system design.
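As a hypothetical illustration (not part of the paper), the two pattern metrics named above can be computed for the textbook case of a uniformly illuminated aperture, whose far-field power pattern is a squared sinc function: the half-power beamwidth sets the angular resolution, and the first sidelobe sits near −13.3 dB relative to the main-lobe peak.

```python
import numpy as np

# Illustrative sketch, assuming a uniformly illuminated aperture whose
# far-field power pattern is sinc^2 in the variable u = (D/lambda)*sin(theta).
u = np.linspace(-10.0, 10.0, 100001)
pattern = np.sinc(u) ** 2                      # np.sinc(x) = sin(pi x)/(pi x)
pattern_db = 10.0 * np.log10(np.maximum(pattern, 1e-12))

# First sidelobe of a uniform aperture peaks near u ~ 1.43, at about -13.3 dB.
sidelobe_region = (u > 1.0) & (u < 2.0)
first_sidelobe_db = pattern_db[sidelobe_region].max()

# Half-power (-3 dB) beamwidth of the main lobe, in units of u.
main_lobe = np.abs(u) < 1.0
hpbw = 2.0 * u[main_lobe][pattern_db[main_lobe] >= -3.0].max()
```

Tapering the aperture illumination lowers the sidelobes at the cost of a wider main lobe, which is precisely the kind of design trade-off the abstract refers to.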