Abstract
To fulfill the evolving observational needs of the National Weather Service (NWS), future weather radar systems will have to meet demanding requirements. Designing such systems will likely involve trade-offs between system cost and operational performance. A potential cost driver for future weather radars, one that could significantly affect the data quality available to forecasters, is the required angular resolution and sidelobe performance, both of which are mainly dictated by the antenna radiation pattern. A typical antenna radiation pattern can be characterized by the width of its main lobe and its sidelobe levels, which are traditionally measured across the azimuthal and elevation dimensions. In this work, we study the impact of increasing sidelobe levels on NWS forecasters' data interpretation during warning operations. The resulting impact model can be used by decision-makers to better understand the cost–benefit trade-offs inherent in any radar system design.