Abstract

Field experiments were conducted from 1996 to 2000 near Manhattan, KS, to determine the effects of application timing on atrazine loss in surface water runoff. In addition, the Groundwater Loading Effects of Agricultural Management Systems (GLEAMS) model was run to compare simulated losses with measured losses in the field. Atrazine treatments were fall plus preemergence (FALL + PRE), early preplant plus PRE (EPP + PRE), PRE at a low rate (PRE-LOW), and PRE at a full (recommended) rate (PRE-FULL). Ridge-till furrows served as mini-watersheds for the collection of surface water runoff, and water runoff volumes and herbicide concentrations were determined for each runoff event. Across four sampling years, mean atrazine runoff loss was 1.7, 4.3, and 1.7% of the amount applied for FALL + PRE, EPP + PRE, and the mean of the PRE treatments, respectively. Measured average losses from the FALL + PRE and EPP + PRE treatments were somewhat higher than those predicted by GLEAMS. For the PRE treatments, measured average losses were significantly lower than those predicted by GLEAMS, falling below the range of GLEAMS-predicted values in 3 of 4 yr. These findings suggest that in certain parts of the Great Plains, FALL + PRE split applications of atrazine offer acceptably low atrazine runoff loss potential; EPP + PRE is more vulnerable to loss than FALL + PRE; and the GLEAMS model may overestimate atrazine runoff potential for PRE applications.
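
As background on how per-event measurements translate into the percent-of-applied figures reported above, the short Python sketch below sums concentration times runoff volume over events and divides by the applied mass. This is a common way such losses are computed; the function name, units, and example numbers are hypothetical and are not taken from the paper.

def atrazine_percent_of_applied(events, applied_g_per_ha):
    # events: iterable of (runoff_mm, concentration_ug_per_L) pairs, one per runoff event
    loss_g_per_ha = 0.0
    for runoff_mm, conc_ug_per_l in events:
        volume_l_per_ha = runoff_mm * 10000.0       # 1 mm of runoff over 1 ha = 10,000 L
        loss_g_per_ha += conc_ug_per_l * volume_l_per_ha * 1e-6   # micrograms to grams
    return 100.0 * loss_g_per_ha / applied_g_per_ha

# Hypothetical season: three runoff events against a 1.5 kg a.i./ha application
events = [(12.0, 85.0), (5.0, 40.0), (20.0, 15.0)]
print(round(atrazine_percent_of_applied(events, 1500.0), 2), "% of applied")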

