Ontogenic changes in soybean radiation use efficiency (RUE) have been attributed to variation in specific leaf nitrogen (SLN) based only on data collected during seed filling. We evaluated this hypothesis using data on leaf area, absorbed radiation (ARAD), aboveground dry matter (ADM), and plant nitrogen (N) concentration collected throughout the entire crop season in seven field experiments conducted in a stress-free environment. Each experiment included a full-N treatment that received ample N fertilizer and a zero-N treatment that relied on N fixation and soil N mineralization. We estimated RUE from changes in ADM between sampling times and the associated ARAD, accounting for changes in biomass composition. RUE and SLN exhibited different seasonal patterns: RUE followed a bell-shaped pattern with a peak around the beginning of seed filling, whereas SLN followed a convex pattern with an abrupt decline during late seed filling. Changes in SLN explained the decline in RUE during seed filling but failed to predict changes in RUE at earlier stages and underestimated the maximum RUE observed during pod setting. Comparison of observed RUE with RUE simulated by a process-based crop simulation model revealed similar discrepancies. The decoupling between RUE and SLN during early crop stages suggests that leaf N exceeds the level needed to maximize crop growth but may play a role in storing N that can be remobilized in later reproductive stages to meet the large seed N demand associated with high-yielding crops.
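A minimal sketch of the interval-based RUE estimate described above, assuming ADM_i denotes aboveground dry matter at sampling time t_i and ARAD_t denotes daily absorbed radiation (notation ours, not from the study):

\[
\mathrm{RUE}_i = \frac{ADM_{i+1} - ADM_i}{\sum_{t = t_i}^{t_{i+1}} ARAD_t}
\]

Per the abstract, the numerator is further adjusted for changes in biomass composition (presumably reflecting the higher construction cost of oil- and protein-rich seed dry matter relative to vegetative tissue); the exact correction is given in the full methods.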