Volume 55, Issue 5 p. 3941-3959
Research Article

A Race Against Time: Modeling Time Lags in Watershed Response

Idhayachandhiran Ilampooranan

Department of Civil and Environmental Engineering, University of Waterloo, Waterloo, Ontario, Canada

K. J. Van Meter

Department of Earth and Environmental Sciences, University of Waterloo, Waterloo, Ontario, Canada

Nandita B. Basu

Corresponding Author

Department of Civil and Environmental Engineering, University of Waterloo, Waterloo, Ontario, Canada

Department of Earth and Environmental Sciences, University of Waterloo, Waterloo, Ontario, Canada

Water Institute, University of Waterloo, Waterloo, Ontario, Canada

Correspondence to: N. B. Basu,

[email protected]

First published: 22 March 2019


Land use change and agricultural intensification have increased food production but at the cost of polluting surface and groundwater. Best management practices implemented to improve water quality have met with limited success. Such lack of success is increasingly attributed to legacy nutrient stores in the subsurface that may act as sources after reduction of external inputs. However, current water-quality models lack a framework to capture these legacy effects. Here we have modified the SWAT (Soil Water Assessment Tool) model to capture the effects of nitrogen (N) legacies on water quality under multiple land-management scenarios. Our new SWAT-LAG model includes (1) a modified carbon-nitrogen cycling module to capture the dynamics of soil N accumulation, and (2) a groundwater travel time distribution module to capture a range of subsurface travel times. Using a 502-km2 Iowa watershed as a case study, we found that between 1950 and 2016, 25% of the total watershed N surplus (N Deposition + Fertilizer + Manure + N Fixation − Crop N uptake) had accumulated within the root zone, 14% had accumulated in groundwater, while 27% was lost as riverine output, and 34% was denitrified. In future scenarios, a 100% reduction in fertilizer application led to a 79% reduction in stream N load, but the SWAT-LAG results suggest that it would take 84 years to achieve this reduction, in contrast to the 2 years predicted in the original SWAT model. The framework proposed here constitutes a first step toward modifying a widely used modeling approach to assess the effects of legacy N on the time required to achieve water-quality goals.

Key Points

  • A novel approach was developed to incorporate nitrate lag times in a commonly used water-quality model for watershed management
  • The SWAT-LAG model showed that lag times to achieve the Nutrient Task Force's recommended 60% nitrate load reduction can vary from 6 to 80 years
  • Greater implementation of new management practices leads to shorter lag times in achieving water-quality goals

Plain Language Summary

For nearly a century, we have used nitrogen fertilizers to boost crop yields. However, the environmental effects of fertilizer use have been severe. Drinking water with high nitrate levels threatens human health, and high nitrogen loads in rivers lead to the creation of dead zones in coastal waters that make it impossible for fish or underwater plants to survive. Although we have tried for decades to reduce nitrogen levels in our waterways, the results have been disappointing. Scientists now believe that improvements may be slow to come because there are large amounts of nitrogen that have accumulated in soil and groundwater—legacy nitrogen—that continue to pollute our rivers even after farmers have reduced fertilizer use or improved management. However, policymakers still struggle to predict how long it will take to improve water quality. In our work, we have created a new model, Soil Water Assessment Tool-LAG, that allows us to predict the time lags caused by legacy nitrogen. Using an agricultural watershed in Iowa as a case study, we show that it can take as long as 80 years to see the full effects of new management practices and that these time lags must be considered when setting policy goals.

1 Introduction

Human modification of the nitrogen (N) cycle has resulted in a twofold increase in the fixation of reactive N compared to preindustrial levels (Galloway et al., 1995). This increase can be primarily attributed to emissions from burning fossil fuels, fertilizer production, and leguminous crop production (Galloway et al., 2004; Vitousek et al., 1997). It is estimated that around 50% of the total inorganic N used thus far has been applied in the last 15 years (Howarth et al., 2002; Townsend et al., 2003), and food produced as a result of inorganic N fertilizer now feeds more than 45% of the world's population (Smil, 2011). High levels of N fertilizer use have significantly perturbed the global N cycle, and it has been argued that planetary boundaries for maintaining human and ecosystem health have been exceeded (Rockström et al., 2009). Increased N flux to coastal and inland waters has accelerated eutrophication, reduced biodiversity through species loss, and significantly reduced the coastal fish catch (Vitousek et al., 1997). The inputs are expected to increase even further in the future to meet the food demands of a growing global population (Vitousek et al., 1997).

Recognition of the detrimental effects of agricultural intensification has led to the adoption of various best management practices (BMPs) for improving water quality. However, these interventions have in many cases not led to expected improvements. For example, attempts have been made to reduce inorganic fertilizer inputs in the United Kingdom since the 1980s; however, no substantial decrease has been observed in riverine N concentrations (Howden et al., 2010). In the Susquehanna River Basin, while fertilizer application rates were constant between 1971 and 2002, the riverine nitrate load continued to increase (Van Meter et al., 2017). A recent study in the Yongan River watershed of China shows that stream N concentrations have been increasing consistently, despite reductions in inorganic fertilizer inputs since 1999 (Chen et al., 2014). Similarly, in a review study, Grimvall et al. (2000) show that despite reductions in fertilizer application since the 1990s, the majority of eastern European rivers have failed to show any reduction in riverine N loads.

We hypothesize that this apparent lack of response in stream nitrate concentrations, despite reductions in N fertilizer inputs, arises due to time lags in catchment response. Time lags, defined as the time between implementation of agricultural BMPs and improvements in stream water quality, are increasingly recognized as an important factor behind the apparent failure of BMPs (Fenton et al., 2011; Meals et al., 2010; Van Meter & Basu, 2015). Meals et al. (2010), in a data synthesis study, found that time lags can range from 5 to more than 50 years and are a function of watershed size, soil type, climate, and management practices. Van Meter and Basu (2017) have estimated watershed lag times to be between 12 and 34 years in the Grand River Watershed in southern Ontario.

Van Meter and Basu (2015) conceptualized time lags in nitrate response as the sum of two components: (a) a hydrologic time lag that arises from accumulation of dissolved nitrate in the vadose zone and groundwater reservoirs, and (b) a biogeochemical time lag that arises due to accumulation of soil organic N in the root zones of agricultural soils. While the existence of hydrologic time lags is well accepted, the recognition of biogeochemical time lags for N is relatively recent (Burt et al., 2010; Chen et al., 2014; Van Meter & Basu, 2015). In a recent study, Van Meter et al. (2016) analyzed soil core data from 61 agricultural sites across Iowa that were sampled in 1959 and again in 2007 and observed a net 14% increase in soil N (1478 ± 547 kg/ha) over a depth of 0 to 100 cm.

Despite recognition that the buildup of nutrient legacies can lead to time lags in watershed response, there has until recently been no modeling framework that can predict these time lags. Most watershed models used to predict the impact of BMPs, such as the Soil Water Assessment Tool (SWAT), can predict the magnitude of the concentration reduction that might finally be achieved, but not how long it will take to achieve that reduction. For policymakers, however, knowing the time required to meet concentration reduction goals is critical to making informed choices.

Recognizing the need to model time lags, Van Meter and Basu (2015) developed a parsimonious, travel time-based approach to quantify time lags, and this method was further refined by adding a simplified biogeochemistry module in Van Meter et al. (2016). The model thus developed, also referred to as the ELEMeNT model (Exploration of Long-Term Nutrient Trajectories), was able to describe time lags at the outlet of the Mississippi River Basin (MRB, Van Meter et al., 2018). However, while the ELEMeNT model can capture long-term trends in soil and stream N, it is not designed to simulate complex agroecosystem dynamics, including the variations in crop type and nutrient management that are routinely handled in agricultural models like SWAT. Accordingly, the present study has two primary objectives: (1) to modify the existing SWAT model to capture time lags in watershed response and (2) to use the modified model to quantify N time lags under various management scenarios. Specifically, we focused on two aspects of SWAT modification: (i) refining the ability of SWAT to describe N accumulation in soils that leads to the creation of biogeochemical legacy and time lags, and (ii) coupling SWAT with a travel time distribution (TTD) model to capture time lags due to slow groundwater flow pathways.

2 Modeling Framework: Development of SWAT-LAG to Represent Time Lags

We developed a new version of SWAT, SWAT-LAG, by integrating a TTD model into the existing SWAT model. The new SWAT-LAG model accounts for time lags arising from legacy nutrient accumulation in both soil and groundwater reservoirs. In the following sections, we provide background on the SWAT model, the TTD model, and the model coupling.

2.1 SWAT Model: Background and Limitations

SWAT is a watershed-scale, process-based, continuous simulation model that works at a daily time step. In this model, a watershed is divided into sub-basins that are further subdivided into Hydrologic Response Units (HRUs), characterized by unique soil, slope, land use, and management attributes. Each HRU is vertically discretized into three compartments, the soil profile (0–2 m), the shallow aquifer (2–20 m), and the deep aquifer (>20 m) (Narula & Gosain, 2013). Runoff and nutrients generated by each HRU are aggregated at the top of the reach specific to the sub-basin and then routed through that reach toward the watershed outlet.

One of the most critical model components for adequate representation of nutrient fluxes is the carbon-nitrogen (C-N) cycling module within the soil profile. The most commonly used SWAT formulation for C-N cycling (hereafter referred to as SWAT) tracks five different N pools in the soil: two inorganic N pools (ammonia and nitrate) and three organic N pools (fresh, stable, and active). Nitrogen dynamics are driven by water, temperature, soil moisture conditions, and plant uptake. It has been argued that this formulation overlooks several key factors controlling organic carbon dynamics in the soil, including the movement of organic carbon with water and the loss of organic C through soil erosion (Zhang et al., 2013). Recognizing this limitation, Zhang et al. (2013) integrated more rigorous C-N cycling equations into the SWAT framework, adapted from a more complex field-scale agroecosystem model, the CENTURY model. The modified model, hereafter referred to as SWAT-M, is able to accurately capture greenhouse gas emissions from agricultural sites under different soil, climate, and management practices (Zhang et al., 2013).

The other major limitation of the SWAT model is its inability to accurately describe nitrate transport through the subsurface (Bouraoui et al., 2005; Ekanayake & Davie, 2005). While SWAT considers a lag time in modeling lateral flow and groundwater flow, the magnitude of this lag time is generally on the order of a few days; it represents the recession behavior of the hydrograph but not the slow groundwater pathways that have been implicated in longer-term delays in nitrate response (McDonnell & Beven, 2014; Van Meter et al., 2016). In other words, while the parameters GW_DELAY, LAT_TTIME, and SURLAG provide information on the celerity of water in the watershed, they do not describe the velocity of water (McDonnell & Beven, 2014). Thus, while the default SWAT model can predict the final streamflow nitrate concentration many years after land use change (after the system has reached a steady state), it provides no information on the time required to achieve that change in concentration, information that is critical for policymakers and managers.

2.2 SWAT-LAG: SWAT Coupled With a TTD Model

To address the issues identified in the above section, we have developed the SWAT-LAG model by coupling the SWAT-M model (Zhang et al., 2013) with a TTD model, which includes simulation of N transport through slower groundwater pathways. The new SWAT-LAG model describes the groundwater mass loading Jb,lagged(t) (M/T) at the watershed outlet as a function of the groundwater load through the baseflow pathway simulated in SWAT-M, Jb(t), and the TTD f(τ) as

Jb,lagged(t) = ∫₀ᵗ Jb(t − τ) f(τ) dτ    (1)
The modeled nitrate flux at the watershed outlet J(t) is then calculated as the sum of the nitrate flux through surface flow (Jsurfaceflow), lateral flow (Jlateralflow), tile flow (Jtileflow), and lagged baseflow (Jb,lagged) (estimated from equation 1, Figure 1). The outlet nitrate flux is calibrated against the measured stream nitrate data using the Dynamically Dimensioned Search (DDS) algorithm (Tolson & Shoemaker, 2007) in the OSTRICH calibration tool (Matott, 2017). In the present study, the coupling of SWAT and the TTD model was done externally using MATLAB (MATLAB, 2018): for each calibration run, MATLAB extracted Jb(t) from the SWAT-M output files and estimated Jb,lagged(t) and J(t) (Figure S5 in the supporting information). It is important to note that while the default SWAT model parameters GW_DELAY, LAT_TTIME, and SURLAG describe the surface, lateral flow, and groundwater lags, these lags are on the order of days and describe only the celerity of the water. The lag times used in our model are much longer, varying from 36 days to >50 years. Modifications to the SWAT-M source code to capture nitrogen legacies are described in supporting information section S.3 (Zhang et al., 2013).
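The convolution in equation 1 is straightforward to discretize. The sketch below is illustrative only (not the authors' MATLAB coupling code): it convolves a baseflow nitrate load series with a discretized TTD, here an exponential distribution with a 13-year mean chosen to mirror the distribution fitted for the study watershed.

```python
import numpy as np

def lag_baseflow_load(j_b, ttd):
    """Discrete form of equation 1: Jb,lagged(t) = sum_{k<=t} Jb(t-k) f(k).
    j_b : baseflow N load per time step (here, yearly steps)
    ttd : discretized travel time distribution f(tau), normalized to sum to 1
    Returns the lagged load truncated to the simulation window."""
    return np.convolve(j_b, ttd)[:len(j_b)]

# illustrative exponential TTD with a 13-year mean travel time
tau = np.arange(120)                 # yearly bins
f = np.exp(-tau / 13.0)
f /= f.sum()                         # normalize the discrete distribution

# for a constant unit load, the lagged output rises toward the input
j_lagged = lag_baseflow_load(np.ones(120), f)
```

For a constant input, the lagged output is the cumulative TTD, which illustrates why stream response lags input reductions: the outlet continues to receive older, pre-change water for decades.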

Figure 1. Conceptual modeling framework showing the coupling of SWAT-M with the travel time distribution (TTD) model to create the SWAT-LAG model. SWAT = Soil Water Assessment Tool.

3 Materials and Methods

3.1 Study Site

The South Fork Iowa River Watershed (SFIRW), a 502 km2 predominantly agricultural watershed that is a part of the Iowa-Cedar basin in the Midwestern United States (Figure 2), was used in the present study to evaluate the role of legacy nitrogen on time lags in watershed response. The South Fork is representative of typical watersheds in the Midwestern United States that contribute significant N loads to the Mississippi River and thus was a good candidate for our analysis. The mean annual rainfall over this watershed is 83 cm, and the mean monthly temperature varies between −7.7 and 23.4 °C. The watershed lies in the Des Moines Lobe in Iowa, which is characterized by poorly drained soils with low relief and is drained by subsurface tile drains (Green et al., 2006; Tomer et al., 2008). The dominant soil type is Clarion (fine loam), which makes up 73.8% of the watershed. Land use in the watershed is dominated by row-crop agriculture (86% of the area under corn-soybean rotations), followed by urban (6.59%), pasture (4.09%), and forest (2.47%).

Figure 2. Site map showing the location of the South Fork Iowa River Watershed (SFIRW) and the current land use (depicting crop rotations) obtained by overlaying 9-year (2004–2012) Crop Data Layers (CDLs) in ArcGIS.

3.2 SWAT Input Data Sets and Model Parameters

Streamflow and nitrate concentration data from 1996 to 2015 were obtained for the gauging station ID 05454300 at the outlet of SFIRW from U. S. Geological Survey (2016). Streamflow data were available at the daily timescale, while N concentration data were available at a monthly frequency. We used the Weighted Regression on Time, Discharge, and Season (WRTDS) model to estimate the monthly stream nitrate flux from the sparse data, and these monthly values were used for model calibration and validation (Corsi et al., 2015; Hirsch et al., 2010). Monthly stream nitrate fluxes simulated using WRTDS corresponded well with the observed nitrate flux values, with a percent bias (PBIAS) of 2%. The watershed has only one weather station, for which daily rainfall and temperature data were obtained from the U. S. National Weather Service (2018), United States Environmental Protection Agency (2013), and Iowa Environmental Mesonet (2018) for the period 1950 to 2016. SWAT's weather generator was used to simulate solar radiation, relative humidity, and wind speed. Evapotranspiration was estimated using the Hargreaves Potential Evapotranspiration method (Nair et al., 2011). Soil characteristics were obtained from SWAT's State Soil Geographic (STATSGO) database. We defined five slope classes as 0–0.5%, 0.5–1.45%, 1.45–2%, 2–3.5%, and >3.5%. The soil, land use, and slope layers were intersected to create 395 unique HRUs with distinct attributes. We used the STATSGO database instead of the finer-resolution Soil Survey Geographic (SSURGO) database since increasing the number of HRUs beyond 395 would significantly increase the computational burden of the model. SWAT models that use the SSURGO database often reduce the number of HRUs by focusing on the most dominant HRUs and neglecting the contribution of the others.
In our model, we needed a spatially distributed grid of HRUs to couple with the grid-scale travel times, and thus the dominant HRU option was not feasible. A brief review of studies that compared nitrate flux estimates from STATSGO versus SSURGO inputs revealed mixed conclusions—while Chaplot (2005) observed improved predictability when using SSURGO data, Geza and McCray (2008) and Bhandari et al. (2018) did not observe significant differences between STATSGO and SSURGO predictions.

Since we did not have specific data regarding the locations of tile drains, we assumed that all agricultural HRUs with slopes less than 2% had tiles (Green et al., 2014). This assumption led to 58% of the watershed area (67% of the cropped area) being tile drained. Depth to subsurface drain (DDRAIN), time to drain soil to field capacity (TDRAIN), and drain tile lag time (GDRAIN) were set to 1,000 mm, 48 hr, and 96 hr, respectively, based on Green et al. (2006) and Nair et al. (2011). Initial soil N and C concentrations in 1949 were estimated at the HRU scale based on the regression equations developed by Van Meter et al. (2016), in which soil C and N are described as a function of soil type (% sand, silt, and clay). These calculations led to an area-weighted average initial organic N mass estimate of 16,347 kg/ha across the SFIRW, which is close to the range of 16,744–17,485 kg/ha estimated by Van Meter et al. (2016).

Annual crop yield data were estimated using the methods outlined in supporting information section S1 (Gassman, 2008; Hong et al., 2013; USDA-Agricultural Census, 2012; USDA-Agricultural Survey, 2012). Mineral fertilizer application rates, estimated using fertilizer sales data (Alexander & Smith, 1990; U. S. Geological Survey [USGS], 2012), varied between 2 and 211 kg·ha−1·year−1 across different years; the estimation procedure is detailed in supporting information section S1 (Alexander & Smith, 1990; Sawyer, 2015; USGS, 2012). Manure application rates were estimated using the animal data in the USDA-Agricultural Census (2012) and the nitrogen in animal excretion information from Hong et al. (2011; supporting information section S1). The watershed has a large number of Concentrated Animal Feeding Operation (CAFO) lots (Tomer et al., 2008), and manure produced in these CAFOs is known to be one of the major sources of fertilizer (McCarthy et al., 2012). Typical manure N application rates varied between 37 and 105 kg·ha−1·year−1. HRUs under corn, oats, and hay received N fertilizer (inorganic and organic), while soybean and alfalfa HRUs were supplied with P2O5 at rates of 45 and 73 kg·ha−1·year−1, respectively, based on Iowa State University's recommendation guide (Mallarino et al., 2013).

3.3 Estimation of TTD

The TTD, f(τ), was estimated using the geographic information system approach that was proposed by Schilling and Wolter (2007) and Basu et al. (2012) and that has been shown to adequately capture groundwater travel times in Iowa landscapes. In this methodology, the travel time τ corresponding to each point in the landscape is described by

τ = n L / (K i)    (2)

where L is the flow path length from each cell in the landscape to the nearest stream (m), K is the hydraulic conductivity (m/s), i is the hydraulic gradient, and n is the aquifer porosity. The primary assumption of this approach is that the water table follows the topography (Basu et al., 2012; Schilling & Wolter, 2007), and thus the hydraulic gradient can be approximated as the ratio of the elevation difference (as estimated from the digital elevation map) and the flow path distance L between the cell of interest and the nearest outlet.

Using the hydrology toolset in ArcGIS 10.4.1, we filled the 30-m Digital Elevation Model (DEM) (U. S. Geological Survey, 2013) to derive the flow direction and flow accumulation rasters. The stream network was created by applying a stream initiation threshold of 100 acres to the flow accumulation raster. Using the flow direction and stream network rasters, the flow path length (L) (routed along the flow direction network toward the stream) was computed for each 30-m cell using the Flow Length tool. The hydraulic gradient was estimated using the DEM and the stream network. Saturated hydraulic conductivity (K) was extracted from SWAT's soil database. Aquifer porosity was assumed to be equal to 0.3, following Schilling and Wolter (2007) and Basu et al. (2012). The average linear velocity was estimated for each 30-m cell in the watershed using the Darcy Velocity tool, with K, i, and n as inputs. Using the L and average linear velocity rasters, we computed the travel time for each 30-m cell in the watershed. The mean travel time τ of each HRU was obtained by averaging the travel times of all cells inside that HRU.
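The per-cell calculation mirrors the Darcy Velocity and Flow Length steps and can be sketched outside of ArcGIS. The values below are hypothetical (a short near-stream flow path and a long upland one), used only to illustrate the τ = nL/(Ki) computation with the assumed porosity of 0.3.

```python
import numpy as np

def cell_travel_time(L, K, i, n=0.3):
    """Groundwater travel time per cell, tau = n * L / (K * i).
    L : flow path length to the nearest stream (m)
    K : saturated hydraulic conductivity (m/s)
    i : hydraulic gradient (elevation drop / flow path length)
    n : aquifer porosity (0.3 assumed, after Schilling & Wolter, 2007)"""
    v = K * i / n          # Darcy average linear velocity (m/s)
    return L / v           # travel time (s)

# hypothetical two-cell example: near-stream cell vs. upland cell
L = np.array([100.0, 3000.0])    # flow path lengths (m)
dz = np.array([1.0, 30.0])       # elevation drop to the stream (m)
K = np.array([1e-4, 1e-4])       # hydraulic conductivity (m/s)

tau_s = cell_travel_time(L, K, dz / L, n=0.3)
tau_yr = tau_s / (365.25 * 24 * 3600)   # convert seconds to years
```

With these inputs the near-stream cell has a travel time of about a year and the upland cell a few decades, spanning the same order of magnitude as the range reported for the SFIRW.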

3.4 Model Calibration, Validation, and Uncertainty Estimation

The modified SWAT model (SWAT-LAG) was calibrated to simulate monthly discharge and nitrate loads, and annual crop yield. Parameters sensitive to hydrology, nitrate flux, and crop yield were selected based on extensive literature review (Baumgart, 2005; Faramarzi et al., 2009; Green et al., 2006; Hu et al., 2007; Jha et al., 2007; Kannan et al., 2007; Le, 2015; Nair et al., 2011) and a one-at-a-time sensitivity analysis that was done manually. Fifty-two parameters were selected for calibration, and suitable upper and lower bounds were fixed based on literature values (Tables 1 and S2). The model was calibrated from 1996 to 2008 and validated from 2009 to 2015 for streamflow and nitrate flux (Figure S2). Crop yields were calibrated from 1950 to 2008 and validated from 2009 to 2015.

Table 1. Hydrology and Nitrate Flux Calibration Variables, Descriptions, Ranges, and Final Calibrated Parameter Values
Variables Description Range Calibrated values
CN2_1 Runoff curve number 1 59–73.7 61.6
CN2_2 Runoff curve number 2 66–82.5 73.9
CN2_3 Runoff curve number 3 69–86.3 85.2
CN2_4 Runoff curve number 4 77–96.3 84.9
CN2_5 Runoff curve number 5 78–97.5 83.1
CN2_6 Runoff curve number 6 86–100 94.6
CN2_7 Runoff curve number 7 92–100 96.4
ESCO Soil evaporation compensation coefficient 0.8–1.0 0.92
DEP_IMP Depth to impervious layer, mm 3,250–3,650 3502
GWQMN Threshold water content in shallow aquifer before groundwater can flow, mm 75–175 104
REVAPMN Threshold depth of water in shallow aquifer for REVAP (movement of water from shallow aquifer into the overlying unsaturated zone) to occur, mm 75–175 105
CNCOEFF Plant ET curve number coefficient 0.1–0.3 0.21
Nitrate flux
HLIFE_NGW Half-life of nitrate in groundwater, day 5–75 71
BIOMIX Biological mixing efficiency 0.18–0.22 0.18
CDN Soil denitrification rate coefficient 0.1–1 0.53
NPERCO Nitrogen percolation coefficient 0.2–1 0.95
  • Note. (i) CN2_1 to CN2_7: The study area has seven different types of curve numbers that were treated as individual calibration parameters in OSTRICH calibration tool, and (ii) crop calibration parameters are provided in Table S2.
We used the OSTRICH calibration tool, a model-independent, flexible optimization program with a multi-objective calibration environment, for model calibration and validation. Multi-objective calibration was performed by assigning seven calibration targets (discharge, nitrate flux, corn yield, soybean yield, oats yield, alfalfa yield, and hay yield) and using the DDS algorithm in OSTRICH (Matott, 2017). The Kling-Gupta efficiency (KGE) and PBIAS were selected to evaluate the calibration statistics. We aggregated the KGE values of all seven calibration targets to obtain the overall objective function, assigning equal weights to all calibration targets. We also constrained the PBIAS of each of the seven calibration targets to be within ±10%. Acceptance criteria for KGE were formulated by synthesizing 11 studies, as explained in supporting information section S.2 (Formetta et al., 2014; Hoch et al., 2017; Hublart et al., 2015; Kuentz et al., 2013; Pechlivanidis et al., 2010; Pechlivanidis & Arheimer, 2015; Rajib et al., 2016; Revilla-Romero et al., 2015; Thiemig et al., 2013; Trautmann, 2016; Yang et al., 2016).
KGE = 1 − √[(r − 1)² + (α − 1)² + (β − 1)²]

where r is the linear correlation coefficient between the simulated and measured time series; α = σs/σm, where σs and σm are the standard deviations of the simulated and measured series; and β = μs/μm, where μs and μm are the means of the simulated and measured time series.

PBIAS = 100 × Σ(Qm − Qs) / ΣQm

where Qm and Qs are the measured and simulated data, respectively.
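The two metrics are simple to compute from paired time series. The following is a generic sketch of KGE and PBIAS as defined in the text, not the OSTRICH implementation; note the sign convention, under which a positive PBIAS indicates model underestimation.

```python
import numpy as np

def kge(sim, obs):
    """Kling-Gupta efficiency:
    KGE = 1 - sqrt((r-1)^2 + (alpha-1)^2 + (beta-1)^2),
    with alpha = sigma_s/sigma_m and beta = mu_s/mu_m.
    A perfect fit gives KGE = 1."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    r = np.corrcoef(sim, obs)[0, 1]          # linear correlation
    alpha = np.std(sim) / np.std(obs)        # variability ratio
    beta = np.mean(sim) / np.mean(obs)       # bias ratio
    return 1.0 - np.sqrt((r - 1) ** 2 + (alpha - 1) ** 2 + (beta - 1) ** 2)

def pbias(sim, obs):
    """Percent bias: 100 * sum(obs - sim) / sum(obs).
    Positive values indicate underestimation by the model."""
    sim, obs = np.asarray(sim, float), np.asarray(obs, float)
    return 100.0 * np.sum(obs - sim) / np.sum(obs)
```

A calibration target would be accepted here when its PBIAS falls within ±10%, consistent with the constraint described above.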

We estimated the uncertainty in model predictions as a function of both parameter uncertainty and input uncertainty. Parameter uncertainty was considered by using all 250 parameter sets estimated by the DDS algorithm. In addition, we considered input uncertainty due to rainfall, fertilizer application rates, and travel time: we varied these three inputs by ±25% and created 281 scenarios for the estimation of input uncertainty. The input uncertainty was considered in conjunction with the parameter uncertainty to estimate the 95% prediction uncertainty (95PPU; Beven & Binley, 1992; Beven & Freer, 2001; Beven, 2006; Abbaspour, 2012) at a monthly time step.
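The 95PPU band amounts to taking per-time-step percentiles across the pooled ensemble of runs. A minimal sketch, with synthetic data standing in for the combined parameter-set and input-scenario runs:

```python
import numpy as np

def ppu95(ensemble):
    """95% prediction uncertainty band: the 2.5th and 97.5th percentiles
    of the ensemble at each time step.
    ensemble : 2-D array, shape (n_runs, n_timesteps)"""
    lower = np.percentile(ensemble, 2.5, axis=0)
    upper = np.percentile(ensemble, 97.5, axis=0)
    return lower, upper

# synthetic ensemble standing in for the 250 parameter sets plus
# 281 input-uncertainty scenarios (531 runs, 12 monthly values)
rng = np.random.default_rng(0)
ens = rng.normal(loc=10.0, scale=2.0, size=(531, 12))
lo, hi = ppu95(ens)
```

An observed series falling largely inside the [lo, hi] band would indicate that the combined parameter and input uncertainty brackets the measurements.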

3.5 Developing Temporally Varying Land Use Maps (Crop Rotations) in SWAT

As the final set of input parameters, we developed time-varying land use maps for the study area. One of the challenges in simulating a long trajectory of land use change and nitrogen accumulation in SWAT is the appropriate representation of land use change and crop rotation patterns over time. Capturing these trajectories is important, as the exact sequence of land use change and rotation impacts legacy accumulation and depletion. To effectively simulate crop rotations and land use change in SWAT, we used two different methodologies, depending on data availability across the simulation period: only agricultural census data were available before 2004, while more detailed crop data layers were available for the later period, as described below.

For the period 1950–2003, we used county (a subdivision of states in the United States) scale information on annual cropped areas from USDA-Agricultural Survey (2012) in conjunction with USDA-Agricultural Census (2012) data to estimate the annual area under each crop (Figure S1). Given the difficulty in reassigning HRUs continuously in time as a function of land use change, we split the time period into two time-blocks (1950–1960, 1961–2003) and maintained the land use and crop rotation constant within each time-block. The time-blocks were chosen based on land use trends, as indicated by the survey data. Specifically, there was a dramatic shift in land use between 1961 and 1964, with large increases in the area under soybean and decreases in the area under oats and hay. Average crop acreages within each time-block were calculated as the mean of the time series of the cropped area within that block (Figure S1). The rotation history for each block was estimated based on prior knowledge regarding crop rotations in the Midwest (Anderson, 2005; Bruns, 2012). The resulting crop rotations and land use data are presented in Table S1.

The methodology followed for the 2004–2012 period is based on Srinivasan et al. (2010) and Teshager et al. (2016) and involved using CDLs from the National Agricultural Statistics Service (USDA-CDL, 2012). These CDL layers were intersected with the watershed boundary and classified to identify crop rotations. For example, if a CDL cell had corn from 2004 to 2012, it was considered to be continuous corn. Based on this analysis, we found the following rotation types in our study area: continuous corn (CC), 20.2%; soybean-corn (SC), 33.12%; corn-soybean (CS), 28.46%; corn-corn-soybean (CCS), 2.54%; soybean-corn-corn (SCC), 2.06%; and continuous alfalfa, 0.14%. Other land use types included forest (2.47%), pasture (4.09%), urban (6.59%), waterbody (0.1%), and wetland (0.23%) (Table S1). Since corn and soybean yields have been increasing since the 1950s due to the introduction of hybrid crop varieties, we used three different corn and soybean crop varieties across the three time-blocks to replicate the observed yield trends.
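The CDL-based classification reduces to pattern-matching each cell's nine-year crop sequence against candidate rotations. A simplified, hypothetical sketch (handling only the CC, SC, and CS cases for brevity; the full analysis also distinguishes CCS, SCC, and continuous alfalfa):

```python
def classify_rotation(crops):
    """Classify a cell's 2004-2012 crop sequence into a rotation label.
    crops : sequence of per-year crop codes, 'C' (corn) or 'S' (soybean)."""
    # continuous corn: the same crop every year
    if all(c == 'C' for c in crops):
        return 'CC'
    # two-year alternation starting with soybean (SC) or corn (CS)
    if all(c == ('S' if k % 2 == 0 else 'C') for k, c in enumerate(crops)):
        return 'SC'
    if all(c == ('C' if k % 2 == 0 else 'S') for k, c in enumerate(crops)):
        return 'CS'
    return 'OTHER'   # longer rotations (e.g., CCS, SCC) handled separately

# a cell planted corn-soybean alternately over the nine CDL years
label = classify_rotation('CSCSCSCSC')
```

Applying this rule cell by cell, and tallying the labels by area, yields the rotation percentages reported above.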

3.6 Model Runs and Scenario Formulation

The effect of model conceptualization on time lags was evaluated by comparing the default SWAT model with two modifications of SWAT that explicitly consider lag times in the landscape. The first modification, SWAT-M, involves the use of modified C-N cycling equations to capture the accumulation of soil organic N. In the second modification, SWAT-LAG, we coupled SWAT-M with the TTD model to capture hydrologic and biogeochemical time lags. We compared all three model versions (SWAT, SWAT-M, and SWAT-LAG) to evaluate differences in performance.

For future projections, we used only the SWAT-LAG model, simulating four nutrient management scenarios and three land use change scenarios over 84 years (2017–2100). Daily rainfall and temperature data corresponding to an average rainfall year during the simulation period (1950–2016) were used to simulate the climate, following Muenich et al. (2016). We chose not to use future climate scenarios, as our goal was to isolate the effects of landscape legacies and time lags. For the business-as-usual (BAU) scenario, the land use and management practices from 2004 to 2016 were maintained from 2017 to 2100. We considered four nutrient management scenarios in which land use was kept the same as BAU, but the fertilizer application on corn HRUs was reduced by 25% (NM1), 50% (NM2), 75% (NM3), and 100% (NM4). We also considered three land use change scenarios, in which agricultural land was converted to land under switchgrass production. These scenarios were based on the U. S. Renewable Fuel Standard mandate that sets a goal of producing more than half of biofuel from cellulose-based sources (primarily switchgrass and miscanthus) by 2022. We used Shawnee switchgrass (an upland cultivar), adopted the plant growth parameters outlined in Cibin et al. (2010) and Trybula et al. (2014), and applied a fertilization rate of 30 kg·ha−1·year−1 to maintain switchgrass yields. The three scenarios included planting switchgrass in all agricultural HRUs (LU1), only in agricultural HRUs with slopes greater than 0.5% (LU2), and only in agricultural HRUs with slopes greater than 1.45% (LU3).

4 Results and Discussion

4.1 Groundwater TTD for the South Fork Iowa Watershed

Modeled groundwater travel times in the South Fork Iowa watershed ranged from 36 days in some areas of the watershed to more than 50 years in others (Figure 3a). Approximately 8% of the watershed has a travel time > 50 years, corresponding to an area of poorly drained clayey soils. Travel times are much smaller (< 10 years) around the stream network, where the soil is sandier and has a greater hydraulic conductivity. The distribution of travel times was described well by an exponential distribution (Figure 3b) with a mean travel time of 13 years, which is similar to the mean travel time of 10 years estimated for the Walnut Creek Watershed in northern Iowa (Basu et al., 2012; Schilling & Wolter, 2007). The TTD obtained from this analysis was convolved with the nitrate fluxes generated in SWAT-M to produce lagged nitrate concentrations at the catchment outlet.

Figure 3. Groundwater travel time (a) map and (b) histogram for the South Fork Iowa River Watershed (SFIRW).
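The convolution step described above can be sketched as follows. This is an illustrative reimplementation, not the SWAT-LAG code: it assumes annual time steps, an exponential TTD truncated at 100 years, and a hypothetical step-change input series.

```python
import numpy as np

def lagged_output(annual_flux, mean_tt_years=13.0, horizon=100):
    """Convolve an annual nitrate flux series with an exponential
    travel time distribution (TTD) to obtain the lagged flux at the outlet.

    annual_flux   : sequence of annual nitrate fluxes leaving the root zone
    mean_tt_years : mean groundwater travel time (13 years for the SFIRW)
    horizon       : number of years at which the TTD is truncated
    """
    tau = np.arange(horizon)
    # Discretized exponential TTD, p(tau) ~ exp(-tau/T), renormalized so it sums to 1
    ttd = np.exp(-tau / mean_tt_years)
    ttd /= ttd.sum()
    # Full convolution, truncated to the length of the input series
    return np.convolve(annual_flux, ttd)[: len(annual_flux)]

# A hypothetical step reduction in leaching decays only gradually at the outlet:
flux = np.array([100.0] * 30 + [0.0] * 30)  # inputs cut to zero in year 30
out = lagged_output(flux)                   # outlet flux remains elevated for decades
```

The legacy effect appears directly: even though the input drops to zero in year 30, the convolved outlet flux declines only at the exponential rate set by the mean travel time.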

4.2 Model Calibration and Validation

We calibrated the three model versions (SWAT, SWAT-M, and SWAT-LAG) and found that the calibrated parameters were not significantly different with respect to current and past predictions of crop yield, discharge, and nitrate flux. The similarity among the three approaches most likely arises because the 20-year water-quality time series is too short to capture legacy effects. Since the calibrated parameters were not significantly different, to ensure consistency across model versions we used the calibrated parameters from SWAT-LAG in SWAT and SWAT-M when evaluating the goodness-of-fit metrics for streamflow, nitrate flux, and crop yield. All three model versions performed adequately, with KGE values for streamflow and nitrate flux ranging from 0.45 to 0.77 and PBIAS varying from −11.2% to 20.2% (Figure S2 and Table 2). PBIAS values for crop yield were also good, ranging from −7.6% to 14.9% (Table 3). Despite this similarity in predicting current loads, the models differed in their ability to predict future scenarios and time lags, as discussed in section 4.5. The final calibrated parameters are presented in Table 1 and supporting information Table S2.

Table 2. Monthly Calibration (1996–2008) and Validation Statistics (2009–2015) for Streamflow and Nitrate Flux for Three Model Versions (SWAT, SWAT-M, and SWAT-LAG)
Model | Variable | KGE | KGE performance | PBIAS (%) | PBIAS performance
SWAT | Streamflow | 0.61 (0.63) | Moderate (Moderate) | 20.2 (8.8) | (Good)
SWAT | Nitrate flux | 0.68 (0.45) | Good (Moderate) | −6.5 (−7.8) | Very good (Very good)
SWAT-M | Streamflow | 0.72 (0.68) | Good (Good) | 5.8 (−4.2) | Good (Good)
SWAT-M | Nitrate flux | 0.72 (0.62) | Good (Moderate) | −11.2 (−9.1) | Good (Very good)
SWAT-LAG | Streamflow | 0.72 (0.68) | Good (Good) | 5.8 (−4.2) | Good (Good)
SWAT-LAG | Nitrate flux | 0.77 (0.58) | Good (Moderate) | 9.7 (8.0) | Very good (Very good)
  • Note. Values in parentheses correspond to the validation period. (i) KGE performance criteria were based on an extensive literature review (refer to section S2). (ii) PBIAS performance was based on Moriasi et al. (2015). SWAT = Soil Water Assessment Tool; KGE = Kling-Gupta efficiency; PBIAS = percent bias.
Table 3. Annual Calibration (1950–2008) and Validation Statistics (2009–2015) for Crop Yields for Three Model Versions (SWAT, SWAT-M, and SWAT-LAG)
Corn yield 4.7 (14.9) 3.1 (8.0)
Soybean yield 5.6 (2.8) 5.0 (1.4)
Alfalfa yield −7.4 (−) −7.4 (−)
Oats yield 0.7 (−) −0.0 (−)
Other hay yield −4.1 (−) −7.6 (−)
  • Note. PBIAS <±15% is considered to be good based on Srinivasan et al. (2010). PBIAS = percent bias; SWAT = Soil Water Assessment Tool.

4.3 Nitrogen Stores and Fluxes Over Time

The SWAT-LAG model was used to explore how N stores and fluxes have changed over the last 67 years (1950–2016) in the South Fork Iowa Watershed. During this period, N fertilizer application rates and N fixation rates increased steadily from 1960 to 1970 and then stabilized (Figure 4a). This was accompanied by a concomitant increase in crop N uptake rates, leading to an approximately constant N surplus since the 1980s (Figures 4a and S3). Nitrogen surplus in agricultural landscapes is often used as a metric of the efficiency of agricultural practices: an N surplus of zero indicates that the N added to the landscape has been fully utilized by the crops. Our analysis indicates 4-year average N surplus values ranging between 39 and 122 kg·ha−1·year−1 from 1950 to 2016. The magnitude of the N surplus is significant considering that it amounts to 79–101% of the sum of fertilizer and manure additions (which varied from 43 to 143 kg·ha−1·year−1) to the watershed over this period.
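The N surplus used throughout this analysis follows the mass balance defined in the abstract. A minimal sketch, using illustrative (not measured) component values for a single year:

```python
def n_surplus(deposition, fertilizer, manure, fixation, crop_uptake):
    """Watershed N surplus (kg ha-1 yr-1):
    N surplus = N deposition + fertilizer + manure + N fixation - crop N uptake.
    A surplus of zero means the added N was fully utilized by the crops."""
    return deposition + fertilizer + manure + fixation - crop_uptake

# Hypothetical component values (kg ha-1 yr-1), chosen only for illustration;
# the resulting surplus falls within the 39-122 range reported in the text.
surplus = n_surplus(deposition=8, fertilizer=110, manure=20,
                    fixation=60, crop_uptake=120)
```

Any surplus not exported or denitrified in a given year accumulates as legacy N in the soil or groundwater stores.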

Figure 4. (a) Temporal trends in nitrogen surplus and its components (N deposition, fertilizer, manure, fixation, and crop N output), (b) trends in riverine output and denitrification and in soil and groundwater N accumulation, and (c) watershed-scale cumulative mass balance from 1950 to 2016 across the SFIRW. Note that each stacked bar and data point represents a 4-year averaged value. SFIRW = South Fork Iowa River Watershed.

A fraction of this N surplus is denitrified and lost as riverine N, while the remainder, which we hereafter refer to as legacy N, is stored in soil and groundwater (Howden et al., 2011; Van Meter et al., 2016; Worrall et al., 2015). The exact magnitudes of the denitrification flux are challenging to estimate at the landscape scale (David et al., 2010, 2008), making it difficult to quantify the magnitudes of legacy stores. In our modeling study, we find that the magnitudes of riverine N and the denitrification flux increased from 1950 through the 1980s, concurrent with increases in the N surplus (Figure 4b). Nitrogen accumulation in soil and groundwater has also been increasing over this period. While the rate of accumulation of the biogeochemical legacy (soil organic N) appears to have plateaued over the last decade, the hydrologic legacy continues to increase (Figure 4b).

On a cumulative basis, over the 67 years (1950–2016) simulated in this study, the total N surplus over the South Fork Iowa River watershed was 6,181 kg/ha (92 kg·ha−1·year−1), of which 2,111 kg/ha (34%) was denitrified, 1,688 kg/ha (27%) was lost as riverine output, 1,588 kg/ha (25%) accumulated in the root zone, and 859 kg/ha (14%) accumulated in groundwater (Figures 4c and 4d). Our estimate that 27% of the total N surplus was exported as riverine output is very close to the 25% estimated by Boyer et al. (2002). Our value for soil N accumulation (1,588 kg/ha) fell within the range of values (931–2025 kg/ha) reported by Van Meter et al. (2016), who synthesized soil core data at multiple sites in Iowa. Our value for groundwater N accumulation (859 kg/ha) was close to the groundwater nitrogen accumulations reported for the Susquehanna River basin (390–655 kg/ha) and of the same order of magnitude as those for the Thames River basin (503–950 kg/ha) (Worrall et al., 2015). It is interesting to note that the overall magnitude of accumulation of the biogeochemical legacy is greater than that of the hydrologic legacy.

4.4 Spatial Patterns in Soil Nitrogen Accumulation

We next evaluated the spatial patterns of soil nitrogen accumulation and how topography and land use characteristics contributed to accumulation. Initial soil organic nitrogen (SON) levels in the watershed followed a predictable pattern, with higher SON levels along the river network and lower values in the uplands (Figure 5a). Patterns of accumulation (Figure 5b), however, were found to vary as a function not just of landscape position but of numerous factors, including cropping patterns, soil type, and slope. To explore this further, we explicitly compared the effects of cropping choices (continuous corn and corn-soybean rotation), soil type (Canisteo with 26% clay and Clarion with 21% clay), and topography (<3% and >3% slope) on soil N accumulation. As land use varied over our study period, we selected the last decade (2004–2016), during which land use remained constant, for this analysis.

Figure 5. Soil organic N in the SFIRW: (a) initial SON (1950), (b) SON accumulation rates between 1950 and 2016, and (c) SON accumulation rates between 2004 and 2016 as a function of soil type (clayey Canisteo soil and sandy Clarion soil), crop rotation (continuous corn and corn-soybean), and slope. SON = soil organic nitrogen; SFIRW = South Fork Iowa River Watershed.

We found that under the continuous corn (CC) rotation, clayey Canisteo soils accumulated 4 (< 3% slope) and 6 kg·ha−1·year−1 (> 3% slope) more SON than sandy Clarion soils (Figure 5c). Similarly, under the corn-soybean (CS) rotation, clayey soils accumulated 3 kg·ha−1·year−1 (for both slope categories) more than sandy soils. The greater accumulation of organic nitrogen in clayey soils is consistent with existing literature that reports greater organic content and slower organic matter decomposition rates in clays (Legg, 2017; McLauchlan, 2006; Six et al., 2002). We also found that land under CC accumulated 7 (< 3% slope category) and 8 kg·ha−1·year−1 (> 3% slope category) more SON than land under a CS rotation in clayey soils, and 6 (< 3% slope category) and 5 kg·ha−1·year−1 (> 3% slope category) more than land under CS in sandy soils (Figure 5c). This pattern is consistent with the findings of Varvel (1994) and Jagadamma et al. (2007), where a CC rotation led to a greater increase in soil total nitrogen concentrations. It most likely arises from (i) higher fertilizer application rates in CC, (ii) the higher C/N ratio of corn residue (DeJong Hughes & Coulter, 2009; Jagadamma et al., 2007), which is more resistant to microbial decomposition, and (iii) greater residue inputs under a CC rotation than under a CS rotation (DeJong Hughes & Coulter, 2009; Varvel, 1994). Finally, we found that flatter areas of the landscape (slope < 3%) accumulate more organic nitrogen, possibly because lower slopes lead to less N loss to the stream.

4.5 Time Lags in Watershed Response

There was no significant difference in performance between the three models for current and past simulations (Tables 2 and 3). However, even though baseline performance does not differ among the model versions, getting the time lags right is important when the models are used to predict future water quality under changed land management, and policymakers need to know these time lags to set realistic policy goals. Indeed, when a management change (a 100% reduction in fertilizer application rate, NM4) was implemented for the forecast period (2017–2100), the three models responded differently (Figure 6). The original SWAT model responded by predicting a 46% reduction in nitrate load within 1 year of the change (Figure 6a), while the N loads generated by the SWAT-M and SWAT-LAG models remained high for multiple decades. The lag times to achieving a 50% N load reduction were 1 year for SWAT, 4 years for SWAT-M, and 19 years for SWAT-LAG (Figure 6b), while the times to achieving a 70% N load reduction were 2 years for SWAT, 22 years for SWAT-M, and 62 years for SWAT-LAG. Regardless of the different lag times, however, all models predicted an approximately 75% nitrate load reduction by the year 2100 (84 years after the change, Figure 6b). Significant uncertainty lies in the estimation of these time lags, given the uncertainty in parameters such as the travel time, as well as in inputs such as fertilizer application rates. This is captured in the 95PPU band around the predicted trajectory (Figure 6c), which shows, for example, that in 2025 the N load predicted by SWAT-LAG could vary between 277 and 1,354 tons/year, with a median value of 827 tons/year.
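Extracting a lag time from a simulated trajectory amounts to finding the first year in which the load reduction crosses a threshold. A minimal sketch, using a hypothetical trajectory (not model output) that mimics a lagged response:

```python
def years_to_reduction(years, pct_reduction, target_pct):
    """Return the first simulation year in which the percent N load
    reduction reaches target_pct, or None if it is never reached.

    years         : sequence of calendar years
    pct_reduction : percent load reduction relative to BAU, same length
    """
    for yr, pct in zip(years, pct_reduction):
        if pct >= target_pct:
            return yr
    return None

# Hypothetical load-reduction trajectory (percent relative to BAU):
yrs = list(range(2017, 2027))
red = [5, 12, 20, 28, 36, 43, 50, 56, 61, 66]
first = years_to_reduction(yrs, red, 50)  # first year reaching a 50% reduction
```

The lag times quoted above (e.g., 19 years for SWAT-LAG to reach a 50% reduction) were obtained from the actual simulated trajectories in exactly this threshold-crossing sense.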

Figure 6. (a) Stream N load and (b) N load reduction as a function of time, simulated by SWAT, SWAT-M, and SWAT-LAG for NM4 (100% fertilizer reduction in corn HRUs). (c) Uncertainty in the stream N load predicted by SWAT-LAG, as captured by the 95% prediction band (95PPU). The stream nitrate load reduction trajectory was obtained by subtracting the N load for the NM4 scenario from that of the BAU scenario. Longer lag times are observed for SWAT-LAG and SWAT-M than for SWAT. SWAT = Soil Water Assessment Tool; BAU = business as usual; HRUs = Hydrologic Response Units.

SWAT showed the greatest load reduction soon after the management change, as the C-N module in SWAT is not sophisticated enough to capture how soil organic matter responds to changes in management practices. The addition of more realistic organic matter dynamics in SWAT-M leads to a longer lag time than that obtained with SWAT; this is what has been referred to earlier as the biogeochemical time lag (Hamilton, 2011; Van Meter et al., 2016). Finally, the greatest lag time is observed in SWAT-LAG, due to the additional consideration of nutrient buildup in the groundwater reservoir; this is the hydrologic time lag referred to in section 1 (Meals et al., 2010; Schilling et al., 2008). The time lag predicted by SWAT-LAG, which includes both the hydrologic and the biogeochemical lag, is critical for predicting changes in water quality after changes in management, making it possible for policymakers to make appropriate decisions and to manage expectations among stakeholders.

4.6 Effects of Land Management on Nitrogen Time Lags

We next used the modified SWAT-LAG model to evaluate the effects of different land use and land management choices on the time required to see reductions in N loads at the catchment outlet. In the first set of scenarios (NM1, NM2, NM3, and NM4), fertilizer application on all corn HRUs was reduced by 25%, 50%, 75%, and 100%, respectively, corresponding to 26%, 44%, 63%, and 80% decreases in the N surplus at the watershed scale. These reductions lowered the stream nitrate load in all cases; however, the time lags between input reductions and subsequent load reductions at the outlet differed across the fertilizer reduction scenarios (Figure 7a). The relationship between reductions in fertilizer application rates and N load reductions was linear, but the slope of the relationship increased over time, indicating that for a more extreme scenario (NM4), waiting longer yields proportionally greater benefits than for a less extreme scenario such as NM2 (Figure 7b). For the 25% fertilizer reduction scenario, the N load reduction is ~14% in 2025 and 23% in 2050. For the 100% fertilizer reduction scenario, the N load reduction is ~45% in 2025 and 70% in 2050. A 40% reduction in N load can be achieved within 7 years of implementing a 100% reduction in fertilizer application, whereas if one is willing to wait 30 years, the same load reduction can be achieved with only a 50% reduction in fertilizer application rates.

Figure 7. Effect of land use and land management on time lags: (a) percent N load reduction trajectories for four management scenarios in which nitrogen fertilizer application rates on corn HRUs were reduced by 25% (NM1), 50% (NM2), 75% (NM3), and 100% (NM4); (b) percent N load reduction as a function of percent fertilizer reduction in 2025 and 2050; (c) percent N load reduction trajectories for three land management scenarios in which 100% (LU1), 74% (LU2), and 61% (LU3) of agricultural lands were planted with switchgrass while the remaining row crop HRUs followed BAU; and (d) percent N load reduction as a function of percent land use change in 2025 and 2050. HRUs = Hydrologic Response Units; BAU = business as usual.

In the second set of scenarios, switchgrass was planted in the agricultural HRUs, thus reducing both the N surplus (LU1, 86%; LU2, 52%; and LU3, 37%) and N loads at the catchment outlet (Figures 7c and 7d). Much greater N reductions were evident under the switchgrass scenarios, as switchgrass is actively harvested and removed from the system, reducing the amount of residue in the field that would otherwise serve as a source of mineralizable N. While corn was also harvested, corn yields were poor under the extreme fertilizer reduction scenarios, decreasing the crop's effectiveness as an N sink. Under LU1, when all agricultural HRUs were planted with switchgrass, a 67% N load reduction was achieved by 2025, and an 81% reduction by 2050 (Figure 7d). As can be seen in the figure, the relationship between percent land use change and N load reduction is linear, and the slope increases over time.

The above two paragraphs highlight fundamental tradeoffs between the time required to achieve a given N load reduction and the magnitude of the N surplus reduction. For example, the Watershed Nutrient Task Force has set a goal of reducing the 5-year average area of the hypoxic zone in the Gulf to less than 5,000 km2 by 2035, which corresponds to a 60% N load reduction for the Mississippi River Watershed (Scavia et al., 2017). Our results indicate that in the SFIRW it would theoretically be possible to achieve the target loads by 2035 with a 100% reduction in N fertilizer (red line in Figure 8a) or with the conversion of 86% of land under row crops to switchgrass (Figure 8b). However, if the target year were extended by another 32 years to 2067, less drastic reductions in fertilizer application (75%) or changes in land use (77%) would suffice. In other words, if we are willing to wait longer to achieve water-quality goals, less aggressive measures to improve nutrient management can be pursued. Conversely, the more we reduce nutrient inputs now, the faster we can reduce nitrate concentrations to desired levels.
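The tradeoff reading of Figure 8 can be sketched as a simple interpolation: given contour points relating the magnitude of a measure to the years needed to hit a load-reduction goal, one can estimate the measure required for any target horizon. The contour points below are hypothetical placeholders, chosen only to echo the 18 to 50 year range quoted in the Figure 8 caption; they are not values read from the actual figure.

```python
import numpy as np

def required_measure(target_years, measure_pct, lag_years):
    """Linearly interpolate the magnitude of a measure (e.g., percent
    fertilizer reduction) needed to reach the load-reduction goal within
    target_years, from (measure, lag) pairs read off contours like Figure 8.
    """
    # np.interp needs the x-coordinates ascending; larger measures act
    # faster, so sort the pairs by lag time first.
    order = np.argsort(lag_years)
    return float(np.interp(target_years,
                           np.array(lag_years)[order],
                           np.array(measure_pct)[order]))

# Hypothetical contour points: percent fertilizer reduction vs. years
# needed to reach a 60% load reduction (illustrative only).
measure = [100, 75]
lag = [18, 50]
need_soon = required_measure(18, measure, lag)   # aggressive, fast target
need_late = required_measure(50, measure, lag)   # relaxed, distant target
```

With real contour data extracted from the SWAT-LAG runs, the same interpolation would quantify the cost-time tradeoff for any policy deadline.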

Figure 8. Cost-time tradeoffs in achieving reductions in N loads as a function of (a) reduction in fertilizer application and (b) conversion of land under row crop agriculture to switchgrass. The contour lines represent percent reductions in N loading at the watershed outlet. The red and black arrows in each panel show that it may take between 18 and 50 years to achieve a 60% reduction in N loading, depending on the magnitude of the fertilizer reduction (a) or of the land use conversion (b).

5 Conclusion

The buildup of nitrogen legacies in agricultural landscapes has been linked to time lags between changes in land management and measurable improvements in stream water quality (Chen et al., 2015; Van Meter & Basu, 2015; Van Meter et al., 2016, 2017). However, current watershed models do not explicitly consider the accumulation and depletion of legacies and the corresponding time lags in water-quality improvement (Van Meter et al., 2016, 2017). Here we modified the water-quality model SWAT to capture time lags in water-quality response that arise from the accumulation of legacy nitrogen. The new SWAT-LAG includes (1) a modified carbon-nitrogen cycling module to capture the accumulation and depletion of soil organic N, and (2) a groundwater TTD module to capture the dynamics of the groundwater nitrate store. While the focus of this paper is on the modification of the SWAT model, a similar approach could be used to modify any spatially distributed watershed model to account for legacies.

We used as a case study a 502-km2 agricultural watershed in north-central Iowa and ran the model from 1950 to 2016 to capture the buildup of legacy nitrogen in the landscape. Our results show that over the 67 years simulated in the study, the cumulative N surplus in the landscape was 6,181 kg/ha, of which 2,111 kg/ha (34%) was denitrified and 1,688 kg/ha (27%) was lost as riverine output, while 1,588 kg/ha (25%) accumulated in the root zone and 859 kg/ha (14%) accumulated in the groundwater as legacy N. The SWAT-LAG model was able to describe the time lag in the landscape response: a 100% reduction in fertilizer application led to a 79% reduction in stream N load, but this reduction took 84 years to achieve in SWAT-LAG, compared to only 2 years in the original SWAT model. Varying land use and land management practices affected both the final N load reduction that could be achieved and the time taken to achieve it. It is thus important to recognize the trade-offs between the costs incurred for a particular land use change and the water-quality benefits achieved. Larger changes in land use, or greater implementation of new management practices, come at greater cost but can lead to faster achievement of water-quality benefits. Of course, other routes to faster achievement of water-quality goals may exist. More specifically, it has been shown that when implementation of conservation measures is spatially targeted to areas with faster travel times to the catchment outlet, stream concentrations are reduced more quickly (Van Meter & Basu, 2015). Alternatively, end-of-pipe solutions, such as restoration of riverine wetlands that can intercept legacy nitrate loads before they enter the stream, could be used to directly reduce legacy-related loads via denitrification.

It is important to note that there is significant uncertainty in the actual magnitude of the lag times estimated in our model, given the lack of long-term data sets for model validation. While nitrogen surplus information is available for longer periods of time, water-quality data are often available for <20 years. In a recent paper, Van Meter et al. (2018) modeled water quality in the Mississippi River basin (MRB) over the past 100 years and used sediment core information, preserved in the Gulf, for model validation. The water-quality data available at the outlet of the MRB span 40 years and clearly demonstrate the effect of legacy stores: inputs have been decreasing since the 1970s, yet water quality has remained stable, consistent with the presence of legacy stores (Van Meter et al., 2018). A similar effect was also observed at the Wapello outlet of the Iowa Cedar River basin (Figure S4). Unfortunately, such long-term data sets are not commonly available. Furthermore, even when such data sets are available, equifinality in model predictions makes it difficult to identify lag times from model fits alone. Often the information content of the output time series is insufficient to distinguish between models that consider lag times and models that do not. It should be noted, however, that the groundwater travel times assumed in the current model are based on calculated travel times for a similar Iowa watershed (Basu et al., 2012).

Despite such uncertainties, lag time models like SWAT-LAG must be developed to provide policymakers and regulators with realistic time frames for the recovery of water quality. To constrain such models, one needs to use ancillary data sets, such as soil nitrogen accumulation and sediment records. Given the lack of such data, uncertainty in model predictions is driven by uncertainty in the estimated groundwater travel times and reaction rates. Future work will thus include using tracers to constrain groundwater TTDs, as well as using ancillary data sets such as sediment cores for model validation.

Data Availability

Data used are publicly available and are cited in the references. Also, they are accessible through links provided below: Agricultural Census data were retrieved from https://www.agcensus.usda.gov/Publications/2012/. Agricultural Survey data were retrieved from https://quickstats.nass.usda.gov/?source_desc=CENSUS/. Crop data layers were retrieved from https://nassgeodata.gmu.edu/CropScape/. Digital elevation model data were retrieved from http://ned.usgs.gov/. Stream water discharge and water-quality data were retrieved from http://waterdata.usgs.gov/nwis/. County scale N fertilizer sales data were retrieved from http://water.usgs.gov/GIS/dsdl/sir2012-5207_county_fertilizer.zip.


Acknowledgments

This work was supported, in part, by funds from the National Science Foundation Coupled Natural and Human Systems program, grant 1114978. Financial support for the present work was also provided from the startup funds of N. B. Basu at the University of Waterloo. Additional funds were provided by an NSERC Discovery Grant and an Early Researcher Award to N. B. Basu.