Volume 35, Issue 17

Combined surface solar brightening and increasing greenhouse effect support recent intensification of the global land-based hydrological cycle

Martin Wild
Institute for Atmospheric and Climate Science, ETH Zurich, Zurich, Switzerland

Jürgen Grieser
Risk Management Solutions, London, UK

Christoph Schär
Institute for Atmospheric and Climate Science, ETH Zurich, Zurich, Switzerland

First published: 11 September 2008


[1] The surface net radiation (surface radiation balance) is the key driver behind the global hydrological cycle. Here we present a first-order trend estimate for the 15-year period 1986–2000, which suggests that surface net radiation over land has rapidly increased by about 2 Wm−2 per decade, after several decades with no evidence for an increase. This recent increase is caused by increases in both downward solar radiation (due to a more transparent atmosphere) and downward thermal radiation (due to enhanced concentrations of atmospheric greenhouse gases). The positive trend in surface net radiation is consistent with the observed increase in land precipitation (3.5 mm y−1 per decade between 1986 and 2000) and the associated intensification of the land-based hydrological cycle. The concurrent changes in surface net radiation and hydrological cycle were particularly pronounced in the recovery phase following the Mount Pinatubo volcanic eruption, but remain evident even when discarding the Pinatubo-affected years.

1. Introduction

[2] Knowledge of variations in the intensity of the global hydrological cycle is still limited, despite its obvious environmental and societal importance [e.g., Ramanathan et al., 2001; Ohmura and Wild, 2002; Liepert et al., 2004]. On global scales, the principal driver of the hydrological cycle is the radiative energy available at the Earth's surface (here referred to as surface net radiation), which is determined by the sum of the absorbed solar and net thermal radiative exchanges. In the global mean, the distribution of radiative energy in the climate system yields a positive radiation balance at the surface and a negative balance in the atmosphere. This is compensated by fluxes of latent heat and, to a lesser extent, sensible heat from the surface into the atmosphere. Latent heat is the energy equivalent of evaporation, which equals precipitation in the global annual mean. Variations in the surface radiation balance therefore induce changes in precipitation and thereby govern the intensity of the global hydrological cycle. Thus, knowledge of changes in the surface radiation balance can improve our understanding of the variations in the intensity of the hydrological cycle. In the following, we derive first-order estimates of changes in the individual components of the radiation balance over land surfaces from the mid-1980s to 2000, to provide more insight into recent changes in the radiative forcing of the hydrological cycle. This period is particularly interesting, as it covers a phase where observations indicate a widespread increase in surface solar radiation, known also as “surface solar brightening” [Wild et al., 2005, 2007]. The surface radiative forcing and hydrology during the preceding decades 1960s to 1980s, where in contrast a widespread decline of surface solar radiation was observed [Gilgen et al., 1998; Stanhill and Cohen, 2001; Liepert, 2002], was the focus of an earlier study [Wild et al., 2004].

2. Datasets

[3] Widespread direct measurements of radiation reaching the land surface started in the 1960s. Many of these observations are stored in a database of worldwide measured surface energy fluxes, the Global Energy Balance Archive (GEBA) [Gilgen et al., 1998]. This database currently contains 450,000 surface energy flux entries in the form of monthly means from more than 2500 observation sites.

[4] The highest-quality radiation measurements, from currently 35 sites in different climate zones, are available from the Baseline Surface Radiation Network (BSRN) [Ohmura et al., 1998]. BSRN data, however, do not cover the entire period under consideration here, since the earliest sites started only in the 1990s.

[5] A new dataset for precipitation over land surfaces has been compiled at the Global Precipitation Climatology Centre (GPCC) of the German Meteorological Service (DWD) [Grieser and Beck, 2006]. This dataset, based on 9343 extended station records, is specially designed for the analysis of decadal and interannual variations, since only quality-checked long-term and homogeneity-tested precipitation records were included.

3. Results

3.1. Surface Radiation and Hydrological Cycle in the “Brightening” Period 1986–2000

[6] The 15-year period 1986–2000 is characterized by a levelling off or reversal of the former decline in surface solar radiation at many observation sites [Wild et al., 2005]. In addition, a significant radiative forcing occurred in 1991–1993 due to the volcanic eruption of Mount Pinatubo, which strongly affected the radiative fluxes for a couple of years [e.g., Wielicki et al., 2002; Trenberth and Dai, 2007]. In the present section we estimate the changes in surface net radiation excluding Pinatubo influences (see the first and second columns of Table 1). Section 3.2 will later address the Pinatubo influences (third column of Table 1).

Table 1. Estimated Trends in Energy and Water Fluxes Over Land Surfaces During the "Brightening Period" 1986–2000a

                                                        Units      1986–2000  1996–2000 vs. 1986–1990b  1992–2000

Surface Energy Fluxes
a) Change in downward solar radiation                   Wm−2 y−1   +0.22      +0.23                     +0.66
b) Inferred change in absorbed surface solar radiation  Wm−2 y−1   +0.17      +0.17                     +0.49
c) Change in downward thermal radiation                 Wm−2 y−1   +0.21      +0.21                     +0.26
d) Change in upward thermal radiation                   Wm−2 y−1   −0.17      −0.15                     −0.20
e) Change in surface net radiation: b + c + d           Wm−2 y−1   +0.21      +0.23                     +0.55
f) Change in ground heat flux                           Wm−2 y−1   −0.001     −0.001                    −0.001
g) Change in melt                                       Wm−2 y−1   −0.02      −0.02                     −0.02
h) Inferred change in turbulent fluxes: e − f − g       Wm−2 y−1   +0.19      +0.21                     +0.53
i) Corresponding change in annual evaporationc          mm y−1     +2.4       +2.6                      +6.6

Water Fluxes
j) Change in annual land precipitation from GPCC        mm y−1     +3.5       +3.2                      +9.1
k) Corresponding change in annual evaporationd          mm y−1     +2.4       +2.2                      +6.1
l) Equivalent latent heat flux change                   Wm−2 y−1   +0.19      +0.17                     +0.5

  • a The estimates are determined as linear trends during the period 1986–2000 after discarding the Pinatubo-affected years 1991–1993 (first column), and based on differences between the 1996–2000 and 1986–1990 periods (second column). The third column shows the linear trends in the Pinatubo-affected period 1992–2000 for comparison. Energy gain for the surface is signed positive.
  • b For comparison, the differences between the two periods are expressed here as trends (absolute values are given in the text).
  • c Assuming unchanged sensible heat flux.
  • d Assuming a constant precipitation/runoff ratio.

[7] We analysed these changes in two alternative ways: First, we calculated linear trends over the period 1986–2000, discarding the Pinatubo-affected years 1991–1993. Second, we compared averages over two five-year periods representing the late 1980s (1986–1990) and late 1990s (1996–2000).
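As a sketch, the two estimation approaches can be reproduced on a synthetic series; the function names and the example data below are illustrative, not from the paper (the period centres 1988 and 1998 lie 10 years apart, so a period difference divided by 10 is comparable to a per-year trend):

```python
import numpy as np

def trend_excluding(years, values, excluded=(1991, 1992, 1993)):
    """Least-squares linear trend (units per year), skipping excluded years."""
    keep = ~np.isin(years, excluded)
    return np.polyfit(years[keep], values[keep], 1)[0]

def period_difference_as_trend(years, values, p1=(1986, 1990), p2=(1996, 2000)):
    """Difference between two 5-year means, expressed as an equivalent trend
    (the period centres, 1988 and 1998, are 10 years apart)."""
    m1 = values[(years >= p1[0]) & (years <= p1[1])].mean()
    m2 = values[(years >= p2[0]) & (years <= p2[1])].mean()
    return (m2 - m1) / 10.0

# Synthetic series rising by exactly 0.22 W m-2 per year: both methods agree.
years = np.arange(1986, 2001)
swd = 152.0 + 0.22 * (years - 1986)
print(round(trend_excluding(years, swd), 2))             # -> 0.22
print(round(period_difference_as_trend(years, swd), 2))  # -> 0.22
```

On real station data the two estimates differ slightly, as the 0.22 versus 0.23 Wm−2 y−1 entries of Table 1 show.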

[8] The surface radiation balance consists of the absorbed solar radiation on the one hand, and the difference between upward and downward thermal radiation on the other hand. In the following we estimate changes in each of these components to derive the overall change in surface net radiation. To estimate changes in downward surface solar radiation (SWD) between the 1986–1990 and 1996–2000 periods, information was available from 365 GEBA sites distributed worldwide. The average over all station data within 1986–1990 is 152.3 Wm−2, while the data within 1996–2000 average 154.6 Wm−2, an increase of 2.3 Wm−2 over this period. For comparison with the other table entries, this is expressed as a trend of 0.23 Wm−2 y−1 in Table 1 (second column). Linear trends for the period 1986–2000, excluding 1991–1993, were determined at 332 stations and show an average increase of 0.22 Wm−2 y−1 (Table 1, first column). The corresponding increase in absorbed rather than downward surface solar radiation is then 1.7 Wm−2 from 1986–1990 to 1996–2000, or 0.17 Wm−2 y−1, assuming a satellite-derived surface albedo of 25% over land. Alternatively, one can first construct a composite time series from the 332 stations and then determine its linear slope. This results in slightly stronger increases of 0.26 Wm−2 y−1 for the downward and 0.20 Wm−2 y−1 for the absorbed solar radiation, respectively.
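The step from downward to absorbed solar radiation is a single scaling by (1 − albedo); a minimal sketch using the 25% satellite-derived land albedo quoted above:

```python
# Absorbed solar radiation change from a downward (SWD) change, using the
# satellite-derived land-mean albedo of 25% quoted in the text.
ALBEDO = 0.25

def absorbed(swd_change_wm2):
    return (1.0 - ALBEDO) * swd_change_wm2

print(absorbed(2.3))   # 1.7 W m-2 (rounded) between 1986-1990 and 1996-2000
print(absorbed(0.22))  # ~0.17 W m-2 per year, matching row b of Table 1
```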

[9] The estimates from the GEBA database may suffer from the station distribution, which is biased towards higher latitudes. Estimates of low-latitude changes can be deduced from the non-scanner Earth Radiation Budget Satellite (ERBS) instrument, the only continuous broadband satellite measurement over the period considered [Wielicki et al., 2002; Wong et al., 2006]. From their publications we estimate a decrease in top-of-atmosphere reflected solar radiation of 2.5 Wm−2 between 1986–1990 and 1996–2000 in the Tropics, which by itself would induce a proportional increase in SWD close to 2 Wm−2 [Wild et al., 1997]. This is comparable to the GEBA estimates above. Other estimates using satellite products to determine surface fluxes suggest an increase of similar magnitude (e.g., Hatzianastassiou et al. [2005], with an increase in SWD of 0.24 Wm−2 y−1 from 1984 to 2002). We therefore consider an increase in SWD slightly above 0.2 Wm−2 y−1 as the best estimate over the period 1986–2000 (Table 1).

[10] Observations of downward thermal radiation (LWD) are available only since the early 1990s. Using the ERA40 reanalysis [Uppala et al., 2005] we determined an increase in LWD of 0.21 Wm−2 y−1 during 1986–2000, or 2.1 Wm−2 from 1986–1990 to 1996–2000. ERA40 currently provides the best estimate of the atmospheric temperature and humidity structure, which is essential for an accurate determination of LWD. In addition, ERA40 incorporates an advanced radiation scheme (RRTM) [Clough et al., 2005] that was shown to simulate LWD very accurately and to remove some of the long-standing biases found in many GCMs [Wild and Roeckner, 2006]. An adequate representation of column water vapor and clear-sky thermal radiation variations over land in ERA40 was also noted by Allan [2007], and a good reproduction of 2 m-temperature trends by Simmons et al. [2004] for the period under consideration here. Both studies thus further support the use of LWD estimates from ERA40 for the present purpose. Prata [2008] estimated a slightly lower increase in LWD of 0.17 Wm−2 y−1 over the earlier period 1964–1990, based on comprehensive radiative transfer calculations and temperature and humidity profiles from radiosondes. A slightly higher estimate (0.26 Wm−2 y−1), based on surface observations for the later period 1992–2000, is given in section 3.2. The gradual enhancement of the increase in LWD suggested by these three estimates, from +0.17 Wm−2 y−1 (1964–1990) to +0.21 Wm−2 y−1 (1986–2000) to +0.26 Wm−2 y−1 (1992–2000), is in line with the expectations from greenhouse theory and transient model simulations [Wild et al., 1997]. The three consistent and independent estimates therefore support a best estimate of LWD changes near 0.2 Wm−2 y−1 during 1986–2000 (Table 1).

[11] The upward component of surface thermal radiation can readily be determined by the Stefan-Boltzmann law using observed 2-meter temperatures, which well approximate surface temperature changes on the timescales considered here. Using the 2-meter temperature dataset provided by the Climate Research Unit (CRU) [Mitchell and Jones, 2005], we estimated a linear enhancement in upward thermal radiation of −0.17 Wm−2 y−1 over the period 1986–2000 (Table 1; negative sign corresponds to enhanced energy loss of the surface), or an enhancement of −1.5 Wm−2 from 1986–1990 to 1996–2000.
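As a consistency check, the linearized Stefan-Boltzmann law relates the quoted emission trend to the underlying surface warming; the 287 K land-mean temperature and unit emissivity below are illustrative assumptions, not values given in the paper:

```python
# d(sigma*T^4)/dT = 4*sigma*T^3: sensitivity of upward thermal emission to a
# surface warming dT, for an assumed land-mean temperature of 287 K and an
# emissivity of 1 (both illustrative choices, not from the paper).
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W m-2 K-4

def lwu_sensitivity(T):
    return 4.0 * SIGMA * T**3  # W m-2 per K of warming

print(round(lwu_sensitivity(287.0), 1))         # about 5.4 W m-2 per K
print(round(0.17 / lwu_sensitivity(287.0), 3))  # implied warming, K per year
```

An emission trend of 0.17 Wm−2 y−1 thus corresponds to roughly 0.3 K of surface warming per decade, a plausible land value for 1986–2000.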

[12] Thus, the overall change in surface net radiation from 1986–1990 to 1996–2000 adds up to +2.3 Wm−2, or to a linear trend of +0.21 Wm−2 y−1 during 1986–2000 (Table 1). To satisfy the surface energy balance, this increase in available radiative energy at the surface has to be redistributed within the non-radiative components of the surface energy balance, i.e., the turbulent fluxes of latent and sensible heat, the ground heat flux and melt. The total change in the latter two terms is small over global land surfaces, on the order of 0.02 Wm−2 y−1 over the period considered here [Ohmura, 2004]. This leaves about 2.1 Wm−2 of additional energy for the turbulent fluxes in 1996–2000 compared to 1986–1990, or an additional 0.19 Wm−2 y−1 during 1986–2000 (Table 1). With respect to the partitioning of this additional energy into latent and sensible heat flux, a decrease in the Bowen ratio (ratio between sensible and latent heat flux) with increasing temperature was determined by Ohmura [1984], thereby favoring changes of the latent over the sensible heat flux in a warming climate. This is in line with various climate model experiments for greenhouse scenarios, which show no increase in sensible heat fluxes with warming, but consistently a slight decrease [e.g., Gutowski et al., 1991; Wild et al., 1997; Roeckner et al., 1999; Liepert et al., 2004]. On the other hand, the recent reduction of absorbing aerosols [Streets et al., 2006] and the associated surface warming and atmospheric cooling may have helped to counteract this decrease, as can be inferred, e.g., from the GCM experiments in Roeckner et al. [2006]. In any case, the changes in sensible heat fluxes are generally small compared to those in the latent heat flux, in accord with their smaller absolute values, and are further constrained by the compensating factors outlined above; they may therefore be neglected.
This suggests that the majority of the additional radiative energy available at the surface determined above has been balanced by a corresponding change in latent heat flux of 0.19 or 0.21 Wm−2 y−1, respectively, implying an increase of annual evaporation by 2.4 or 2.6 mm y−1 (Table 1, first and second columns).
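The step from a latent-heat-flux change to an evaporation change (rows h to i of Table 1) is a unit conversion via the latent heat of vaporization; Lv = 2.5e6 J/kg below is a standard round value assumed here, not stated in the paper:

```python
# 1 mm of evaporated water per m2 is 1 kg, and evaporating it takes Lv joules;
# a sustained latent-heat-flux change (W m-2) therefore maps onto mm of
# evaporation per year.
LV = 2.5e6                          # latent heat of vaporization, J kg-1 (assumed)
SECONDS_PER_YEAR = 365 * 24 * 3600

def evap_mm_per_year(latent_flux_wm2):
    return latent_flux_wm2 * SECONDS_PER_YEAR / LV

print(round(evap_mm_per_year(0.19), 1))  # -> 2.4 mm y-1 (Table 1, column 1)
print(round(evap_mm_per_year(0.21), 1))  # -> 2.6 mm y-1 (Table 1, column 2)
```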

[13] Such an increase in evaporation should have implications for the intensity of the hydrological cycle. Figure 1 shows land-mean precipitation over the period considered according to the homogeneity-tested GPCC precipitation dataset. From this dataset, we estimated a linear increase in annual land precipitation of 3.5 mm y−1 over the period 1986–2000, excluding the Pinatubo years 1991–1993, or an increase of 32 mm between 1986–1990 and 1996–2000, expressed in Table 1 as a trend of 3.2 mm y−1. This increase in land precipitation can stem either from corresponding increases in land evaporation or from increased moist-air advection from ocean to land [Schär et al., 1999]. To account for the latter we assume, as a first-order approximation, that the fraction of precipitation that goes into runoff remained constant over our investigation period. This is equivalent to the assumption that the relative portions of precipitation originating from large-scale moisture convergence over the global land surfaces and from land evaporation, respectively, have not changed substantially. We judge this the most plausible hypothesis considering the lack of related observational information [Vörösmarty, 2002] and the global scale of our analysis. The hypothesis is also supported by model simulations, which show near-constant ratios between land precipitation, evaporation and runoff in climate change experiments. Taking a best estimate of the fraction of land precipitation that runs off to be 33%, the land evaporation change amounts to 67% of the land precipitation change, corresponding to trends in annual evaporation of 2.4 and 2.2 mm y−1, respectively.
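Under this constant-runoff-fraction assumption, the precipitation-to-evaporation step is again a single scaling (67% of precipitation, as in the text); small differences from Table 1 reflect rounding of the input trends:

```python
# Partition a land-precipitation change: 33% runs off, 67% re-evaporates,
# following the constant-runoff-fraction assumption of the text.
EVAP_FRACTION = 0.67

def evap_from_precip(precip_change_mm):
    return EVAP_FRACTION * precip_change_mm

print(round(evap_from_precip(3.5), 2))  # 2.35, vs. +2.4 in Table 1 (row k)
print(round(evap_from_precip(3.2), 2))  # 2.14, vs. +2.2 in Table 1
```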

Figure 1. Observed annual mean precipitation over global land surfaces, obtained from the homogenized GPCC dataset specially designed for the analysis of temporal changes in precipitation data [Grieser and Beck, 2006].

[14] This evaporation change inferred from observed precipitation is surprisingly similar in magnitude to the evaporation change estimated above from independent surface energy balance considerations (see Table 1). The close match between the changes in the hydrological cycle and the driving surface radiation balance suggests that these changes are not purely artifacts, but are probably realistic in their magnitude. It further indicates that the recent concurrence of surface solar "brightening" and an increased thermal greenhouse effect may have jointly, and in similar measure, contributed to the intensification of the hydrological cycle since the mid-1980s.

[15] In contrast, in prior decades (1960s to 1980s), the surface net radiation showed no evidence for an increase, as we estimated in a previous study [Wild et al., 2004]. In this period, the greenhouse-induced increase in the downward thermal radiation appears to have been outweighed by the decreasing surface solar radiation (“global dimming”), thereby suppressing an intensification of the hydrological cycle. This is in line with observational evidence, which suggests a decrease rather than an increase in precipitation over this period.

3.2. Surface Radiation and Hydrological Cycle in the Post-Pinatubo 1990s

[16] We now focus on the most recent period, beginning in 1992, when BSRN radiation data started to become available. This period is interesting as it shows a marked increase in global land precipitation, from 746 mm in 1992 almost linearly to 832 mm in 1999 (Figure 1). The corresponding linear trend over the 1990s amounts to +9.1 mm y−1 [Grieser and Beck, 2006]. The hydrological cycle in the early part of this period was strongly affected by the volcanic eruption of Mount Pinatubo, as shown by Trenberth and Dai [2007]. Assuming again a constant runoff fraction of precipitation, the above precipitation increase would imply an increase in annual land surface evaporation of 6.1 mm y−1, equivalent to an increase in latent heat flux of 0.5 Wm−2 y−1 (Table 1, third column). Therefore, the surface net radiation, which was strongly influenced by the radiative forcing of the Pinatubo eruption, would be expected to have increased by a similar magnitude over this period. As in the previous section, we estimate the changes in the individual components of the surface net radiation also for the post-Pinatubo phase.
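Chaining the same two assumptions (a 33% runoff fraction, and an assumed Lv = 2.5e6 J/kg that the paper does not state explicitly) reproduces the back-of-envelope numbers quoted here:

```python
# From the 1992-2000 GPCC precipitation trend to the implied latent-heat trend.
LV = 2.5e6                          # J kg-1, assumed standard value
SECONDS_PER_YEAR = 365 * 24 * 3600
EVAP_FRACTION = 0.67                # 1 minus the runoff fraction of 0.33

precip_trend = 9.1                                 # mm y-1 per year (GPCC)
evap_trend = EVAP_FRACTION * precip_trend          # mm y-1 per year
latent_trend = evap_trend * LV / SECONDS_PER_YEAR  # W m-2 per year
print(round(evap_trend, 1))    # -> 6.1 (Table 1, row k, third column)
print(round(latent_trend, 2))  # -> 0.48, i.e. the ~+0.5 of row l
```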

[17] Downward thermal radiation changes can now be estimated from direct measurements, which started to become available during the early 1990s from the BSRN network. For the period considered now, 1992–2000, 12 stations can be used from BSRN, covering a latitude range from polar to tropical regions. The average change, calculated as the mean over linear slopes at the stations, is 0.26 Wm−2 y−1 (Table 1). This is in line with surface thermal fluxes simulated in a transient GCM experiment [Roeckner et al., 1999], from which we derive an increase of 0.24 Wm−2 y−1 over global mean land surfaces over the period considered here, or 0.25 Wm−2 y−1 if we consider only the average over the fluxes simulated at the BSRN stations. This enhances confidence in the larger-scale representativeness of the average change at the observation sites. For the upward thermal radiation, using again the CRU surface temperature dataset and the Stefan-Boltzmann law, we estimate an increase of 0.20 Wm−2 y−1 (Table 1). The net thermal contribution to the change in surface net radiation is therefore estimated at 0.06 Wm−2 y−1, which is still small compared to the change in latent heat flux/surface net radiation of 0.5 Wm−2 y−1 estimated above from hydrological considerations. The largest contribution to the change in net radiation must therefore come from the shortwave component. Wild et al. [2005] estimated SWD to have increased by 0.66 Wm−2 y−1 over the period 1992–2001, based on 8 sites from BSRN. We deduce a very similar increase of 0.65 Wm−2 y−1 from a high-quality network of 5 remote stations maintained by NOAA [Dutton et al., 2006]. From an analysis of the diffuse and direct components at the BSRN sites, it becomes evident that this large increase in SWD is at least partly a result of the recovery from the Pinatubo forcing (C. Long and M. Wild, manuscript in preparation, 2008).
The above increase in SWD corresponds to an increase in absorbed solar radiation of 0.49 Wm−2 y−1 over land surfaces, assuming again a mean land albedo of 25%. Although the spatial representativeness of this limited number of sites might be questionable, this value is of similar magnitude to the global changes in surface absorbed solar radiation derived from satellite data for the post-Pinatubo period [see, e.g., Hatzianastassiou et al., 2005, Figure 6a]. It is also in quantitative agreement with the anomalies in top-of-atmosphere solar reflectance in the post-Pinatubo period seen by the ERBS instrument [Wong et al., 2006]. With the above changes in net shortwave radiation of 0.49 Wm−2 y−1 and net thermal radiation of 0.06 Wm−2 y−1, the change in surface net radiation adds up to +0.55 Wm−2 y−1 (Table 1, third column). This is again in notably good agreement with the change in surface net radiation/latent heat flux inferred from the independent hydrological considerations above (+0.5 Wm−2 y−1). Overall this indicates that the substantial changes in surface net radiation in the 1990s, and the related changes in the hydrological cycle, are dominated by substantial changes in surface solar radiation, which were at least partly influenced by the post-Pinatubo recovery.
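The closing sum of Table 1's third column can be checked directly from the component trends quoted above (all in W m-2 per year):

```python
# Surface net radiation trend, 1992-2000: absorbed solar plus net thermal,
# using the rounded component values quoted in the text.
absorbed_solar = 0.49        # (1 - 0.25) * 0.66, rounded as in the text
net_thermal = 0.26 - 0.20    # downward minus upward thermal trends
net_radiation = absorbed_solar + net_thermal
print(round(net_radiation, 2))  # -> 0.55, close to the hydrological +0.5
```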

4. Conclusions

[18] Observational evidence for an intensification of the hydrological cycle over land since the mid-1980s has been provided from both surface energy and water balance perspectives. Consistent estimates of first-order changes in surface radiation balance and precipitation were derived from independent observational datasets. The estimated changes suggest that the recent concurrent increase in both solar and thermal radiative heating at land surfaces has favored the intensification of the hydrological cycle since the mid-1980s. In contrast, in prior decades, surface solar dimming counteracted the thermal radiative heating and suppressed an acceleration of the hydrological cycle [Wild et al., 2004]. The increase in surface radiative energy available for the hydrological cycle over recent years may therefore have been a consequence of the increasing thermal greenhouse capacity of the atmosphere on one hand, and the increasing transparency of the atmosphere for solar radiation on the other hand. This increasing transparency is at least partly attributed to a widespread decrease of anthropogenic aerosol loadings since the mid-1980s, due to more effective air pollution measures and the collapse of the economy of the former Soviet Union [Wild et al., 2005; Streets et al., 2006]. The associated changes in direct and indirect aerosol effects [e.g., Rosenfeld, 2000; Ramanathan et al., 2001] are in line with the changes in surface radiation and hydrology presented here.

[19] Recent studies show evidence that climate models driven by estimated historical forcings simulate smaller increases in precipitation than observed [Zhang et al., 2007; Wentz et al., 2007; Allan and Soden, 2007]. The present study suggests that the recent brightening effect, which may not be fully reproduced in these simulations, could account for some of these discrepancies.

[20] The main limitation of the current study is its focus on land surfaces, in the absence of adequate data over oceans, which prevents a global assessment of the intensification of the hydrological cycle. Despite this limitation, it is encouraging to see that land-based data alone yield an internally consistent picture of recent radiation and precipitation trends. More detailed studies are envisaged to understand interactions between the variations in surface net radiation and the hydrological cycle on more regional scales.


Acknowledgments

[21] We gratefully acknowledge A. Ohmura for his valuable input to this study. This study was supported by the National Center for Competence in Climate Research (NCCR Climate).