Climate Policy

MIT Joint Program Co-Director John Reilly, former U.S. Vice President Albert Gore and other experts explore extreme implications of climate change in Meltdown Earth, a video in the NowThis: Apocalypse online series. The video description reads: "Rising ocean waters, scorching temperatures, food scarcity, and disease – here's how humans could ultimately be responsible for the end of the world."  

Extreme precipitation events pose a significant threat to public safety, natural and managed resources, and the functioning of society. Changes in such high-impact, low-probability events have profound implications for decision-making, preparation and costs of mitigation and adaptation efforts. Understanding how extreme precipitation events will change in the future and enabling consistent and robust projections is therefore important for the public and policymakers as we prepare for consequences of climate change.

Projection of extreme precipitation events, however, particularly at the local scale, presents a critical challenge: the climate model-based simulations of precipitation that we currently rely on for such projections—general circulation models (GCMs)—are not very realistic, mainly due to the models’ coarse spatial resolution. This coarse resolution precludes adequate representation of highly influential, small-scale features such as moisture convection and topography. Regional circulation models (RCMs) provide much higher resolution and better representation of such features, and are thus often perceived as an optimal approach to producing more accurate heavy precipitation statistics than GCMs. However, they are much more computationally intensive, time-consuming and expensive to run.

In a previous paper, the researchers developed an algorithm that detects the occurrence of heavy precipitation events based on climate models’ well-resolved, large-scale atmospheric circulation conditions associated with those events—rather than relying on these models’ simulated precipitation. The algorithm’s results corresponded with observations with much greater precision than the model-simulated precipitation.
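The paper's actual detection scheme is not reproduced here, but the core idea—flagging heavy-precipitation days from well-resolved large-scale circulation fields rather than from simulated rainfall itself—can be sketched with synthetic data. In this toy illustration, event days are assumed to project strongly onto an idealized "event" circulation pattern; the pattern-projection detector and all numbers are stand-ins, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic daily "large-scale circulation" fields: 1000 days x 20 grid cells.
# (Stand-ins for well-resolved GCM variables such as sea-level pressure and
# column moisture; the real predictors are assumptions here.)
n_days, n_cells = 1000, 20
pattern = rng.normal(size=n_cells)       # idealized "event" circulation pattern
pattern /= np.linalg.norm(pattern)

fields = rng.normal(size=(n_days, n_cells))
event_days = rng.random(n_days) < 0.05   # ~5% of days are true heavy-rain days
fields[event_days] += 2.0 * pattern      # event days project strongly onto pattern

def detect_events(fields, pattern, threshold=1.0):
    """Flag days whose circulation projects onto the event composite
    more strongly than `threshold` (in projection units)."""
    score = fields @ pattern
    return score > threshold

detected = detect_events(fields, pattern)
hit_rate = (detected & event_days).sum() / event_days.sum()
```

The appeal of this kind of detector is that it needs only the models' well-resolved circulation variables, so it can be applied identically to GCM and RCM output.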

In this paper, the researchers show that using output from RCMs rather than GCMs for the new algorithm does not improve the precision of simulated extreme precipitation frequency. The algorithm thus presents a robust and economical way to assess extreme precipitation frequency across a broad range of GCMs and multiple climate change scenarios with minimal computational requirements.


Abstract:

Establishing a credible and effective transparency regime to support the Paris Agreement – broader than its formal ‘transparency framework’ – will be both crucial and challenging. The Agreement provides for review of achievements under national pledges (Nationally Determined Contributions, or NDCs), but much of this information will become available only well after key steps in the launch of this latest attempt to control human influence on the climate. Still, in these early years, information and understanding of individual and collective performance, and of relative national burdens under the NDCs, will play an important role in the success or failure of the Agreement. However, because of the phasing of various steps in the 5-year cycles under the Agreement and the unavoidable delays of two or more years to produce and review government reports, the Climate Convention and other intergovernmental institutions are ill-suited to carry out timely analyses of progress. Consequently, in advance of formal procedures, academic and other non-governmental groups will provide analyses based on available data and their own methodologies. The article explores this transparency challenge – using the MIT Economic Projection and Policy Analysis (EPPA) model to construct sample analyses – and considers ways that efforts outside official channels can contribute to the success of the Agreement.

Key policy insights:

  • Because key national decisions must be made before full implementation of the transparency framework, which is still being negotiated by the Ad Hoc Working Group on the Paris Agreement (APA), urgent attention is needed to activities supporting the regime’s system of pledge and review.

  • Outcomes of these APA negotiations, explored here, including features of reported NDCs and guidelines for tracking progress, will influence the effectiveness of the Agreement in encouraging greater mitigation effort.

  • Whatever the outcome of the APA negotiations, studies by academic and other non-governmental analysis groups will in the near term have a particularly strong influence on the transparency objectives of the Agreement.

  • The challenges these groups face in providing clear, coherent and credible analyses are explored, leading to recommendations for improved documentation of methods and standards of practice in analysis.

[Executive Summary: 150 kB] [Appendix C: 1 MB] [Appendix D: 650 kB] [Data Tables: 500 kB]

The MIT Emissions Prediction and Policy Analysis model is applied to an assessment of a set of cap-and-trade proposals being considered by the U.S. Congress in spring 2007. The bills specify emissions reductions to be achieved through 2050 for the standard six-gas basket of greenhouse gases. They fall into two groups: one specifies emissions reductions of 50% to 80% below 1990 levels by 2050; the other establishes a tightening target for emissions intensity and stipulates a time path for a "safety valve" limit on the emission price that approximately stabilizes U.S. emissions at the 2008 level. A set of three synthetic emissions paths is defined that spans the range of stringency of these proposals, and these "core" cases are analyzed for their consequences in terms of emissions prices, effects on energy markets, welfare cost, the potential revenue generated if allowances are auctioned, and the gains if permit revenue is used to reduce capital or labor taxes.

Initial period prices for the first group of proposals, in carbon dioxide equivalents, are estimated between $30 and $50 per ton CO2-e depending on where each falls in the 50% to 80% range, with these prices rising by a factor of four by 2050. Welfare costs are less than 0.5% at the start, rising in the most stringent case to near 2% in 2050. If allowances were auctioned these proposals could produce revenue between $100 billion and $500 billion per year depending on the case. Emissions prices for the second group, which result from the specified safety-valve path, rise from $7 to $40 over the study period, with welfare effects rising from near zero to approximately a 0.5% loss in 2050. Revenue in these proposals depends on how many allowances are freely distributed.
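The quoted trajectory—initial prices rising by a factor of four by 2050—implies a roughly constant annual growth rate, which a quick back-of-envelope calculation makes concrete. The 2012 start year below is an assumption for illustration only; the report's actual first compliance year is not restated here:

```python
# Back-of-envelope: a fourfold price rise between an assumed 2012 start
# and 2050 (the start year is an illustrative assumption, not from the text).
start_year, end_year = 2012, 2050
factor = 4.0
years = end_year - start_year
annual_growth = factor ** (1 / years) - 1   # roughly 3.7% per year

low, high = 30.0, 50.0                      # initial $/ton CO2-e range from the text
price_2050_low = low * factor               # fourfold rise on the low estimate
price_2050_high = high * factor             # fourfold rise on the high estimate
```

A growth rate in this range is consistent with the common cap-and-trade design in which banking causes allowance prices to rise at roughly the rate of interest.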

To analyze these proposals, assumptions must be made about mitigation effort abroad, and simulations are provided to illuminate terms-of-trade effects that influence the emissions prices and welfare effects, and even the environmental effectiveness, of U.S. actions. Sensitivity tests are also provided for several of the design features imposed in the "core" scenarios, including the role of banking, the specification of less than complete coverage of economic sectors, and the development of international permit trading. Also, the effects of alternative assumptions about nuclear power development are explored. Of particular importance in these simulations is the role of biofuels, and analysis is provided of the implications of these proposals for land use and agriculture.

Finally, the U.S. proposals, and the assumptions about effort elsewhere, are extended to 2100 to allow exploration of the potential role of these bills in the longer-term challenge of reducing climate change risk. Simulations using the MIT Integrated Global System Model show that the 50% to 80% targets are consistent with global goals of atmospheric stabilization at 450 to 550 ppmv CO2, but only if other nations, including the developing countries, follow suit.

Appendix D (added February 2008) [PDF: 416 kB]
Since this report was completed there has been an effort in the Senate to unify support behind common legislation. One result is the Climate Security Act (S. 2191) sponsored by Senators Lieberman and Warner. In this appendix we provide an analysis of the Act's provisions as they relate to key features governing the cap-and-trade system, comparing results with the analysis in the body of the report. The analysis does not consider other features of the bill, such as the effects of how auction revenue is used, which could affect the overall cost estimates. Some of these other features are discussed, but not quantitatively analyzed, in Section D4. Also, as noted in the body of the report, many uncertainties exist in projecting policy costs of an emissions constraint, including the rate of economic and emissions growth, the evolution of conditions abroad, the potential cost and availability of new technology, and different ways of interpreting the provisions of the legislation. The body of the report investigates the effects of varying some of these conditions, but we do not attempt in this short appendix to re-investigate the sensitivity of the results to key assumptions. Thus, the results presented here are based on one representation of the future conditions in a particular model.

Application of the MIT Emissions Prediction and Policy Analysis (EPPA) model to assessment of the future of coal under climate policy revealed the need for an improved representation of load dispatch in the model's electric sector. A new dispatching algorithm is described and the revised model is applied to an analysis of the future of coal use to 2050 and 2100 under alternative assumptions about CO2 prices, nuclear expansion and prices of natural gas. Particular attention is devoted to the potential role of coal-electric generation with CO2 capture and storage. An appendix provides a comparison of a subset of these results with and without the more detailed model of electric dispatch.
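The report describes EPPA's new algorithm in detail; as a rough illustration of the general idea of economic load dispatch, the sketch below dispatches generators in merit order of CO2-price-adjusted marginal cost, showing how a carbon price can reorder coal and gas in the stack. The fleet sizes, costs and emissions intensities are invented for illustration and are not EPPA values:

```python
def marginal_cost(fuel_cost, emis_rate, co2_price):
    # $/MWh = fuel cost + CO2 price x emissions intensity
    return fuel_cost + co2_price * emis_rate

def dispatch(units, demand, co2_price=0.0):
    """Dispatch cheapest units first until demand (MW) is met.
    units: list of (name, capacity_MW, fuel_cost_$/MWh, tCO2_per_MWh)."""
    order = sorted(units, key=lambda u: marginal_cost(u[2], u[3], co2_price))
    out, remaining = {}, demand
    for name, cap, fc, er in order:
        take = min(cap, remaining)
        out[name] = take
        remaining -= take
        if remaining <= 0:
            break
    return out

# Illustrative fleet (all numbers are rough stand-ins):
fleet = [
    ("coal", 500, 20.0, 1.0),     # cheap fuel, high emissions intensity
    ("gas", 500, 35.0, 0.4),      # dearer fuel, lower emissions intensity
    ("nuclear", 300, 10.0, 0.0),  # low marginal cost, zero emissions
]
no_price = dispatch(fleet, 900, co2_price=0.0)
with_price = dispatch(fleet, 900, co2_price=50.0)
```

At a zero CO2 price coal runs at full capacity and gas serves the residual load; at $50/tCO2 coal's adjusted cost ($70/MWh) exceeds gas's ($55/MWh) and the two swap places in the dispatch order.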

Although policymaking in response to climate change is essentially a challenge of risk management, most studies of the relation of emissions targets to desired climate outcomes are either deterministic or subject to a limited representation of the underlying uncertainties. Monte Carlo simulation, applied to the MIT Integrated Global System Model (an integrated economic and earth system model of intermediate complexity), is used to analyze the uncertain outcomes that flow from a set of century-scale emissions targets developed originally for a study by the U.S. Climate Change Science Program. Results are shown for atmospheric concentrations, radiative forcing, sea ice cover and temperature change, along with estimates of the odds of achieving particular target levels, and for the global costs of the associated mitigation policy. Comparisons with other studies of climate targets are presented as evidence of the value, in understanding the climate challenge, of more complete analysis of uncertainties in human emissions and climate system response.
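The Monte Carlo logic can be illustrated in miniature: sample an uncertain climate parameter many times, propagate each draw through a simple response relation, and read off the odds of staying under a target. The sketch below does this for equilibrium warming under an assumed stabilized forcing; the lognormal sensitivity distribution and all numbers are illustrative stand-ins, not IGSM parameters:

```python
import numpy as np

rng = np.random.default_rng(42)

# Toy Monte Carlo: propagate uncertainty in equilibrium climate sensitivity
# through a one-line forcing->warming relation and estimate the odds of
# staying under a 2 degC target. All numbers are illustrative assumptions.
n = 100_000
F_2x = 3.7                                   # W/m2 forcing per CO2 doubling
sensitivity = rng.lognormal(mean=np.log(3.0), sigma=0.3, size=n)  # degC per doubling
forcing = 2.6                                # assumed stabilized forcing, W/m2

warming = sensitivity * forcing / F_2x       # equilibrium warming for each draw
p_under_2C = (warming < 2.0).mean()          # fraction of draws under the target
```

The full study replaces the one-line response with the IGSM and samples uncertainties in both emissions and climate system response, but the output has the same character: odds of meeting a target rather than a single deterministic outcome.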
