Documentation


For the technical concepts employed in MERGE, see the following publications by Alan Manne, Richard Richels and coauthors.  An asterisk denotes a technical report that has not been widely distributed.  A red highlight indicates a paper accessible from this website.



"Scenarios of Greenhouse Gas Emissions and Atmospheric Concentrations", U.S. Climate Change Science Program, July 2007.

This report presents research from Synthesis and Assessment Product 2.1a of the Climate Change Science Program (CCSP). The MERGE modeling team and two other participating groups each independently developed a reference scenario, in which all climate policies were assumed to expire in 2012, and then developed four stabilization scenarios as departures from their respective reference scenarios. Idealized emissions-reduction measures - designed to reduce emissions wherever, whenever, and using whichever GHG was most cost-effective - were imposed to limit GHG emissions and meet the four stabilization levels (specified in terms of radiative forcing).



"Managing the Transition to Climate Stabilization", Working Paper 07-01, AEI-Brookings Joint Center for Regulatory Studies, January 2007.

This paper builds upon the emissions scenarios report from the U.S. CCSP released in July 2007. In particular, the analysis is extended to consider a relaxation of the assumptions of full "when" and "where" flexibility as well as alternative technology scenarios.



* "The Role of Non-CO2 Greenhouse Gases and Carbon Sinks in Meeting Climate Objectives", July 2004.

In designing a strategy for greenhouse gas (GHG) abatement, perhaps the most important single component is energy-related CO2. This is where most of the analytic effort has been centered. Non-CO2 greenhouse gases and carbon sinks have received less attention, but are important enough to warrant consideration. This topic was the focus of Energy Modeling Forum Study 21 (EMF 21), in which we participated. This paper describes our efforts to incorporate multiple GHGs and sinks into the MERGE model, and discusses some of the implications for mitigation cost assessments.



* "MERGE: An Integrated Assessment Model for Global Climate Change", June 2004.

MERGE is a model for estimating the regional and global effects of greenhouse gas reductions.  It quantifies alternative ways of thinking about climate change.  The model contains submodels governing:

the domestic and international economy

energy-related emissions of greenhouse gases

non-energy emissions of GHGs

global climate change – market and non-market damages


"Moving Beyond Concentrations: The Challenge of Limiting Temperature Change", Working Paper 04-11, AEI-Brookings Joint Center for Regulatory Studies, April 2004.

The UN Framework Convention on Climate Change shifted the attention of the policy community from stabilizing greenhouse gas emissions to stabilizing atmospheric greenhouse gas concentrations. While this represents a step forward, it does not go far enough. We find that, given the uncertainty in the climate system, focusing on atmospheric concentrations is likely to convey a false sense of precision. The causal chain between human activity and impacts is laden with uncertainty. From a benefit-cost perspective, it would be desirable to minimize the sum of mitigation costs and damages. Unfortunately, our ability to quantify and value impacts is limited. For the time being, we must rely on a surrogate. Focusing on temperature rather than on concentrations provides much more information on what constitutes an ample margin of safety. Concentrations mask too many uncertainties that are crucial for policy making.



* "Global Climate Change and the Equity-Efficiency Puzzle", December 2003.

There is a broad consensus that the costs of abatement of global climate change can be reduced efficiently through the assignment of quota rights, and through international trade in these rights. But there is no consensus on whether the initial assignment of emission permits can affect the Pareto-optimal global level of abatement.

This paper provides some insight into the equity-efficiency puzzle. Qualitative results are obtained from a small-scale model, and then quantitative evidence of separability is obtained from MERGE, a multi-region integrated assessment model. It is shown that if all the costs of climate change can be expressed in terms of GDP losses, Pareto-efficient abatement strategies are independent of the initial allocation of emission rights. This is the case sometimes described as “market damages”.

If, however, different regions assign different values to non-market damages such as species losses, different sharing rules may affect the Pareto-optimal level of greenhouse gas abatement. Separability may then be demonstrated only in specific cases (e.g. identical welfare functions or quasi-linearity of preferences or small shares of wealth devoted to abatement).
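
As a one-line illustration of the quasi-linearity case mentioned above (a textbook construction, not the MERGE specification): suppose each region's welfare is quasi-linear in a numeraire good,

    W_r = x_r + v_r(A),

where x_r is the region's consumption of the numeraire and A is the global level of abatement. Any Pareto-efficient abatement level A* then satisfies the Samuelson condition \sum_r v_r'(A*) = c'(A*), with c(A) the global abatement cost. Because A* is pinned down by this condition alone, it does not depend on how the numeraire - and hence the initial permit endowment - is distributed across regions, which is exactly the separability result described above.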



* "MERGE - Presentation to EMF 21", Stanford University, December 2003.


* "Market Exchange Rates or Purchasing Power Parity: Does the Choice Make a Difference to the Climate Debate?" , August 2003.

In the year 2000, the Intergovernmental Panel on Climate Change (IPCC) published its Special Report on Emissions Scenarios (SRES). The scenarios were defined by alternative assumptions concerning the demographic, economic, and technological driving forces, which, in large part, determine greenhouse gas (GHG) and sulfur emissions. The full set of scenarios produced a higher range of global mean temperature change over the 21st century than that contained in previous IPCC assessments. Recently, the validity of the SRES scenarios has been questioned. Critics have expressed concern about the way that economic indicators, such as gross domestic product (GDP), are converted from domestic currencies into a common currency such as dollars. In short, they charge that the use of market exchange rates (MER), rather than purchasing power parity (PPP), has led to an upward bias in emission projections. This, in turn, has resulted in unrealistically high temperature projections. In this note, we estimate the differences in key climate-related variables that might result from choosing one approach over the other. While the use of PPP for dealing with the volatility of exchange rates has been debated among economists and others for some time, we find that the choice of conversion factor makes only a small difference when projecting future temperature change.
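
As a minimal illustration of the conversion issue (the numbers below are invented for this sketch and are not MERGE inputs): switching a region's GDP from an MER to a PPP basis rescales its measured carbon intensity by the inverse factor, so base-year physical emissions are unchanged, and any difference in projections must come from how growth and intensity trends are assumed to evolve.

    # Accounting-identity sketch for the MER/PPP discussion above
    # (illustrative numbers only, not MERGE data).
    emissions_gtc = 1.0      # physical CO2 emissions of a developing region
    gdp_mer = 1.0e12         # GDP converted at market exchange rates ($)
    ppp_factor = 4.0         # assumed PPP/MER ratio for this region
    gdp_ppp = gdp_mer * ppp_factor

    intensity_mer = emissions_gtc / gdp_mer   # tC per MER-dollar
    intensity_ppp = emissions_gtc / gdp_ppp   # tC per PPP-dollar

    print(gdp_mer * intensity_mer)   # 1.0 GtC
    print(gdp_ppp * intensity_ppp)   # 1.0 GtC -- identical, as it must be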


A particular ceiling on atmospheric CO2 concentrations can be maintained through a variety of emission pathways.  Over the past decade, there has been considerable debate over the characteristics of a least-cost pathway.  Some have suggested that a gradual departure from the emissions baseline will be the most cost-effective because it reduces the pressure for premature retirement of the existing capital stock, and it provides valuable time to develop low-cost, low-carbon emitting substitutes. Others counter that a major flaw in analyses that support this line of reasoning is that they ignore learning-by-doing (LBD).

In this paper, we examine the impact of LBD on the timing and costs of emissions abatement.  With regard to timing, we find that including learning-by-doing does not significantly alter the conclusions of previous studies that treated technology cost as exogenous. The analysis supports the earlier conclusion that for a wide range of stabilization ceilings, a gradual transition away from the "no policy" emissions baseline is preferable to one that requires substantial near-term reductions. We find that the major impact of including learning-by-doing is on the costs of emission abatement. Depending upon the sensitivity of costs to cumulative experience, LBD can substantially reduce the overall costs of emissions abatement.
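
For reference, learning-by-doing is usually represented by an experience curve of the general form

    C(X) = C_0 (X / X_0)^{-b},        progress ratio = 2^{-b},

where C is unit cost and X is cumulative capacity or production, so each doubling of experience multiplies unit cost by the progress ratio (for example, b \approx 0.32 corresponds to roughly a 20% cost reduction per doubling). The functional form shown is the standard textbook one; the learning rates actually used in the analysis are documented in the paper itself.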



* "Learn-by-doing and Carbon Dioxide Abatement", March 2002.

There are inherent difficulties in solving LBD (learn-by-doing) models.  Basic to such models is the idea that the accumulation of experience leads to a lowering of costs.

This paper is intended to explore some of the algorithmic issues in LBD modeling for carbon dioxide abatement. When using a standard algorithm for nonlinear programming, there is no guarantee that a local LBD optimum will also be a global optimum. Fortunately, despite the absence of guarantees, there is a good chance that one of the standard algorithms will produce a global optimum for models of this type - particularly if there is an artful selection of the starting point or of the terminal conditions. Moreover, there is a newer procedure, BARON (a branch-and-reduce solver for global optimization). For small models, BARON can recognize and guarantee a global optimum.

Eventually, it should be possible for BARON or a similar approach to be extended to large-scale LBD models for climate change.  Meanwhile, in order to check for local optima, the most practical course may be to employ several different starting points and terminal conditions.
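
A minimal sketch of the multi-start strategy described above, applied to a toy learning-by-doing abatement problem rather than to MERGE itself (all parameter values are invented, and SciPy's SLSQP stands in for the large-scale NLP solvers actually used):

    # Toy LBD cost minimization: choose an abatement path meeting a cumulative
    # target while unit cost falls with accumulated experience (nonconvex),
    # restarting a local NLP solver from several points and keeping the best.
    import numpy as np
    from scipy.optimize import minimize

    T, c0, b, r, target = 10, 100.0, 0.5, 0.05, 50.0
    disc = 1.0 / (1.0 + r) ** np.arange(T)

    def total_cost(a):
        # cumulative experience before each period (offset by 1 to avoid 0**-b)
        exp_before = 1.0 + np.concatenate(([0.0], np.cumsum(a)[:-1]))
        unit_cost = c0 * exp_before ** (-b)          # learning curve
        return float(np.sum(disc * unit_cost * a))

    cons = [{"type": "ineq", "fun": lambda a: np.sum(a) - target}]
    bounds = [(0.0, target)] * T

    rng = np.random.default_rng(0)
    best = None
    for _ in range(20):                              # multiple starting points
        a0 = rng.uniform(0.0, target / T, size=T)
        res = minimize(total_cost, a0, method="SLSQP",
                       bounds=bounds, constraints=cons)
        if res.success and (best is None or res.fun < best.fun):
            best = res

    print("best cost found:", round(best.fun, 2))
    print("abatement path:", np.round(best.x, 2))

Restarting from different points is no guarantee of global optimality, but when several starts agree on the same solution, confidence in it increases - the same pragmatic check the paper recommends.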



*  "US Rejection of the Kyoto Protocol: the impact on compliance costs and CO2 emissions ", September 2001.

Despite the US rejection of the Kyoto Protocol, the meeting of the parties to the UN Framework Convention on Climate Change in July 2001 has increased the likelihood that the Protocol will be ratified. Key findings include:

1. Participating OECD countries may experience a decline in mitigation costs, but because of the banking provision contained in the Protocol, the decline may not be as great as some would suggest.

2. If the majority of “hot air” is concentrated in a small number of countries in Eastern Europe and the former Soviet Union, these countries may be able to organize a sellers’ cartel and extract sizable economic rents; and

3. Even in the absence of mandatory emission reduction requirements, US emissions in 2010 may be lower than their business-as-usual baseline because of expectations regarding future regulatory requirements.



*  "Carbon Emissions and Petroleum Resource Assessments ", presentation at International Energy Workshop, IIASA, Laxenburg, Austria, June 19, 2001.

This paper demonstrates how oil and gas resource assumptions affect MERGE (a model for evaluating regional and global effects of greenhouse gas reduction policies).  Undiscovered resources are based upon the U.S. Geological Survey "World Petroleum Assessment 2000".

Oil and gas supply curves are guesstimated, with ten steps within each region. Instead of OPEC behavioral functions, there are maximum production/reserve ratios and maximum resource depletion factors. To allow for resource depletion without a long-term rising price trend, the reference case includes annual cost reduction factors of 0.5% in each energy category. There are backstops for both electric and nonelectric energy.
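
A hedged sketch of the stepped supply-curve bookkeeping just described; the step quantities and costs below are placeholders, not the USGS-based values used in the paper:

    # Ten-step regional supply curve with a 0.5% annual cost reduction factor.
    import numpy as np

    step_quantity = np.full(10, 100.0)            # resource available in each step
    step_cost_base = np.linspace(10.0, 55.0, 10)  # rising marginal cost per step

    def marginal_cost(cumulative_extraction, year):
        """Cost of the next unit extracted, after annual cost reductions."""
        decline = (1.0 - 0.005) ** (year - 2000)
        cum_steps = np.cumsum(step_quantity)
        idx = np.searchsorted(cum_steps, cumulative_extraction, side="right")
        if idx >= len(step_cost_base):
            return np.inf          # resource base exhausted; the backstop takes over
        return step_cost_base[idx] * decline

    print(marginal_cost(50.0, 2000))    # first step, base year
    print(marginal_cost(450.0, 2030))   # fifth step, after 30 years of cost decline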

Results are reported at a global level for oil and gas production, fuel shares, carbon prices, oil prices and carbon emissions under alternative scenarios.  If no energy cost reductions are assumed, there are higher energy prices and lower carbon emissions than in the reference case.  If both oil and gas resources are low, there is greater demand for synthetic fuels and higher carbon emissions.  If gas resources are unlimited, emissions are lower during the early years, but higher later on.  During the later periods, there is an incentive for electricity production based on gas rather than a carbon-free backstop.  This leads to higher carbon emissions than in the reference case.



"An Alternative Approach to Establishing Trade-offs among greenhouse gases", Nature, vol. 410 ( 2001 ), pp. 675-677.

The Kyoto Protocol permits countries to meet part of their emission reduction obligations by cutting back on gases other than CO2.  This approach requires a definition of trade-offs among the radiatively active gases.  The Intergovernmental Panel on Climate Change has suggested global warming potentials for this purpose, which use the accumulated radiative forcing of each gas by a set time horizon to establish emission equivalence.  But it has been suggested that this approach has serious shortcomings: damages or abatement costs are not considered, and the choice of time horizon for calculating cumulative radiative forcing is critical, but arbitrary.  Here we describe an alternative framework for determining emission equivalence between radiatively active gases that addresses these weaknesses.  We focus on limiting temperature change and the rate of temperature change, but our framework is also applicable to other objectives.  For a proposed ceiling, we calculate how much one should be willing to pay for emitting an additional unit of each gas.  The relative prices then determine the trade-off between gases at each point in time, taking into account economic as well as physical considerations.  Our analysis shows that the relative prices are sensitive to the lifetime of the gases, the choice of target and the proximity of the target, making short-lived gases more expensive to emit as we approach the prescribed ceiling.
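
The sensitivity to the time horizon can be illustrated with a deliberately simplified calculation (each gas is given a single effective lifetime and a relative radiative efficiency, both placeholder values; real GWPs use a multi-term CO2 response and measured efficiencies):

    # Toy illustration of why the GWP time horizon matters.
    import numpy as np

    def cumulative_forcing(efficiency, lifetime, horizon):
        # integral of efficiency * exp(-t / lifetime) from 0 to horizon
        return efficiency * lifetime * (1.0 - np.exp(-horizon / lifetime))

    gases = {
        # name: (relative radiative efficiency per unit emitted, effective lifetime in years)
        "CO2-like reference": (1.0, 150.0),
        "short-lived gas (CH4-like)": (60.0, 12.0),
    }

    for horizon in (20, 100, 500):
        ref = cumulative_forcing(*gases["CO2-like reference"], horizon)
        gwp = cumulative_forcing(*gases["short-lived gas (CH4-like)"], horizon) / ref
        print(f"horizon {horizon:>3} yr: toy GWP of short-lived gas = {gwp:.1f}")

The toy index shrinks as the horizon lengthens, because the short-lived gas has decayed away while the long-lived reference keeps accumulating forcing - the arbitrariness the paper's shadow-price framework is designed to avoid.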


"A Multi-Gas Approach to Climate Policy", April 2000.

This paper addresses four questions:  (1) What are the implications of a multi-gas approach when designing policies for reducing greenhouse gas (GHG) emissions?  (2) How sensitive is the optimal mix of mitigation options to the choice of global warming potentials (GWPs)?  (3) Are there alternative approaches that provide a more logical justification for action?  (4) If so, what are their strengths and weaknesses?

We begin by adopting the 100-year GWPs recommended by the IPCC. Incorporating the two major non-CO2 greenhouse gases (CH4 and N2O) increases the size of the required reduction, but it also expands the portfolio of mitigation options.  For the Kyoto Protocol, we find that a multi-gas approach benefits all Annex B regions with the exception of the former Soviet Union and Eastern Europe.  It also turns out that the optimal mix of mitigation options is sensitive to the time horizon used to calculate the GWPs.

Given the lack of a rationale for choosing one set of GWPs over another, we examined two alternatives for establishing quantitative tradeoffs between gases.  The first was based on cost-effectiveness analysis (minimizing the costs of prescribed limits on temperature change); the second on benefit-cost analysis.  Both the cost-effectiveness and the benefit-cost perspective highlight the shortcomings of GWPs for establishing equivalence among gases.  Not only do the relative prices vary over time, but they are also sensitive to the ultimate goal.

Ideally, the relative importance of the individual gases would be the product of an analysis which minimized the discounted present value of damages and mitigation costs.  Unfortunately, given the current state of knowledge regarding potential damages, such an approach may be premature.  If indeed this is the case, focusing on temperature change may have distinct advantages over GWPs.  It could serve as a useful temporary surrogate for benefit-cost analysis.


"Energy Technology Assessment in MERGE", March 2000.

MERGE includes a "bottom-up" analysis of the energy supply sector.  This paper provides a brief description of the electric and nonelectric technologies employed in the model.  There are year-by-year introduction constraints and supply limits for most of these technologies.  There are also decline constraints.  Oil, gas and coal are viewed as exhaustible resources.

The electricity sector includes an advanced carbon-free technology.  Early versions of this technology are high-cost, but they facilitate rapid introduction of later versions that are low-cost.  In this sense, the model allows for endogenous technology diffusion.  The paper includes a brief demonstration that this can make a difference in terms of greenhouse gas mitigation strategy.
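
The introduction and decline constraints mentioned above are typically written in a stylized form along the following lines (the growth and decline factors shown are generic placeholders, not MERGE's calibrated values):

    x_{j,t+1} \le \gamma^{+} x_{j,t} + \epsilon_j        (introduction constraint)
    x_{j,t+1} \ge \gamma^{-} x_{j,t}                     (decline constraint)

where x_{j,t} is the activity level of technology j in period t, \gamma^{+} > 1 caps how quickly a new technology can expand (the small seed term \epsilon_j allows entry from a zero base), and \gamma^{-} < 1 caps how quickly an existing technology can be retired.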


"International Carbon Agreements, EIS Trade and Leakage", March 2000.

The MERGE model has been modified to include the possibility of leakage - changes in the location of production that are associated with international trade in EIS (the energy-intensive sectors).  The new feature is introduced in a way that preserves the basic simplifying characteristics of MERGE.  The intercept of the non-energy supply curve for EIS in each region may be described as a Heckscher-Ohlin fraction.

The intercept of these supply curves serves the same purpose as an Armington elasticity describing substitution between foreign and domestic goods.  This is the way in which we avoid penny-switching as a characteristic solution mode.  At the same time, this avoids the usual difficulties with the Armington formulation when base-year trade quantities are small.
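
For orientation, the leakage ratio referred to in the conclusions below is conventionally defined as

    leakage ratio = \Delta E_{non-Annex B} / (-\Delta E_{Annex B}),

that is, the increase in emissions outside the abating coalition divided by the emissions reduction achieved within it.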

In qualitative terms, our conclusions are as follows:

(1) Leakage is a distinct possibility, but it would not be large enough to wipe out the global effects of the carbon reductions to be undertaken by the Annex B countries under the Kyoto Protocol.

(2) Leakage ratios are large enough to be worrisome.

(3) Leakage ratios are sensitive to the numerical values of the EIS trade parameters.


"The Kyoto Protocol:  A Cost-Effective Strategy for Meeting Environmental Objectives?", The Energy Journal, special issue, May 1999, pp. 1-23.

The Kyoto Protocol represents a milestone in climate policy.  For the first time, negotiators have attempted to lay out emission reduction targets for the early part of the 21st century.  Mitigation costs are only one of the considerations, but policy makers are keenly interested in the economic implications of ratification.

This paper is intended to help clarify our understanding of compliance costs.  The focus is on three questions, which we believe to be of particular relevance:  What are the near-term costs of implementation?  How significant are the "flexibility" provisions?  And, perhaps most important, is the Protocol cost-effective in the context of the long-term goals of the Framework Convention?


"Equity, Efficiency and Discounting", ch. 12 in P.R. Portney and J.P. Weyant, Discounting and Intergenerational Equity, Resources for the Future, Washington, D.C., 1999.

In the integrated assessment of global climate change, it has been typical to employ the ILA (infinite-lived agent) approach.  As a result, there have been extensive controversies over the rate of discount to be applied to the costs and benefits of emissions abatement.  This paper explores the logic of an alternative formulation based on the OLG (overlapping generations) paradigm.

It is not essential to assume intergenerational altruism.  All that is required is the recognition that global abatement represents a specific form of environmental capital accumulation, and that there be appropriate markets for realizing the distant-future benefits from this type of activity.
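
As a point of reference, the ILA formulation evaluates all periods with a single discounted utility maximand of the stylized form

    \max \sum_{t} (1 + \rho)^{-t} U(c_t),

so the choice of the utility discount rate \rho drives the controversy the paper refers to. An OLG formulation instead gives each generation its own objective - e.g. generation g maximizes U(c_{g,young}) + \beta U(c_{g,old}) - and intertemporal allocations are mediated through asset markets rather than through a single planner's \rho. The notation here is a generic textbook rendering, not the paper's own.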


"On Stabilizing CO2 Concentrations - Cost-Effective Emission Reduction Strategies", Environmental Modeling and Assessment, 2 (1997) 251-265.

With the adoption of the Berlin Mandate, developed countries are being asked to set emission limits for the early decades of the next century.  The size of the reductions is currently the subject of international negotiations.

This paper is intended to contribute to the analysis and assessment phase leading up to the adoption of new targets and timetables.  However, we take a somewhat different approach than that suggested by the Berlin Mandate.  Rather than focus exclusively on the next steps by developed countries, we review the issue from the perspective of the Convention's ultimate objective, the stabilization of atmospheric concentrations.  We examine what might constitute cost-effective strategies for limiting CO2 concentrations to alternative levels.  We then explore the implications for near-term mitigation decisions and for long-term participation by the developing countries.


"Greenhouse Gas Abatement - toward Pareto-Optimal Decisions under Uncertainty", Annals of Operations Research, Baltzer Science Publishers, 68 (1996) 267-279. (with T.R. Olsen).

Provides a simplified version of the ATL (act, then learn) decision analysis methodology employed in MERGE.


 "Greenhouse Gas Abatement - toward Pareto-Optimality in Integrated Assessments", ch.  26 in Education in a Research University, edited by K.J. Arrow, R.W. Cottle, B.C. Eaves and I. Olkin, Stanford University Press, Stanford, CA, 1996.

Describes the conditions under which it is possible to separate international equity from efficiency considerations.  Typical MERGE runs are based on the assumption that there is efficient multilateral joint implementation.  This could, for example, be achieved through tradable quota rights with markets for present and future tradable emission permits.


"The Greenhouse Debate:  Economic Efficiency, Burden Sharing and Hedging Strategies", The Energy Journal, vol. 16, no. 4, pp. 1-37, 1995.

Describes MERGE 2.  The model provides for two-way linkage between ETA-MACRO and the CLIMATE and IMPACT submodels.  It includes an application of decision analysis, contrasting the "learn, then act" and "act, then learn" approaches.

This paper is probably the best single place currently available for reading about both the policy implications and the technical aspects of MERGE.


"MERGE - A Model for Evaluating Regional and Global Effects of GHG Reduction Policies", Energy Policy, vol. 23, no. 1, pp.  17-34, 1995. (with R. Mendelson).

Global 2100 was a cost-effectiveness analysis, but MERGE is an integrated assessment in which the costs of abatement are explicitly balanced against the benefits of reducing the impacts of climate change.  In this first version of MERGE, there is one-way linkage from ETA-MACRO to the CLIMATE and IMPACT submodels.  MERGE accounts for market damages (through production losses) and nonmarket damages (through losses in utility).


"International Trade in Oil, Gas and Carbon Emission Rights: An Intertemporal General Equilibrium Model", The Energy Journal, 15:1, 1994 (with T.F. Rutherford).


"International Trade, Capital Flows and Sectoral Analysis:  Formulation and Solution of Intertemporal Equilibrium Models", pp. 191-205 in W.W. Cooper and A.B. Whinston (eds), New Directions in Computational Economics, Kluwer Academic Publishers, Netherlands, 1994 (with T.F. Rutherford).


"Buying Greenhouse Insurance - the Economic Costs of Carbon Dioxide Emission Limits", MIT Press book, Cambridge, MA, 1992.

Chapter 7 describes the Global 2100 model for determining the costs of adapting to one or another limit on carbon emissions.  The model provides for two-way linkage between a top-down model of economic growth and energy demands (MACRO) and a bottom-up model for energy technology assessment (ETA). A Ramsey model is employed for the determination of savings and investment through a discounted utility maximand.  Energy-economy interactions, price-induced energy conservation and interfuel substitution are handled through a nested CES (constant-elasticity of substitution) production function. The model also allows for autonomous energy efficiency improvements (AEEI, for short).
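
A stylized rendering of that structure (the exact functional forms and parameter values in ETA-MACRO differ in detail) is a Ramsey maximand over a nested CES technology:

    \max \sum_t (1 + \delta)^{-t} \ln C_t
    subject to   Y_t = [ a (K_t^\alpha L_t^{1-\alpha})^\rho + b (E_t^\beta N_t^{1-\beta})^\rho ]^{1/\rho}
                 Y_t = C_t + I_t + EC_t

where E_t and N_t are electric and nonelectric energy, the exponent \rho is tied to the assumed elasticity of substitution between the capital-labor and energy aggregates, and EC_t is the cost of energy supplied by the ETA side of the model. The AEEI enters as an exogenous decline in the energy required per unit of output.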

Appendix 7B shows how to formulate the model so as to allow for decisions under uncertainty.  It contrasts the "learn, then act" (scenario) approach with the "act, then learn" (hedging) approach.
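
A toy numerical sketch of that contrast, with invented numbers and a quadratic stand-in for abatement costs (nothing here is taken from the book):

    # "Learn, then act" vs "act, then learn" on a two-state toy problem.
    import numpy as np

    scenarios = {"low damages": 2.0, "high damages": 8.0}   # required total abatement
    prob = {"low damages": 0.5, "high damages": 0.5}
    DISC = 0.9                                              # period-2 discount factor

    def total_cost(a1, requirement):
        a2 = max(requirement - a1, 0.0)        # recourse after the state is revealed
        return a1 ** 2 + DISC * 2.0 * a2 ** 2  # deferred abatement assumed costlier

    grid = np.linspace(0.0, 8.0, 8001)

    # Learn, then act: the state is known before the period-1 decision.
    lta = sum(prob[s] * min(total_cost(a1, r) for a1 in grid)
              for s, r in scenarios.items())

    # Act, then learn: a single period-1 decision must hedge against both states.
    exp_cost = [sum(prob[s] * total_cost(a1, r) for s, r in scenarios.items())
                for a1 in grid]
    atl = min(exp_cost)
    a1_hedge = grid[int(np.argmin(exp_cost))]

    print(f"learn-then-act expected cost: {lta:.2f}")
    print(f"act-then-learn expected cost: {atl:.2f} (hedging a1 = {a1_hedge:.2f})")

The hedging decision lies between the two scenario-specific optima, and the gap between the two expected costs is the expected value of perfect information.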

In Global 2100, the world is disaggregated into five geopolitical regions, and parallel analyses are conducted for each region independently.  An informal decomposition procedure is employed for interregional trade in oil and in carbon emission rights.  The two 1994 papers with Rutherford listed above describe the formal procedure that has superseded this method for interregional trade linkages.  These linkages are handled through Rutherford's procedure for solving applied general equilibrium models - joint maximization through the iterative determination of Negishi weights.  Those two papers also explain why MERGE is benchmarked in such a way as to equate the marginal productivity of capital across all regions.
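
A minimal sketch of the Negishi-weight iteration on a toy two-region, two-good exchange economy with log utilities (the endowments and preference shares are invented; MERGE applies the same idea to a full intertemporal model):

    # Iterate Negishi welfare weights until each region's budget balances.
    import numpy as np

    alpha = np.array([[0.6, 0.4],    # region 0 utility shares over the two goods
                      [0.3, 0.7]])   # region 1 utility shares
    endow = np.array([[8.0, 2.0],    # region 0 endowments
                      [2.0, 8.0]])   # region 1 endowments
    total = endow.sum(axis=0)

    w = np.array([0.5, 0.5])         # initial Negishi welfare weights
    for _ in range(100):
        # The planner problem max sum_i w_i * sum_g alpha_ig * ln(x_ig)
        # subject to sum_i x_ig = total_g has the closed form below;
        # p_g are the shadow prices on the resource constraints.
        p = (w[:, None] * alpha).sum(axis=0) / total
        x = w[:, None] * alpha / p            # optimal allocation x_ig
        income = endow @ p                    # value of each region's endowment
        spending = (x * p).sum(axis=1)        # equals w_i under log utility
        surplus = income - spending
        if np.max(np.abs(surplus)) < 1e-10:
            break
        # Negishi update: set each weight to the value of that region's
        # endowment, then renormalize so the weights sum to one.
        w = income / income.sum()

    print("Negishi weights:", w)
    print("relative prices:", p / p[0])       # good 0 as numeraire
    print("budget surpluses:", surplus)

At the fixed point, the weighted joint maximization reproduces the competitive equilibrium of the trading regions, which is the sense in which the Negishi procedure solves the general equilibrium model by optimization.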