Since the incidents at Three Mile Island, Chernobyl, and Fukushima, many countries have shifted from nuclear power to fossil fuel-fired electricity production, despite the environmental consequences of burning fuels such as coal. A new study used data from the United States to analyze the costs and benefits of producing electricity from coal-fired versus nuclear sources. The study’s authors conclude that policymakers should view nuclear power as a low-carbon electricity source, but that utilities will need incentives to rely on it.
The study, by researchers at Carnegie Mellon University (CMU) and the IZA Institute of Labor Economics, appears in Resource and Energy Economics.
“By calculating the economic and environmental costs associated with producing electricity using coal-fired power plants rather than nuclear sources, our study informs the ongoing policy debate about whether to subsidize existing nuclear power generation,” explains Akshaya Jha, assistant professor of economics and public policy at CMU’s Heinz College, who coauthored the study.
The researchers used monthly operations data from the Energy Information Administration on nearly every power plant in the United States from 1970 to 2014 to estimate the extent to which the buildout of nuclear power replaced fossil fuel-fired electricity generation. They also estimated how much fossil fuel-fired generation increased during unplanned nuclear outages from 1999 to 2014, and they explored why a declining share of U.S. electricity generation came from nuclear sources even though relying on conventional fossil fuels significantly increased air pollution.
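The article does not reproduce the study’s econometric specification, but the kind of panel analysis it describes, estimating how fossil fuel-fired generation responds to unplanned nuclear outages, can be sketched roughly as a two-way fixed-effects regression. The example below is purely illustrative: the variable names, synthetic data, and specification are assumptions, not the authors’ actual code or results.

```python
# Minimal sketch (not the study's code) of a fixed-effects panel regression
# relating fossil-fired generation to nuclear outages. All data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_plants, n_months = 50, 120
df = pd.DataFrame({
    "plant": np.repeat(np.arange(n_plants), n_months),
    "month": np.tile(np.arange(n_months), n_plants),
})
# Hypothetical regressor: nuclear capacity offline (MWh) in the plant's
# market during each month, e.g. due to unplanned outages.
df["nuclear_outage_mwh"] = rng.gamma(2.0, 100.0, size=len(df))
# Hypothetical outcome: fossil-fired generation, built to rise with outages.
df["fossil_gen_mwh"] = (
    5000 + 0.8 * df["nuclear_outage_mwh"] + rng.normal(0, 200, size=len(df))
)

# Plant and month dummies absorb time-invariant plant characteristics and
# common seasonal or market-wide shocks.
model = smf.ols(
    "fossil_gen_mwh ~ nuclear_outage_mwh + C(plant) + C(month)", data=df
).fit()
# Coefficient ~0.8: MWh of extra fossil generation per MWh of nuclear outage.
print(model.params["nuclear_outage_mwh"])
```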
Read more at Carnegie Mellon University