Renewables up to 90% by 2050 would cost less than current generation mix: NREL study


Increasing renewable generation up to 90% by 2050 in the contiguous United States would yield lower system costs than maintaining the current generation mix, according to a research paper published in the journal Joule.

Those lower system costs do not count public health, environmental, or climate benefits from increased use of renewables, say the study’s eight co-authors. Seven of the authors are with the National Renewable Energy Laboratory (NREL).

Assuming no new federal or state policies beyond those in effect as of June 2020, the least-cost system would reach 57% renewables by 2050, with average levelized costs of $30/MWh, or 3 cents per kWh.

Moderately higher system costs of $36/MWh would result from a requirement to reach 90% renewables a decade earlier, in 2040. Reaching 100% renewables by 2050 would be possible with existing technologies, yielding costs of $39/MWh, the study said.
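The conversion between the study's $/MWh figures and the retail-style cents per kWh quoted above is simple arithmetic (1 MWh = 1,000 kWh). The short sketch below restates the three reported scenario costs; the scenario labels are shorthand added here, not the study's own naming.

```python
# Convert the levelized system costs quoted above from $/MWh to cents/kWh.
# 1 MWh = 1,000 kWh, so dollars per MWh divided by 10 gives cents per kWh.
scenario_costs_usd_per_mwh = {
    "57% renewables by 2050 (no new policies)": 30,
    "90% renewables by 2040": 36,
    "100% renewables by 2050": 39,
}

for scenario, usd_per_mwh in scenario_costs_usd_per_mwh.items():
    cents_per_kwh = usd_per_mwh * 100 / 1000  # dollars -> cents, MWh -> kWh
    print(f"{scenario}: ${usd_per_mwh}/MWh = {cents_per_kwh:.1f} cents/kWh")
```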

The results are largely driven by the use of updated cost projections for utility-scale solar from NREL’s 2020 Annual Technology Baseline. Those projections were well below the prior year’s projections. The “moderate” cost projections used in the study showed a cost of about $24/MWh for utility-scale solar in 2030, versus the prior year’s projection of about $38/MWh in 2030.

The study found that incremental costs of the system would rise steeply for the last few percentage points of renewable generation, as the system approaches 100% renewables.

To electrification and beyond

Stanford University energy researcher Mark Jacobson said in a LinkedIn comment that annual costs would be reduced even more compared to fossil fuels by “including electrification of transport, buildings and industry; eliminating energy in mining and refining fossil fuels and uranium; and accounting for demand response and heat/cold storage.” He cited a new Stanford study presenting those results for each of the 50 states.

Jacobson added that in 2009, when his team first published modeling of 100% renewables, “utilities claimed that 20% renewables on the grid was infeasible,” and that in 2015, “critics claimed that 80% was infeasible.”

With the new NREL study, he said, “their claims are now debunked.”

Modeling

The NREL study found that reaching high renewables levels sooner than 2050 would cost more for three reasons. First, because the study discounted costs at 5% per year, spending incurred earlier is discounted less and therefore carries a higher present value. Second, capital costs for renewables were expected to continue declining through 2050, so later deployment would be cheaper. Third, retiring fossil generators before they were fully paid off was modeled as “resulting in additional costs from stranded assets.”
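To illustrate the first point: the present value of a cost incurred in year t at discount rate r is cost / (1 + r)^t, so the same nominal expense weighs more the sooner it occurs. The snippet below is a minimal sketch using the study's stated 5% rate; the dollar amount, the years compared, and the 2020 base year are illustrative assumptions, not figures from the paper.

```python
# Minimal illustration of discounting: the same nominal cost has a higher
# present value when it is incurred earlier. The 5%/yr rate is from the
# study; the $1 billion expense, years, and 2020 base year are hypothetical.
def present_value(cost, year, rate=0.05, base_year=2020):
    """Discount a cost incurred in `year` back to `base_year`."""
    return cost / (1 + rate) ** (year - base_year)

capex = 1_000_000_000  # $1 billion, illustrative only
for year in (2030, 2040, 2050):
    pv = present_value(capex, year)
    print(f"Spend in {year}: present value ≈ ${pv / 1e9:.2f} billion (2020 dollars)")
```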

To model its base case results, the study used mid-range assumptions for capital and operating costs of generators, and for electricity demand. The research team used NREL’s ReEDS model for capacity expansion analysis, and the commercial PLEXOS model for production cost modeling. Scenarios were run on NREL’s high performance computer.
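That two-stage workflow, a capacity-expansion step that chooses what to build followed by a production-cost step that dispatches the resulting fleet hour by hour, can be illustrated with a toy example. The sketch below is purely conceptual: it does not reflect the ReEDS or PLEXOS formulations, data, or interfaces, and the technology figures and greedy dispatch (which ignores battery state of charge, transmission, and reserves) are invented for illustration.

```python
# Conceptual two-stage sketch: (1) a "capacity expansion" step fixes a build,
# (2) a "production cost" step dispatches it by merit order and flags any
# unserved energy. All numbers are invented for illustration.

# Stage 1: a hand-picked build (MW); a real capacity-expansion model would
# optimize this choice against costs and constraints.
build_mw = {"solar": 120, "wind": 80, "gas": 60, "battery_discharge": 40}

# Hourly availability factors and variable costs ($/MWh), illustrative only.
availability = {
    "solar":  [0.0, 0.0, 0.3, 0.8, 0.9, 0.6, 0.1, 0.0],
    "wind":   [0.5, 0.4, 0.3, 0.2, 0.3, 0.4, 0.6, 0.7],
    "gas":    [1.0] * 8,
    "battery_discharge": [1.0] * 8,
}
variable_cost = {"solar": 0, "wind": 0, "battery_discharge": 5, "gas": 35}
demand_mw = [90, 85, 100, 120, 130, 125, 110, 95]  # 8 sample hours

# Stage 2: merit-order dispatch (cheapest available energy first).
total_cost = 0.0
for hour, load in enumerate(demand_mw):
    remaining = load
    for tech in sorted(variable_cost, key=variable_cost.get):
        output = min(remaining, build_mw[tech] * availability[tech][hour])
        total_cost += output * variable_cost[tech]
        remaining -= output
    if remaining > 1e-6:
        print(f"hour {hour}: {remaining:.1f} MW unserved")
print(f"dispatch cost over sample hours: ${total_cost:,.0f}")
```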

To ensure resource adequacy, the study team modeled seasonal firm capacity requirements equivalent to the North American Electric Reliability Corporation reference reserve margin levels.
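Planning reserve margin is conventionally defined as firm capacity minus peak demand, divided by peak demand; the study requires each season's firm capacity to meet or exceed the reference level. A minimal sketch of that check follows, with invented seasonal figures and an assumed 15% reference margin (actual NERC reference levels vary by region).

```python
# Simple seasonal firm-capacity check against a planning reserve margin.
# The 15% reference margin and the seasonal figures are assumptions for
# illustration, not values from the study or from NERC.
REFERENCE_MARGIN = 0.15

seasons = {
    # season: (firm capacity in GW, peak demand in GW) -- invented figures
    "summer": (780, 700),
    "winter": (760, 640),
}

for season, (firm_gw, peak_gw) in seasons.items():
    margin = (firm_gw - peak_gw) / peak_gw
    status = "OK" if margin >= REFERENCE_MARGIN else "SHORTFALL"
    print(f"{season}: reserve margin {margin:.1%} vs {REFERENCE_MARGIN:.0%} -> {status}")
```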

The study also considered 22 additional sets of assumptions, including low costs for renewables and batteries; a near-doubling of electricity demand due to electrification, with 34% of total demand available as flexible resources; and no new transmission. These other sets of assumptions were modeled only with the ReEDS model.
