The What and Why of Carbon Budgets

If you’ve been paying much attention to the climate policy discussion over the last few years, you’ve probably heard mention of carbon budgets, or greenhouse gas (GHG) emissions budgets more generally. Put simply, for any given temperature target there’s a corresponding total cumulative amount of greenhouse gases that can be released while still having a decent chance of meeting the target. For example, the IPCC estimates that if we want a 2/3 chance of keeping warming to less than 2°C, then we can release no more than 1000Gt of CO2 between 2011 and the end of the 21st century.

The reason the IPCC and many other scientist types use carbon budgets instead of emissions rates to describe our situation is that the atmosphere’s long-term response to GHGs is almost entirely determined by our total cumulative emissions. In fact, as the figure below from the IPCC AR5 Summary for Policymakers shows, our current understanding suggests a nearly linear relationship between cumulative CO2 released and ultimate warming… barring any wild feedbacks (which become more likely and frightening at high levels of atmospheric CO2) like climate-change-induced fires vaporizing our boreal and tropical forests.

Carbon Budget vs. Cumulative Warming
Figure SPM.5(b), from the IPCC AR5 Summary for Policymakers.

What matters from the climate’s point of view isn’t when we release the GHGs or how quickly we release them; it’s the total amount we release, at least on normal human planning timescales of less than a couple of centuries. This is because we’re putting these gases into the atmosphere much, much faster than natural processes can remove them: CO2 stays in the atmosphere for a long time, more than a century on average. We’re throwing it up much faster than nature can draw it down. This is why the concentration of atmospheric CO2 has been marching ever upward for the last couple of hundred years, finally surpassing 400ppm this year.

So regardless of whether we use the entire 1000Gt budget in 20 years or 200, the ultimate results in terms of warming will be similar; they’ll just take more or less time to manifest.
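
If you want to play with that linearity yourself, the key number is the “transient climate response to cumulative emissions” (TCRE), for which AR5 gives a likely range of 0.8–2.5°C per 1000GtC of carbon emitted. Here’s a minimal back-of-the-envelope sketch; the cumulative emissions figures in it are rounded for illustration, and it’s CO2-only, which is part of why the IPCC’s 1000Gt budget is more pessimistic than the mid-range math below:

```python
# Back-of-the-envelope warming from cumulative CO2, using AR5's TCRE
# "likely" range of 0.8-2.5 degC per 1000 GtC. Round numbers, CO2 only;
# the IPCC's budgets also account for non-CO2 gases.

GTC_PER_GTCO2 = 12.0 / 44.0  # convert Gt of CO2 to Gt of carbon

def warming(cumulative_gtco2, tcre=1.65):
    """Approximate eventual warming (degC) for cumulative emissions (GtCO2)."""
    return tcre * cumulative_gtco2 * GTC_PER_GTCO2 / 1000.0

emitted_by_2011 = 1900.0  # GtCO2 since ~1870, roughly
for extra in (0.0, 1000.0, 2000.0):
    total = emitted_by_2011 + extra
    print(f"{total:.0f} GtCO2 -> {warming(total, 0.8):.1f} to "
          f"{warming(total, 2.5):.1f} degC (likely range)")
```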

Unfortunately, most actual climate policy doesn’t reflect this reality.  Instead, we tend to make long-term aspirational commitments to large emissions reductions, with much less specificity about what happens in the short to medium term (e.g. Boulder, CO: 80% by 2030; Fort Collins, CO: 80% by 2030; the European Union: 40% by 2030).  When we acknowledge that it’s the total cumulative emissions over the next couple of centuries that determine our ultimate climate outcome, what we do in the short to medium term, a period of very, very high emissions, becomes critical.  These are big years, and they’re racing by.

Is 1000Gt a Lot, or a Little?

Few normal people have a good sense of the scale of our energy systems. One thousand gigatons. A thousand billion tons. A trillion tons. Those are all the same amount. They all sound big. But our civilization is also big, and comparing one gigantic number to another doesn’t give most non-scientists a good feel for what the heck is going on.
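
One way to cut the number down to size is to compare it to the current global burn rate and the number of people on the planet. A quick sketch, using round numbers consistent with this post (roughly 33Gt of CO2 per year globally, and about 7 billion people):

```python
# Cutting 1000 Gt down to human scale. Round numbers, consistent with
# this post: ~33 Gt CO2/yr globally and roughly 7 billion people.

BUDGET_GT = 1000.0  # Gt CO2, the IPCC 2/3-chance budget from 2011
ANNUAL_GT = 33.0    # Gt CO2 per year, approximate current emissions
POPULATION = 7.0e9

years_at_current_rate = BUDGET_GT / ANNUAL_GT            # ~30 years
tons_per_person = BUDGET_GT * 1e9 / POPULATION           # ~143 t per person
tons_per_year_each = ANNUAL_GT * 1e9 / POPULATION        # ~4.7 t/person/yr

print(f"Budget lasts ~{years_at_current_rate:.0f} years at today's flat rate")
print(f"Lifetime share: ~{tons_per_person:.0f} t CO2 per person alive today")
print(f"Current emissions: ~{tons_per_year_each:.1f} t CO2 per person per year")
```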

Many people were first introduced to the idea of carbon budgets through Bill McKibben’s popular article in Rolling Stone: Global Warming’s Terrifying New Math. McKibben looked at carbon budgets in the context of the fossil fuel producers. He pointed out that the world’s fossil fuel companies currently own and control several times more carbon than is required to destabilize the climate. This means that success on climate necessarily also means financial failure for much of the fossil fuel industry, as the value of their businesses is largely vested in the control of carbon intensive resources.

If you’re familiar with McKibben’s Rolling Stone piece, you may have noticed that the current IPCC budget of 1000Gt is substantially larger than the 565Gt one McKibben cites. In part, that’s because these two budgets have different probabilities of success. 565Gt in 2012 gave an 80% chance of keeping warming to less than 2°C, while the 2014 IPCC budget of 1000Gt would be expected to yield less than 2°C warming only 66% of the time. The IPCC doesn’t even report a budget for an 80% chance. The longer we have delayed action on climate, the more flexible we have become with our notion of success.

Unfortunately this particular brand of flexibility, in addition to being a bit dark, doesn’t even buy us very much time. If we continue the 2% annual rate of emissions growth the world has seen over the last couple of decades, the difference between a budget with a 66% chance of success and a 50% chance of success is only ~3 years worth of emissions. Between 50% and 33% it’s only about another 2 years. This is well illustrated by some graphics from Shrink That Footprint (they use gigatons of carbon, GtC, rather than CO2 as their unit, so the budget numbers differ by a factor of 3.67, but the time frames and probabilities are the same):

[Figure: carbon budget timelines at different probabilities of staying below 2°C, via Shrink That Footprint]

Like McKibben’s article, this projection is from about 3 years ago. In those 3 years, humanity released about 100Gt of CO2. So, using the same assumptions that went into the 565Gt budget, we would now have only about 465Gt left — enough to take us out to roughly 2030 at the current burn rate.
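
It’s easy to check this kind of arithmetic yourself. Here’s a minimal sketch that just steps forward year by year until a given budget is used up, using the same round numbers as above (roughly 33Gt of CO2 per year today, and 2% annual growth in the business-as-usual case):

```python
# When does a carbon budget run out? Round numbers: ~33 Gt CO2/yr today,
# 2% annual growth in the business-as-usual case.

def exhaustion_year(budget_gt, start_year, rate_gt=33.0, growth=0.0):
    """Step forward one year at a time until cumulative emissions
    exceed the budget, and return the year that happens."""
    year, cumulative = start_year, 0.0
    while cumulative < budget_gt:
        cumulative += rate_gt
        rate_gt *= 1.0 + growth
        year += 1
    return year

print(exhaustion_year(465, 2015))                # ~465 Gt left now -> ~2030
print(exhaustion_year(1000, 2011, growth=0.02))  # 2% growth -> mid-2030s
print(exhaustion_year(1000, 2011))               # flat emissions -> ~2040s
```

The flat-emissions case in the last line is worth noticing; we’ll come back to it in a moment.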

There are various other tweaks that can be made to the budgets in addition to the desired probability of success, outlined here by the Carbon Tracker Initiative.  These details are important, but they don’t change the big picture: continuing the last few decades’ trend in emissions growth will fully commit us to more than 2°C of warming by the 2030s. 2030 might sound like The Future, but it’s not so far away.  It’s about as far in the future as 9/11 is in the past.

It’s encouraging to hear that global CO2 emissions in 2014 were flat relative to 2013, even though the global economy kept growing. But even if that turns out to reflect some kind of structural decoupling between emissions, energy, and our economy (rather than, say, China having a bad economic year), holding emissions constant as we go forward is still far from a path to success. It only stretches our fixed 1000Gt budget into the 2040s, rather than the 2030s.

If we’d started reducing global emissions at 3.5% per year in 2011… we would have had a 50/50 chance of staying below 2°C by the end of the 21st century. If we wait until 2020 to peak global emissions, then the same 50/50 chance of success requires a 6% annual rate of decline.  That’s something we’ve not yet seen in any developed economy, short of a major economic dislocation, like the collapse of the Soviet Union.  And unlike that collapse, which was a fairly transient event, we will need these reductions to continue year after year for decades.

[Figure: required emissions decline rates for different peak years, via Shrink That Footprint]
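
The math behind that figure is worth internalizing. If emissions decline by a constant fraction r each year from some peak rate E, total future emissions form a geometric series summing to roughly E/r. So the decline rate you need is roughly the peak-year emissions rate divided by whatever budget remains at the peak, and every year of delay both raises the peak and shrinks the remainder. Here’s a toy sketch; the 900Gt remaining-budget figure is an assumption chosen so the toy roughly reproduces the 3.5%-to-6% jump cited above, since the published numbers come from fuller scenario modeling:

```python
# Toy model: emissions grow 2%/yr until a peak year, then must decline at
# a constant fraction r per year; total future emissions sum to ~E/r.
# The 900 Gt "remaining budget" is an assumption tuned to echo the
# 3.5% -> 6% jump; real estimates come from full scenario modeling.

def required_decline_rate(budget_gt, peak_year, start_year=2011,
                          rate_gt=31.0, growth=0.02):
    """Grow emissions until peak_year, then return the constant annual
    decline rate needed to stay within the remaining budget."""
    remaining, rate = budget_gt, rate_gt
    for _ in range(peak_year - start_year):
        remaining -= rate
        rate *= 1.0 + growth
    return rate / remaining  # geometric-series approximation

for peak in (2011, 2015, 2020, 2025):
    r = required_decline_rate(900.0, peak)
    print(f"peak in {peak}: ~{100 * r:.1f}%/yr decline needed")
```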

The Years of Living Dangerously

We live in a special time for the 2°C target.  We are in a transition period that started in about 2010 and, barring drastic change, will end around 2030.  In 2010, the 2°C target was clearly physically possible, but the continuation of our current behavior and recent trends will render it physically unattainable within 15 years.  Over the course of these 20 or so years, our probability of success will steadily decline, and the speed of change required to succeed will steadily increase.

I’m not saying “We have until 2030 to fix the problem.”  What I’m saying is closer to “We need to be done fixing the problem by 2030.”  The choice of the 2°C goal is political, but the physics of attaining it is not.

My next post looks at carbon budgets at a much smaller scale (the city or the individual), since global numbers are too big and overwhelming for most of us to grasp in a personal, visceral way.  How much carbon do you get to release over your lifetime if we’re to stay within the 1000Gt budget?  How much do you release today?  What does it go toward?  Flying? Driving? Electricity? Food?  How much do these things vary across different cities?

Featured image courtesy of user quakquak via Flickr, used under a Creative Commons Attribution License.

Decoupling & Demand Side Management in Colorado

Utility revenue decoupling is often seen as an enabling policy supporting “demand side management” (DSM) programs.  DSM is a catch-all term for the things you can do behind the meter to reduce the amount of energy (kWh) a utility needs to produce or the amount of capacity (kW) it needs to have available.  DSM includes investments improving the energy efficiency of buildings and their heating and cooling systems, lighting, and appliances.  It can also include “demand response” (DR), which is a dispatchable reduction in energy consumption (like the ability of a utility to ask every Walmart in New England to turn down their lights or air conditioning at the same time, on a moment’s notice) in order to avoid building seldom-used peaking power plants.

For reasons that will be obvious if you’ve read our previous posts on revenue decoupling, getting utilities to invest in these kinds of measures is challenging so long as their revenues are directly tied to the amount of electricity they sell.  Revenue decoupling can fix that problem.  However, reducing customer demand for energy on a larger scale, especially during times of peak demand, can seriously detract from the utility’s ability to deploy capital (on which they earn a return) for the construction of additional generating capacity.  That conflict of interest is harder to address.

But it’s worth working on, because as we’ll see below, DSM is cheap and very low risk.  It’s great for ratepayers, and it’s great for the economy as a whole.  It can reduce our economic sensitivity to volatile fuel prices, and it often shifts investment away from low-value, environmentally damaging commodities like natural gas and coal, toward skilled labor and high-performance building systems and industrial components.

The rest of this post is based on the testimony that Clean Energy Action prepared for Xcel Energy’s 14AL-0660E rate case proceeding, before revenue decoupling was split off.  Much of it applies specifically to Xcel in Colorado.  However, the overall issues are applicable in many traditionally regulated, vertically integrated monopoly utility settings.

Why can’t we scale up DSM?

There are several barriers to Xcel profitably and cost-effectively scaling up their current DSM programs.  Removing these impediments is necessary if DSM is to realize its full potential for reducing GHG emissions from Colorado’s electricity sector.  Revenue decoupling can address some, but not all, of them.

  1. There are the lost revenues from energy saved, which impact the utility’s fixed cost recovery.  If the incentive payment the utility earns by meeting DSM targets is too small to compensate for those lost revenues, then the net financial impact of investing in DSM is still negative; i.e. the utility will see investing in DSM as a losing proposition.  Xcel currently gets a “disincentive offset” to make up for lost revenues, but they say that it doesn’t entirely offset those losses.
  2. Even if the performance incentive is big enough to make DSM an attractive investment, the PUC currently caps the incentive at $30M per year (including the $5M “disincentive offset”), meaning that even if there’s a larger pool of cost-effective energy efficiency measures to invest in, the utility has no reason to go above and beyond and save more energy once they’ve maxed out the incentive.
  3. If this cap were removed, the utility would still have a finite approved DSM budget.  With an unlimited performance incentive and a finite DSM budget, the utility would have an incentive to buy as much efficiency as possible, within their approved budget, which would encourage cost-effectiveness, but wouldn’t necessarily mean all the available cost-effective DSM was being acquired.
  4. Given that the utility has an annual obligation under the current DSM legislation to save a particular amount of energy (400 GWh), they have an incentive to “bank” some opportunities, and save them for later, lest they make it more difficult for themselves to satisfy their regulatory mandate in later years by buying all the easy stuff up front.
  5. It is of course possible that beyond a certain point there simply aren’t any more scalable, cost-effective efficiency investments to be made.
  6. Finally and most seriously, declining electricity demand would pose a threat to the “used and useful” status of existing generation assets and to the utility’s future capital investment program, which is how they make basically all of their money right now.

Revenue decoupling can play an important role in overcoming some, but not all, of these limitations.  With decoupling in place, we’d expect that the utility would be willing and able to earn the entire $30M performance incentive (which they have yet to do in any year) so long as it didn’t make regulatory compliance in future years more challenging by prematurely exhausting some of the easy DSM opportunities.
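
To see how the first two barriers interact with the cap, consider a stylized example: the utility loses some fixed-cost margin for every GWh saved and earns an incentive for every GWh saved, but the incentive is capped at $30M while the lost margin is not. The per-GWh dollar figures below are invented for illustration; only the $30M cap and the $5M disincentive offset come from the actual Colorado framework discussed above:

```python
# Stylized DSM economics: the incentive is capped, the lost margin is not.
# The $30M cap (including the $5M disincentive offset) is real; the
# per-GWh dollar figures are invented round numbers.

INCENTIVE_CAP = 30e6        # $/yr, total performance incentive cap
INCENTIVE_PER_GWH = 50e3    # $ earned per GWh saved (assumed)
LOST_MARGIN_PER_GWH = 40e3  # $ of fixed-cost margin lost per GWh (assumed)

def net_financial_impact(gwh_saved):
    incentive = min(INCENTIVE_PER_GWH * gwh_saved, INCENTIVE_CAP)
    lost_margin = LOST_MARGIN_PER_GWH * gwh_saved  # keeps growing, uncapped
    return incentive - lost_margin

for gwh in (200, 400, 600, 800, 1000):
    print(f"{gwh:5.0f} GWh saved -> ${net_financial_impact(gwh)/1e6:+5.1f}M net")
```

Past the point where the incentive maxes out (600 GWh in this toy), every additional GWh saved is a pure loss for shareholders, no matter how cheap the efficiency is for society.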

Continue reading Decoupling & Demand Side Management in Colorado

Rêve: Dreaming of a Human City

A couple of weeks ago a large development dubbed Rêve (“dream” in French) became the first project to get called up by Boulder’s City Council at concept plan review (see the concept book for the project here).  Rêve would occupy a 6.7 acre site on the southeast corner of Pearl Parkway and 30th St., just to the west of the Solana apartments.  Much of it would extend south beyond the boundaries of the Boulder Junction area.  I offered some comments to City Council on the project, as someone who would like to see more human-scale, rather than auto-oriented, development in Boulder.  If we’re going to be able to do that anywhere, it seems like it ought to be Boulder Junction (formerly the Transit Village).  Once we get the BRT up and running, it should be highly transit accessible.  It’s surrounded by regional employment centers — the expanding east CU campus to the south, the new Googleplex to the east, and who knows what else eventually as the area builds out… or rather, builds in.  Also, despite being part of “east” Boulder, Boulder Junction is really quite centrally located within the city as a whole.  As I wrote recently both here and in the Daily Camera, I think that if it’s done with a particular focus on the human scale, and with less accommodation than we’re used to for automobiles, development in the area need not have substantial direct impacts on existing residential neighborhoods in the city, in terms of parking spillover, traffic congestion, and viewsheds.

I’m not opposed to the overall intensity of the development. In fact, I think it could be much better for people on the ground with a higher FAR (floor area ratio).  Improving the project at the current or higher intensity hinges on doing a better job of curating and cultivating the spaces between the buildings, turning them into great outdoor rooms and corridors, and wholeheartedly turning them over to human beings.  This is just a matter of focusing on traditional (like, thousands of years old) urban design.

Continue reading Rêve: Dreaming of a Human City

Murder Machines

Murder Machines: Why Cars Will Kill 30,000 Americans This Year. A good essay-length look at how social norms regarding streets and safety have changed over the last century, and why our current norms and design guidelines lead very predictably to tens of thousands of preventable deaths each year.  It covers a lot of the same territory as Peter D. Norton’s excellent book Fighting Traffic, which gives a detailed historical account of the transition, between about 1915 and 1930, from streets being universally accessible public space to being nearly the sole domain of motorized transportation.  Ralph Nader effectively spearheaded a campaign for safety measures that protect those inside these deadly vehicles.  We need just as powerful a champion for those outside them, who make up about a third of all motor vehicle casualties in the US.  Streets don’t have to be designed to kill people.  Giving up a little bit of convenience for motorists frees up a lot of space and safety for everyone else.

Cities Without Traffic

[Image: vintage traffic congestion]
Ditch the vintage 1962 vision of Autopia. Cars are not people. The freedom to drive everywhere is “not quality of life”. We can have cities without traffic.

It’s an underlying axiom, a chanted mantra, a litany:

More people means more cars.
More cars means more traffic.
More traffic means more congestion.
We hate congestion, ergo:
NO MORE PEOPLE.

The litany was recently recited by John D. English in his Daily Camera guest opinion, imploring Boulder to “preserve our quality of life” by protecting the right of motorists to drive in the city without encountering traffic congestion.  But cars are not inextricably linked to people, and the freedom to drive everywhere is not quality of life.  Equating these things stalls infill development in the name of auto dependence, and keeps half the city trapped in late-20th century office park purgatory.  It preserves not quality of life, but underused asphalt oceans, impenetrable superblocks, and sad bike lanes painted on the side of roads that might as well be freeways.

The assumption that more people must inevitably mean more cars means different things to different people.  To the member of traditional Motordom with an interest in infill development, it means we need to build more regional road capacity (induced demand be damned!).  To auto-dependent neighborhood activists who cannot stomach the thought of Change in Our Fair Town, it means infill is unacceptable.

We can have more people, fewer cars, and less driving.  Other cities have already done it, and we’ve implicitly stated it as a goal in our Transportation Master Plan (TMP) and Climate Commitment.  The key to success is dramatically revising Boulder’s parking policies, and creating great streets for people.

Continue reading Cities Without Traffic

As The Future Melts Away

I’ve always been a sucker for a good time lapse.  This one strikes me as a time lapse within a time lapse.  It’s half a day, compressed into less than 5 minutes, with people flitting around like moths, posing for pictures with an ice sculpture of the future.  Only the time lapse eyes of the camera can see what’s happening.  And by the end, the passersby probably can’t even tell what the message might have been.  But the art is a piece of time lapse too.  A century or a millennium compressed into a day of melting.  Even that is a stretch for our attention span.  Even the 5 minute video seems long and slow.  How can we create a society with a more meditative mindset?  With an attention span that reflects the extent of our impacts in deep time?


Less Than Revolutionary Finance

I’ve gotten some good-natured pushback on the idea of buying oneself out of corporate servitude.  The objection seems to come in two general forms.

  1. Contingency of Financial Autonomy: Deriving financial autonomy from investments in corporations whose operations are fundamentally destructive creates a morally corrosive dependency — your interests end up being aligned with theirs, because your autonomy depends on them remaining profitable.
  2. Opportunity Costs: Even if investing in corporations doesn’t actually give them financial support, there’s an opportunity cost: the same money could be used to invest in small local businesses or social enterprises.  Wouldn’t that be more powerful and potentially transformational?

Continue reading Less Than Revolutionary Finance

Buy Yourself Out

Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need, to impress people we don’t like.
— Tyler Durden (Fight Club)

A couple of weeks ago I ran a workshop on retirement investing for some other co-op folks.  I’ve run this workshop before, but lately I’ve been thinking about it differently.  Turns out calling it “retirement” investing can be a turn-off when you’re talking to a bunch of mission driven people who are working on things they love, and think they’ll never want to “retire.” The word can have a connotation of hedonism or idleness.  The permanent worthless vacation.  Or just sitting around waiting to die.  “Early retirement” serves no purpose when the work you do is done primarily because you believe in it.  There’s also a sense with “retirement investing” that you can’t touch the money until you’re old.  Which is a long-ass time if you’re in your early twenties.

So I’ve started thinking about it as “autonomy investing” instead — becoming financially autonomous quickly, so that you can do the work you’re compelled to do. Without having to worry about whether your political activism will put your job at risk.  Without caring if your mission is compatible with the Nonprofit Industrial Complex and their funding metrics.  Without having to work a soul-sucking day job that leaves you too fried to spend your evenings and weekends on civic engagement and organizing. Or alternatively… without having to beg investors to pay your living expenses while you work on the early stages of your startup idea.

This is, essentially, the project of buying yourself out of corporate servitude.
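
The arithmetic behind buying yourself out is surprisingly simple, and it’s dominated by your savings rate, not your income. Here’s a minimal sketch; the 5% real return and 4% safe withdrawal rate are conventional round assumptions, not a recommendation:

```python
# Sketch of "autonomy investing" arithmetic: how long until your portfolio
# can cover your living costs? Assumes a 5% real return and a 4% safe
# withdrawal rate; both are conventional round numbers, not advice.

REAL_RETURN = 0.05
WITHDRAWAL_RATE = 0.04

def years_to_autonomy(savings_rate):
    """Years of saving `savings_rate` of income until the portfolio
    can sustainably replace your spending."""
    spend = 1.0 - savings_rate        # spending, as a fraction of income
    target = spend / WITHDRAWAL_RATE  # nest egg needed, in years of income
    balance, years = 0.0, 0
    while balance < target:
        balance = balance * (1 + REAL_RETURN) + savings_rate
        years += 1
    return years

for rate in (0.1, 0.25, 0.5, 0.75):
    print(f"save {rate:.0%} of income -> autonomous in ~{years_to_autonomy(rate)} years")
```

Save half your income and you’re out in well under two decades; save 10% and it’s a full working lifetime.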

Continue reading Buy Yourself Out

Decoupling and Distributed Energy

One of the main reasons utilities fight distributed generation like rooftop solar is that it erodes demand for their centrally generated electricity. Reduced demand is annoying for any business, but it’s especially bad for traditional monopoly utilities, because much (even most) of the cost of producing a kWh of electricity doesn’t go away if you don’t produce that kWh. These so-called “fixed” or “non-production” costs come from multi-decade financial commitments to big pieces of infrastructure: the power plants, transmission lines, and distribution systems.

So when you put solar panels on your roof and reduce the amount of electricity you need to buy from the utility, a little bit of fuel doesn’t get burned, and a little bit of money is saved on the utility side (though as we’ve pointed out before, they don’t actually benefit from that cost savings). But most of the money the utility spent to be able to provide you with electricity, if you needed it, is already committed. This is problematic because most electricity rates are designed to recover utility costs in proportion to the amount of electricity you buy (this type of rate is known as a “volumetric rate”). So utilities have an incentive (known as the throughput incentive) to ensure that their electricity sales increase, or at the very least don’t decline.

If lots of people start buying much less electricity, utility spending on things like fuel declines, but in the short term there’s no effect on the fixed or non-production costs. To stay solvent, the utilities then go back to their regulators and say “Hey, we’re not getting enough revenue to cover our costs. Give us a rate hike!” If the regulators agree, allowing the utilities to recover the same fixed costs from fewer overall kWh of electricity sold, the resulting higher rates make it even more financially sensible for people to put solar panels on their roofs, to avoid buying the more expensive electricity.  (And in our fantasy world, one could also imagine savvy regulators taking measures to decrease fixed costs, by forcing early retirement of risky, uneconomic fossil generation…)

This is the essence of the Utility Death Spiral that’s gotten so much attention over the last year or two (including a speakeasy we hosted), and which Dave Roberts did a great job of exploring in his Utilities for Dummies series over at Grist. From the utility’s point of view, the Death Spiral can be short-circuited with revenue decoupling… up to a point. With decoupling, they don’t have to go to regulators and ask for a rate hike; they can recover their fixed costs in a formulaic way. Decoupled utilities are thus able to invest in energy efficiency without worrying about lost revenues, and they’re also likely to be less opposed to modest amounts of distributed generation.
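
If you like, you can watch the spiral turn in a toy model: fixed costs get spread over shrinking sales, rates rise, and higher rates push more customers toward self-generation. Every number here is invented for illustration; real utilities and regulators behave less mechanically:

```python
# Toy death spiral: fixed costs recovered volumetrically, and each rate
# increase pushes more customers toward self-generation. All numbers are
# invented for illustration.

FIXED_COSTS = 600e6      # $/yr of fixed (non-production) costs
VARIABLE_COST = 0.03     # $/kWh of fuel and other production costs
BASE_DEFECTION = 0.02    # fraction of sales lost per year anyway (assumed)
EXTRA_DEFECTION = 0.015  # extra fraction lost per 1% rate increase (assumed)

sales = 10e9             # kWh/yr of utility sales
rate = FIXED_COSTS / sales + VARIABLE_COST

for year in range(1, 11):
    new_rate = FIXED_COSTS / sales + VARIABLE_COST  # regulator trues up
    pct_hike = (new_rate - rate) / rate * 100
    sales *= 1 - BASE_DEFECTION - EXTRA_DEFECTION * max(pct_hike, 0.0)
    rate = new_rate
    print(f"year {year:2d}: {rate * 100:5.2f} cents/kWh, {sales / 1e9:5.2f} TWh sold")
```

With decoupling, the fixed-cost recovery happens by formula instead of through contentious rate cases, but notice that the underlying feedback (higher rates encouraging more defection) doesn’t go away.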

In fact, it’s hard to imagine a climate-aware utility of the future that isn’t decoupled.  We need to get away from utilities treating electricity (and energy more generally) as a commodity, with profits tied to the quantity of product they sell.  Instead, we need to move toward treating energy as a service — Amory Lovins’ famous hot showers and cold beer — with an incentive to provide high quality service using the least possible amount of underlying energy.

Decoupling is a Good Thing™

However, if you care about climate, then you always have to ask not just Is this a good thing? but Is this good enough?  It’s an old cliché that “better is the enemy of good enough”: spending time, money, and effort on improvement beyond what’s good enough can be wasteful.  But in the context of climate, we have the opposite problem.  Moving things in the right direction can still mean abject failure.  Plenty of things that are better than the status quo (like decoupling utility revenues, or burning natural gas instead of coal) come nowhere close to being good enough to keep us from seeing more than 2°C of warming.

To have a chance of stabilizing the climate, the utility business model can’t just be tinkered with.  It needs to be radically transformed.  The good news is that radical transformation is probably on the table whether the utilities want to talk about it or not.  Our task is to make it happen as quickly and smoothly as possible.

Courtesy of Gigasolar on Flickr.

Utility Death Spiral: Not Just for the Paranoid

Until very recently, anybody afraid of the death spiral dynamic might have seemed a little paranoid. Distributed generation (DG) was still pretty expensive, and often dependent on utility rebate programs, tax credits, and other incentives largely controlled by regulators and utilities.  As the price of distributed solar has fallen, rebates have dwindled to nothing, and new financing mechanisms and business models have emerged. Utilities and regulators have lost some of their ability to moderate deployment, and they’re poised to lose much more.

A few examples of new DG financing and business arrangements compiled by Green Tech Media and others:

  • Mosaic has created a peer-to-peer lending platform that lets individuals invest in diversified portfolios of smaller distributed solar projects, earning around a 5% return on their investments. They’ve done about $10M worth of financing this way. Now they’re getting into solar loans with backing from a large international re-insurer, adding another $100M in capital.
  • Sungage just raised $100M in funding from a large northeastern US credit union to use as a revolving solar loan fund.
  • SolarCity has started issuing solar bonds with a similar yield directly to the public on a much larger scale. They’ve raised more than $100M so far, without going through the traditional finance industry.
  • Big-time sprawling suburban homebuilder Lennar is now installing rooftop PV systems by default in some markets, including around Denver. They’re offering home buyers a power purchase agreement (PPA) in which they get a 20% discount off of retail electricity rates for 20 years.

From the consumer’s point of view, what this means is that in an increasing number of markets, rooftop solar can now be had at a discount to utility power, with no up-front costs. This is new and different and scary for utilities, because it means rooftop solar can go big. Fast.  Additionally, Elon Musk (who heads both electric car maker Tesla Motors and SolarCity…) is investing $5 billion (with a B) in a massive lithium-ion battery factory in Nevada, hoping to drive costs down through economies of scale.

Suddenly, a good chunk of the traditional utility customer base starts to look a little sketchy.

[Image: frozen electric meters]

Net Metering Required (For Now)

Many of these disruptive businesses depend on net metering policies, and so utilities, including Xcel, have coordinated with the climate-denying corporate octopus that is the American Legislative Exchange Council to try to repeal them. So far net metering has been pretty durable. The policy is easy to understand and seems fair to most of the public, so it’s popular. Net metering also now has its own relatively well funded corporate advocates in the form of Big Solar: the very same companies raising hundreds of millions of dollars, listed above, represented by The Alliance for Solar Choice (TASC), one of the intervenors in 14AL-0660E (which is the PUC’s catchy name for this whole rate case thing we’ve been involved in).

In Colorado (and elsewhere) these dynamics have brought us to a regulatory stalemate. For once the status quo — net metering — favors distributed renewable electricity. It’s the policy that Big Solar has bet the farm on. But if we try to use it to scale up cheap rooftop PV dramatically, it may destabilize the utilities.

Straight net metering also won’t result in an optimal deployment of distributed energy resources, because it accounts only for energy production, and there are many more subtle qualities that matter to a well-functioning electricity grid. If we can integrate those other qualities — temporal, geographic, environmental, price stabilization, etc. — into our electricity pricing, we’ll get a much better overall outcome. As the Rocky Mountain Institute has put it: the debate over net metering misses the point.

Be that as it may, right now there are two 800lb gorillas (or maybe an 800lb gorilla and a 300lb gorilla) locked in mortal combat: the utilities on one side and Big Solar on the other. One side is trying to get rid of net metering altogether, and the other is willing to fight to the death to preserve it. When people bring up other ways of valuing distributed renewable energy, like Minnesota’s proposed Value of Solar or feed-in tariffs, they tend to be either ignored or attacked, sometimes by both sides of the fight! For example, The Alliance for Solar Choice wasted no time in setting up a campaign to stop what they glibly re-termed “Feed in Taxes” and “Value of Solar Taxes” as soon as Minnesota made it clear it was considering Value of Solar seriously.

Headed for Strange Country

As with so many aspects of climate and energy policy, change here is inevitable. Regardless of which side prevails in the fight over net metering, as the costs of distributed solar and energy storage continue to decline, we are headed for strange territory.

If the utilities prevail and repeal net metering, they’ll probably slow the spread of distributed generation, since customers would only be able to benefit economically from satisfying their electricity demand on-site in real time, rather than banking electricity production annually. But in the longer term, given ongoing PV system cost declines and the potential for cost-effective electricity storage, the utilities will face a decline in electricity demand regardless of whether a policy like net metering remains in place. At one extreme we could end up in a situation (well described by RMI) where defection from the grid is economically sensible for a significant number of people.

On the other hand, if Big Solar prevails, then we get to the same place, maybe a little quicker, since they’re already operating with a net metering based business model at significant scale. If the Feds don’t renew the Investment Tax Credit in 2016, that will push the economics out a little, but there’s little reason to think the overall price trend is going to reverse. Ever.

Does that sound ridiculous? Then note that PV in 2014 is already 59% cheaper than NREL predicted it would be back in 2010, and Deutsche Bank is forecasting that solar will reach grid parity nationwide by the end of 2016. On the wholesale side, the New York Times reports that without subsidies, wind on the high plains has come in as low as 3.7¢/kWh (the same as just the production costs of Xcel’s Colorado fossil fleet in 2013).

Some folks think widespread grid defection sounds like utopian energy independence. In practice it would be far less equitable, more expensive, and operationally much less robust than a well designed network that integrates a lot of distributed energy. It’s also physically impossible in cities, which consume most of our electricity: no matter how cheap solar and storage become, cities use more energy within their boundaries than is available from renewable sources within those same boundaries.  And that’s despite the fact that cities have much lower per capita energy use than rural and suburban places of comparable wealth. Cities are great for the climate, but they will always need to import energy, and that means we will still need transmission and distribution systems.

Um, okay. But, decoupling?

In the near term, revenue decoupling would insulate Xcel against the sales they’re going to lose to rooftop solar and other distributed energy. Rather than seeing revenues decline as more electricity sales are displaced, they’d be empowered to adjust rates in a formulaic way to compensate for the losses, and ensure that the fixed costs of the grid continue to be paid for (along with their profits). In theory, this ought to remove or at least reduce their opposition to net metering.
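
Mechanically, the “formulaic” adjustment is just a true-up rider: the regulator fixes an allowed revenue, and if actual sales come in low, the shortfall gets spread over the next period’s expected sales. A minimal sketch, with illustrative numbers rather than anything from Xcel’s actual proposal:

```python
# Minimal sketch of a revenue decoupling true-up: the utility collects an
# allowed revenue regardless of sales. Names and numbers are illustrative,
# not Xcel's actual mechanism.

ALLOWED_REVENUE = 900e6  # $/yr set by the regulator

def decoupled_rate(base_rate, actual_sales_kwh, forecast_sales_kwh):
    """Next period's rate, adjusted so under- or over-collection is trued up."""
    collected = base_rate * actual_sales_kwh
    shortfall = ALLOWED_REVENUE - collected  # positive if sales came in low
    rider = shortfall / forecast_sales_kwh   # spread over expected sales
    return base_rate + rider

base = 0.09     # $/kWh, recovers the allowed revenue at 10 TWh of sales
actual = 9.5e9  # kWh actually sold (rooftop solar ate the rest)
print(f"trued-up rate: {decoupled_rate(base, actual, 9.5e9) * 100:.2f} cents/kWh")
```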

In the long term, if grid defection becomes attractive, additional fixed-cost recovery mechanisms like revenue decoupling aren’t going to be much help to the utility.

Our task is to open up the discussion about creating an intelligent grid with electricity prices that reflect the more subtle attributes of distributed generation. Revenue decoupling is one potential avenue into that discussion — at least the early part of it.  How so?

In the short term, the utilities are fighting for the status quo, minus net metering, and they seem to be losing.  If the only two positions available are the status quo with vs. without net metering, the choice for renewable energy and climate advocates is clear — we have to side with Big Solar.  But if utilities were actually up for creating a different — and much more scalable — renewable energy policy, then the decision of who to work with becomes more challenging.

With revenue decoupling in place, utilities like Xcel could have more room to consider policies that support distributed generation, without seeing them as an axiomatic threat to their revenues.  But to do so, they’d have to be willing to talk about unwinding their existing investments in fossil generation — otherwise, no renewable or distributed generation policy can scale up far enough to be “good enough” for the climate.  That vital discussion about unwinding fossil plants is not yet happening out in the open.  At least, not in the US.  We’ll take a much closer look at it in a post very soon!

A Decoupling Update

So, it’s been quite a while since our last long policy post, which focused on utility revenue decoupling in connection with Xcel’s current rate case (14AL-0660E) before the Colorado PUC.  That’s because we’ve been busy actually intervening in the case!

A Climate Intervention

We filed our motion to intervene in early August.  As you might already know, in order to be granted leave to intervene, you have to demonstrate that your interests aren’t already adequately represented by the other parties in the case.  Incredibly, CEA’s main interest — ensuring that Colorado’s electricity system is consistent with stabilizing the Earth’s climate — was not explicitly mentioned by any of the other parties!

In our petition we highlighted our mission:

…to educate the public and support a shift in public policy toward a zero carbon economy.  CEA brings a unique perspective on the economics of utility regulation and business models related to mitigating the large and growing risks associated with anthropogenic climate change.  In addition, CEA has an interest in transitioning away from fuel-based electric generation in order to mitigate the purely economic risk associated with inherently unpredictable future fuel costs.

…and we were granted intervention.  So far as we know, this is the first time that concern over climate change has been used as the primary interest justifying intervention at the PUC in Colorado.  In and of itself, this is a win.

A Long and Winding Road

Throughout the late summer, we spent many hours poring over the thousands of pages of direct testimony: especially Xcel’s decoupling proposal, but also (with the help of some awesome interns) the details of the company’s as-yet undepreciated generation facilities, trying to figure out how much the system might be worth, and so how much it might cost to just buy it out and shut it down (were we, as a society, so inclined).

Early on in the process, the PUC asked all the parties to submit briefs explaining why we thought it was appropriate to consider decoupling in the rate case, whether it represented a collateral attack on decisions that had already been made in the DSM strategic issues docket, and how it would interact with the existing DSM programs.  We pulled together a response, as did the other intervening parties, and kept working on our answer testimony, a much longer response to Xcel’s overall proposal.  The general consensus among the parties that filed briefs, including CEA, SWEEP, WRA, and The Alliance for Solar Choice (TASC, a solar industry group representing big installers like SolarCity), was that decoupling was not an attempt to roll back previous PUC decisions related to DSM, and that addressing it in a rate case was appropriate.  Only the Colorado Healthcare Electric Coordinating Council (CHECC, a coalition of large healthcare facilities and energy consumers) told the PUC that decoupling ought to be considered an attack on previous DSM policies.

The PUC staff unfortunately came back with a reply brief that disagreed and suggested, among other things, that maybe it would be better if we just went with a straight fixed/variable rate design to address utility fixed cost recovery.  Never mind the fact that this kind of rate would destroy most of the incentives customers have to use energy efficiently.
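
It’s easy to see why. Under a straight fixed/variable design, most of the bill moves into a flat monthly charge, so the per-kWh price, and with it the reward for using less, shrinks dramatically. A quick sketch comparing two rate designs that collect the same revenue from an average customer (all numbers invented):

```python
# Why straight fixed/variable (SFV) rate design guts efficiency incentives:
# the same revenue, recovered two ways, for a customer who cuts usage 20%.
# All numbers are illustrative.

USAGE = 700.0  # kWh/month for an average customer

def bill(usage, fixed_charge, volumetric_rate):
    return fixed_charge + volumetric_rate * usage

# Both designs collect $77/month from the average 700 kWh customer:
volumetric = dict(fixed_charge=7.0, volumetric_rate=0.10)  # mostly per-kWh
sfv = dict(fixed_charge=49.0, volumetric_rate=0.04)        # mostly flat fee

for name, design in (("volumetric", volumetric), ("SFV", sfv)):
    before = bill(USAGE, **design)
    after = bill(USAGE * 0.8, **design)
    print(f"{name}: cut usage 20% -> save ${before - after:.2f}/month")
```

Same revenue for the utility, but the customer’s reward for cutting usage 20% drops by more than half.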

And then we waited.

With bated breath each Wednesday morning, we tuned in to the Commissioners’ Weekly Meeting, streaming live over the interwebs from the Windowless Room in Denver.  We watched regardless of whether anything related to our dear little 14AL-0660E was on the agenda.  Just in case they tried to sneak it by.  Weeks passed.  And then a month.  The deadline for submitting our answer testimony approached.

Finally on October 29th, six weeks after we submitted our brief, the commissioners brought up the issue of decoupling at their weekly meeting, and in a couple of minutes indicated that they’d be severing it from the proceeding, with little explanation as to why.  However, because there were no details, and an order isn’t official until it’s issued in writing… we continued working on our answer testimony.  The final order came out on November 5th, and prohibited submission of testimony related to decoupling.  Answer testimony was due on November 7th.

Where to From Here?

Xcel might come back to the PUC with another decoupling proposal before the next Electric Resource Plan (in fall of 2015).  Or they might not.  Either way, a good chunk of the work that we’ve been doing since this summer will have to come to light in a different way.  So for the next few posts, we’re going to explore some of the issues that came up in the preparation of our answer testimony, including:

  • Decoupling and Distributed Energy:
    How would decoupling interact with distributed energy resources like rooftop solar?  What are the implications for utilities as the costs of those resources continue their precipitous decline?
  • Decoupling and Demand Side Management:
    How would revenue decoupling interact with demand side management programs in general — both utility and privately or locally funded — and what particular issues with Xcel’s DSM programs could decoupling address?  What issues can’t it help address?
  • Can Revenue Decoupling Scale?
    Why doesn’t revenue decoupling as a policy really scale up to the point of taking existing generation facilities offline, or preventing new facilities from being built?
  • Decoupling as a First Step:
    Even if it can’t scale, why might decoupling still serve as a useful starting point for the decarbonization process? Can it give us a little bit of breathing room while we start the real negotiation? Or is it just another layer of financial protection for utilities who want to delay change as long as possible?
  • Realism and Equity in Carbon Budgets for Colorado:
    What is the true scope of the decarbonization challenge, in the context of the carbon budgets recently published by the IPCC in their Fifth Assessment Report (AR5), localized to Colorado so we can actually wrap our heads around it?  And why is this conversation so hard?

Learn more about utility revenue decoupling on our resource page…

Featured image of binders (full of PUC filings…) courtesy of  Christian Schnettelker on Flickr. Used under a Creative Commons Attribution License.