Tuesday, September 20, 2005

The intelligent grid is alive and well... but maturing very slowly

Matt Marshall, of the San Jose Mercury News and SiliconBeat, asks the question (directly of us, it must be pointed out), "is [the intelligent grid] really just politically dead in the water?"

In this post and a few to follow over the next couple of days, I hope to share with readers my viewpoint that the intelligent grid is indeed alive, and that there are innovative cleantech VC-backed companies poised to make a big push forward as part of the grid's evolution, with big market growth potential -- but we should also expect it to develop very slowly, and this has significant implications for venture investors as well.

The background: Why things move slowly

In this post, let's just discuss a little background. First of all, what is the intelligent grid? EPRI's IntelliGrid effort provides a good scenario-based overview, and Matt's post discusses a vision for the intelligent grid that places more emphasis on distributed generation. Essentially, when we talk about an intelligent grid, we're talking about an electricity transmission and distribution (and to an extent, generation) system that is "smart" enough to recognize potential problems, communicate such conditions to a central decision-maker (i.e., a computer), and automatically correct for the problems. In the EPRI scenario, a major transmission line outage is detected, corrected for by a number of means, and adapted to until a permanent fix to the transmission line can be arranged.

Many readers may assume that this is the way our electricity grid already works. After all, we expect that when we flick a switch, our lights will come on, and (with rare but sometimes spectacular exceptions) they generally do. We know it's a complex network, therefore it must be controlled by a smart, centralized, automated decision-making system. And computers are everywhere these days, right? But the simple fact is that our electric system wasn't designed for this level of complexity and interconnectedness, it is not very well automated, and it's frankly amazing (and a credit to the transmission engineers out there) that this somewhat archaic network continues to work while being held together by bubble gum and baling wire (metaphorically speaking, of course... well, kind of).

Just to give a small tangible sense of the magnitude of the problem: For many neighborhoods around the U.S., do you know how a utility knows you have a power outage on your block? No, they don't get a signal back at "home base", much less an automated "fixit" message to work crews. Instead, they depend upon getting a telephone call from you, complaining that your lights are out. Then they know to begin looking for a problem.

Oh, and then there's the fact that the system is outdated and starting to fall apart. It's worth repeating Nancy Floyd's quote from Red Herring, also quoted by Matt:
The big infrastructure problem is the aging grid, and the whole automation area. The average age of transformers is 38 years, and their design life is 40. Almost every week transformers explode, causing outages, costing money, even killing people. And how do they find out if a transformer is going to fail? They send someone out to take a sample of oil from the transformer and send it to a lab to get results a week later.
So you can see why many have called for the development of an intelligent electrical grid. As we'll discuss in a later post, today's grid isn't set up for tomorrow's power supply. And it's just a matter of time before another bad blackout happens. And then another after that. A smart grid could help address this.

Matt posits that utilities seem to be resistant to the smart grid concept. But here's the first important thing to know about our electricity network:

Utilities are profit-maximizing, but they don't make money in the way you would expect

How does a for-profit electric utility make money? First of all, you have to understand that they are regulated by a state's public utility commission (PUC). The PUC is either directly elected by the populace, or its members are appointed by politicians -- so the PUC is highly sensitive to political forces. The PUC usually grants the utility a rate of return on assets. Then, they negotiate with the utility to see how many assets there should be. Then, they negotiate with the utility to see what the expenses should be. Finally, they determine a rate "schedule" (i.e., a set of different rates for different types of customers) that allows the utility, with an established set of assets and an agreed-upon estimate of costs, to achieve their rate of return.
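The rate-setting arithmetic described above can be sketched as a toy calculation. This is a deliberately simplified illustration, not how any actual PUC filing works, and every figure in it is hypothetical:

```python
# Simplified, hypothetical sketch of rate-of-return regulation: the PUC
# sets rates so that revenue covers agreed expenses plus an allowed
# return on the approved asset base (the "rate base").

def revenue_requirement(rate_base, allowed_return, expenses):
    """Annual revenue the utility is permitted to collect."""
    return rate_base * allowed_return + expenses

def average_rate(rate_base, allowed_return, expenses, kwh_sold):
    """Average price per kWh implied by the revenue requirement."""
    return revenue_requirement(rate_base, allowed_return, expenses) / kwh_sold

# All figures below are made up for illustration:
req = revenue_requirement(rate_base=2_000_000_000,   # $2B of approved assets
                          allowed_return=0.10,       # 10% allowed return
                          expenses=600_000_000)      # $600M of agreed expenses
print(req)   # roughly $800M: $200M allowed return + $600M expenses
print(average_rate(2_000_000_000, 0.10, 600_000_000,
                   kwh_sold=10_000_000_000))         # roughly $0.08/kWh
```

Note what this structure implies: the utility's earnings scale with its approved asset base and allowed return, not with how cleverly it cuts costs -- which sets up the incentive problems discussed next.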

As you might imagine, both the PUC and the utility have a lot of various factors to consider. The PUC's job is to a) make sure that the lights stay on; but b) keep rates low. This sets up the strong incentive to keep assets and expenses as low as possible, and to make sure that there is very little "risky" activity going on that could lead to a problem. I.e.: "We don't want the utility to experiment with new technology."

At the same time, the utility's profit motive to push costs down as much as possible is somewhat undermined. Once a "rate case" has been established with the PUC, it's usually a couple of years before the next one. So if the utility can lower its expenses during that time, it can pocket some extra profits in the short term. But the next time around, the PUC will simply adjust its expectations of expenses down, and lower the rates accordingly, so any profit to the utility from lowering expenses is short-lived. And those lowered expenses usually mean layoffs -- for instance, if you put in an automated meter reading system so that you don't need people physically walking from meter to meter to read each household's electricity usage, that could mean big savings... because you get rid of the meter readers. Such is the price of progress in a normal industry, but in an industry that is so politically managed, it's not a very popular move. So why invest in a technology to improve reliability or otherwise reduce costs, if it's not going to gain much in terms of long-term earnings, and it's going to make you some enemies? Thus, the utility becomes reluctant to try new technology, even when it would save money.
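To make the clawback dynamic concrete, here's a toy sketch. The numbers are hypothetical, and the assumption that savings vanish entirely at the next rate case is a simplification for illustration:

```python
# Toy illustration of the regulatory clawback: a utility keeps the
# savings from a cost-reducing investment only until the next rate
# case, when the PUC lowers rates to match the lower expenses.

def retained_savings(annual_savings, years_to_next_rate_case, horizon_years):
    """Cumulative savings the utility actually keeps over the horizon.
    After the rate case, rates are adjusted down and the margin vanishes."""
    return annual_savings * min(years_to_next_rate_case, horizon_years)

# Hypothetical: an automated meter reading system saves $5M/year,
# but the next rate case is only 2 years away.
kept = retained_savings(5_000_000, years_to_next_rate_case=2, horizon_years=10)
print(kept)  # 10000000 -- $10M kept, out of $50M of savings created over 10 years
```

Against that truncated $10M the utility weighs the upfront cost of the system plus the political cost of the layoffs -- which is why such investments often don't get made.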

So as you can see, both the utility and the PUC have an incentive to save costs, but mostly only in the very short term, and certainly not to the extent that most commodity industries (hey, electrons are electrons) would experience. At the same time, they're reluctant to invest in new capacity and backup plans unless and until absolutely necessary. There are very real reasons for such a system to be in place, based around the difficult economic concept of the natural monopoly, and this site is not going to address the public, political debate around the design of the rate-setting system. But from the above, you can see that it is an important factor in how things work, and how fast things change.

At the level of the transmission grid (so: larger power lines, not the neighborhood-level distribution network), it's even more complex. There are Regional Transmission Organizations (RTOs), quasi-governmental organizations running their own transmission networks (e.g., the Bonneville Power Administration), energy traders, utilities both for-profit and not-for-profit, Independent System Operators (ISOs), etc., all making investment and usage decisions. You have spot markets for power, but only at critical junctions, and then you have to pay additional rates for the right to transport that power to wherever it's actually needed. And on top of all that, the lines have to maintain a fairly precise balance at all times -- you can't let the load get too high on any single link of the transmission network, or it could go out. And then you get a Northeast Blackout scenario. To get just a sense of the complexity, see how the Western Area Power Administration describes their role... Essentially, our transmission grid is an inherently chaotic, finely-balanced system, which a disparate set of decision-makers are managing (barely) to keep under control.

Which brings us to the second important piece of context:

Engineers at utilities are very slow to adopt new technology

And for good reason. Utilities are supposed to provide power when and where it's required. That's their mandate, and the reason they are granted the ability to operate as local monopolies. Therefore, the best way to screw up a perfectly good 40+ year career as a dedicated utility engineer is to install a system, or otherwise allow a change in the system, that causes the lights to go out.

So put yourself in the shoes of an electricity transmission engineer, and imagine a small startup approaches you about putting an automated decision-making unit out in the field, controlling a critical relay point in response to market signals or some such. Your desktop PC crashes once a week (utilities also don't invest much in IT), the startup doesn't have any other big customers yet, and yet they're asking you to hand over a part of your network to a computer that will be doing things out of your control. No thank you. Similarly, in a topic we'll discuss more in a later post, these engineers are naturally going to be reluctant to enable a bunch of random small generators (e.g., solar panels on houses, fuel cells at Cisco, etc.) to feed power into the grid in ways outside the control of the centralized engineers, no matter how much sense it makes at a high level.

When considered in light of the larger economic incentive problems described above, it's easy to see why a) selling a new technology to utilities is a chicken and egg problem -- they all want to be the 3rd utility to use a product, definitely not the first; and b) even when you do get a utility interested, they're going to perform a pilot project, and then another pilot project, and then another pilot project, and then a limited roll-out, etc. And thus, the sales cycle can easily take years.

Which brings us to the final point to address in this already too-long post:

Everyone admits the system has to change, but no one agrees on exactly how

The obstacle to fixing all of this is not necessarily political feasibility. Yes, PUCs don't want to have to raise rates. But they also recognize that the system is getting to be unsustainable. Governmental groups from the Western Governors' Association to the Federal Energy Regulatory Commission (FERC) to state legislators to the U.S. Congress have all recognized that we need to upgrade our network. There are a lot of competing ideas about how to do this. People are committed to making it happen. There have been some really promising leadership efforts by some ISOs and utilities.

But there is also a lot of regulatory uncertainty. Attempts were made to deregulate the network, in the hope that this would provide better incentives for utilities and operators to make investments. Ask any Californian how that worked out. So now such deregulation efforts have been scaled back. Laws passed by Congress have created new types of quasi-governmental organizations, which are still being formed and developed. In such an uncertain regulatory and political environment, I've worked with utilities who didn't want to make investments in transmission infrastructure simply because they weren't even sure, at the time, that the state wouldn't end up taking it from them. Will utilities be forced to sell their transmission lines to the RTOs? At what price? Again, not a topic for this site to discuss, but readers can get a sense of the somewhat paralyzing uncertainties.

At the same time, there's also a lot of technology uncertainty. Granted, it's easy to say that any improvement is a worthwhile improvement. But think back on Nancy Floyd's powerful quote from above -- these utilities are making 40-year investments in as-yet-unproven technologies. And even if the technology seems proven, will another technology that comes along a year later blow it away? For example, it's relatively easy to design an electricity meter that can communicate with the utility, so that you can get automated meter reading (AMR). But what communication link are you going to use? Cellular technology? The existing phone line to the home? Broadband over powerline? If you choose the wrong one, there could be trouble both with your customers and with the PUC. And then you have to integrate your meter reading systems with your IT infrastructure. And you have to make sure that the system you use to roll out AMR to the first third of your territory today will work with the system you choose when you roll out meter reading technology to the last third in five years.

In such an environment of uncertainty, of course utilities are going to move very slowly. But the political will may be there to force faster changes. For example, in Ontario (the US and Canadian power grids are interconnected and similarly managed, so it's a relevant example) they are requiring smart metering across the entire province by 2010, with smart grid applications in mind. California is moving similarly. And new incentives are being put in place to encourage more investment in transmission capacity.

It's just going to take a long time.

I hope this first post on the topic provides some useful context. In following posts over the next couple of days/weeks (things are very busy, sorry), I'll try to more directly address the topics of distributed generation and cleantech and the smart grid, and also what VC-backed companies are already doing to move the idea of the smart grid forward. But given the question that Matt asked about the political feasibility of the smart grid, I hope the above provides some context for thinking about the question and its implications for investors. This post may appear a bit discouraging -- but there are a lot of exciting things happening in smart grid technology that cleantech investors should be aware of, and we'll discuss some of that. It's not a question of "if", just "when" and "how".

[Update: Here are links to parts two and three of this series of posts]

2 Comments:

Blogger Timothy said...

This is a very thorough treatment, but I think you have to discuss the political mandate of the utilities. That may be something we have to change. Regardless, it's the mandate that's going to drive what technology we choose and how we choose to implement it.

Otherwise I think we may end up with a system that does not address the entire power using community. Which is to say, 99 percent of us.

2:17 PM  
Anonymous James Eades said...

Matt,

Bringing intelligence to the distribution grid is a priority for every country.

Our company Telepathx has been working on this issue, and all of the variables involved, for the last five years, and yes, I can confirm there were several key hurdles that needed to be cleared before anyone would consider adopting new technologies.

At the most basic level: what kinds of sensors need to be developed?

You would think that monitoring power assets would mean having an unrestricted supply of power, but this is not the case: connecting assets together via wires is the quickest way to cause an outage or get electrocuted. Take your pick -- you are either dead or out of business. So the sensors needed to be wireless sensor networks (WSNs) with ultra-long-life DC or self-powered supplies.

Data overload

If you are managing a distribution network of, say, 100,000 utility poles, bringing every asset online (insulators, fuses, transformers, and cables) would mean monitoring 500,000 to 675,000-plus individual assets. The good news is you've got them online; the bad news is you're going to have to hire new staff to manage it all.

Beyond the power and EMI issues, this is one of the main reasons why we abandoned true mesh network technology such as the 802.15.4 ZigBee protocol: too much useless data. Energy managers just want to know when, where, and what is having a problem.

This is why reactive sensors based on RFID are ideal for monitoring the grid: they activate at a known threshold and are completely self-managing.

WSNs to cellular/mobile/satellite networks, or fiber/BPL/PLC, for backhaul

First, adapting any of the above for the single purpose of bringing intelligence to the grid would cost U.S. energy companies alone $500-plus billion just to procure and install. BPL/PLC technology is very expensive and labor-intensive, and the return on investment may never be realized. WSNs connecting to cellular or satellite networks are by far easier to implement, but used for a single purpose they would still be cost-prohibitive.

This was perhaps the biggest hurdle of all, and we have overcome it by developing a business model that will bring intelligence to the distribution grid at virtually no cost to the energy distributors, for the simple reason that we go beyond grid management to monitor other assets and infrastructure.

So bringing intelligence to the grid does not need to be troublesome or expensive; in our case it just provides energy companies with new sources of income.

James Eades
President & CEO
Telepathx Ltd
Melbourne, Australia

8:09 PM  

