Friday, September 30, 2005

CIA's In-Q-Tel invests in Tendril

Following on the heels of the Sensicast funding announcement earlier this week, Tendril Networks announced an investment and business relationship with In-Q-Tel, the CIA's venture finance organization. Financial details weren't released.

Sensing technology has been developing rapidly (greater sensitivity, lower power needs, better automation, etc.), and machine-to-machine communications technology has been advancing alongside it, in terms of hardware, software, and standards development. The Sensicast and Tendril announcements this week aptly illustrate the interest of funders who see opportunities for such startups to accelerate the adoption of these technologies, for applications in environmental monitoring, manufacturing efficiency, energy and water distribution system automation, and other cleantech investment areas.

Tuesday, September 27, 2005

Sensicast, Nysa Membrane, and six months

  • Sensicast, which provides wireless communications for sensor networks for industrial and energy-system applications, announced a $13M round led by Global Environment Fund, along with existing investor Ardesta. Sensicast is in trials with GE and others, and is developing applications aimed specifically at boosting energy efficiency.
  • Burlington, Ontario's Nysa Membrane Technologies announced a C$2M Series A. The company promises "truly disruptive" performance from their filtration and separation technology. Initial applications appear to be largely targeted at medical markets, but membrane technologies are often applicable for cleantech uses as well, and the company is specifically targeting food and beverage separation, which can be a very wasteful process. The round was led by the Golden Horseshoe Life Sciences and Emerging Technologies Venture Fund (GHFV) and the Technology Seed Investments Group of the Business Development Bank of Canada (BDC), with participation from MedInnova Partners Inc. (MPI).
  • Finally, last week marked the six month anniversary of the start of this site. It's certainly been more widely-read than I would have ever expected. According to the (somewhat incomplete and overlapping) data the various trackers provide, there have been almost 30,000 hits on the site over the past six months, or an average of 160 per day. I certainly hope this site has been helpful for cleantech investors and those interested in cleantech venture investing -- please don't hesitate to contact me with your tips, feedback, and suggestions. Thanks for visiting!

Thursday, September 22, 2005

The intelligent grid pt. 3: Transmission capacity

More answers to the questions posed by Matt Marshall, who wrote:

“Locally, [the intelligent grid] is significant because the California Independent System Operator, which oversees most of the state's electricity system, has just approved a $300 million transmission line that brings power into S.F. from Pittsburg under the S.F. Bay. That's a whopping-big line. Sounds like the opposite of the Intelligrid, but maybe we're missing something.”

Looking at the chart of electricity prices mentioned in the last post (again, you have to scroll down on the page to see the chart), it's striking how much prices vary across the country. Why is that? Because of lack of transmission capacity. Hydro power is (once the dam is built) really cheap. Coal power is relatively cheap. Natural gas-fired power is not nearly so cheap. So areas with access to cheap power (e.g., the Pacific Northwest's hydro, or the Midwest's coal-fired plants) tend to have the lowest cost of electricity.

But if our major transmission grid worked efficiently, we would simply move power to where it's needed, from where it's cheapest to produce. And then prices would self-adjust, right? There would still be some slight variation in pricing due to losses during transmission, but those differences would be minor, not the 2x or more variation seen in the chart.
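To put a rough number on the losses point, here's a minimal back-of-envelope sketch in Python (the figures in it are illustrative assumptions, not data from the chart): even fairly generous line losses only nudge the delivered cost up a few percent, nowhere near a 2x regional gap.

# Illustrative only -- assumed source cost and loss rate, not actual market data.
def delivered_cost(source_cost_cents_per_kwh, loss_fraction):
    """Cost per kWh that actually arrives, if a fraction is lost in transit."""
    return source_cost_cents_per_kwh / (1.0 - loss_fraction)

cheap_hydro = 3.0    # cents/kWh at the source (assumed)
line_losses = 0.07   # ~7% losses on a long haul (assumed)

print(round(delivered_cost(cheap_hydro, line_losses), 2))  # ~3.23 cents/kWh -- a minor markup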

There are two macro problems with the current layout of the transmission grid -- there's not enough capacity, and there's not enough connectedness.

With enough capacity, you could ship all of the cheap hydro power from the Northwest down to where it's most needed in California. Some of that does happen (and energy traders make good money off of facilitating that). But just look at the prices -- clearly, it's not done efficiently. Indeed, we don't really have one single American power grid. We have many different grids tied together at "interties" with very limited capacity. Anyone who wants to learn more about the power congestion situation should definitely check out this report from the DOE (pdf).

Besides the lack of capacity, the grid is not as well meshed as one would hope. In an effective mesh system, when one link goes down, the load is easily apportioned out across several other links, and power still gets to where it needs to be without significant strain on the system. But no one started out building a national grid that would resemble such a perfect mesh, so we don't have one. Instead, there are a few critical links out there that, if they go down, take a lot of the system with them. And you get what happened in LA last week, where someone cut the wrong cable and a major city lost power.
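For readers who like to see the concept spelled out, here's a toy sketch of why meshing matters (the five "nodes" and their links are hypothetical, not real grid topology): in a chain-like layout a single cut splits the network, while a couple of extra ties let power route around the failure.

# Toy illustration only -- hypothetical nodes and lines.
from collections import defaultdict, deque

def still_connected(nodes, lines, failed_line):
    """Return True if the grid stays connected after removing one line."""
    graph = defaultdict(set)
    for a, b in lines:
        if (a, b) != failed_line and (b, a) != failed_line:
            graph[a].add(b)
            graph[b].add(a)
    seen, queue = {nodes[0]}, deque([nodes[0]])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen == set(nodes)

nodes = ["NW", "CA", "SW", "Midwest", "East"]
sparse_grid = [("NW", "CA"), ("CA", "SW"), ("SW", "Midwest"), ("Midwest", "East")]  # a chain
meshed_grid = sparse_grid + [("NW", "Midwest"), ("CA", "East")]                     # extra ties

print(still_connected(nodes, sparse_grid, ("CA", "SW")))  # False: one cut splits the grid
print(still_connected(nodes, meshed_grid, ("CA", "SW")))  # True: load can route around it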

We also need the links in the grid to be smarter and more efficient, but that's to be discussed in the next and final post of this series, where we'll describe some of the technology areas where innovative startups are addressing the needs of the intelligent grid, and finding success.

Let's go back to the question implied by Matt from above: Should we spend $300M on transmission capacity, or should we spend it encouraging more distributed renewable generation?

To which the cleantech investor says, “Yes.”

But joking aside, I’ll simply point out that solar resources and wind resources, just like hydro power resources and coal-fired generation plants, are not evenly dispersed across the country. So even if you encourage distributed, renewable energy generation, again you have the situation where power can be generated in one place a lot cheaper than it can be produced elsewhere. And it so happens that the best places for solar (e.g., deserts) and wind (e.g., offshore, and the Great Plains) aren’t really close to consumers. Indeed, some pundits say that the biggest obstacle to even more adoption of wind generation (as if it wasn’t already growing incredibly rapidly) is lack of adequate transmission capacity in locations where wind farms can be best built.

So while there’s a strong case to be made for further building out renewable distributed energy generation capacity, there’s a strong case to be made as well for building out transmission capacity and interconnectedness.

The intelligent grid pt. 2: Distributed generation and the grid

This follows up on the first part of my answer to the questions posed by Matt Marshall, where we discussed why things change slowly at utilities and on the grid.

In his smart column, Matt describes a vision of an electricity transmission and distribution grid that is strengthened by more use of distributed generation sources (hopefully powered by renewables). As he writes:
"Instead of hundreds of power plants, such as giant 1,200 megawatt nuclear power plants, powering our needs, we should have millions of solar powered residences and workplaces -- or at least a power source closer to their destination -- so that a good fraction of the power doesn't get wasted in transportation. And that way, a single error at one plant, or cut in an electrical transmission line, doesn't shut down the power of 2 million people, as it did last month yet again in Southern California. Utilities would have to buy into idea, which arguably is against their interests -- they'd lose control."
So is the lack of an intelligent grid holding back such a vision? The answer is both yes and no.

What's holding back distributed generation?

Yes, there isn't enough use of distributed generation yet. And I would also throw demand reduction into the mix -- if the idea is to reduce our dependence on an outdated transmission and distribution grid, even better than generating power close to demand would be not needing it in the first place, or being able to reduce demand instantly in response to tight supply or problems with the grid.

Not all utilities have bought into the concept yet, it's true. Systems engineers are naturally reluctant to deal with the complications that result when thousands of small generation sources start to interact with an already hard-to-control, finely-balanced grid. And some utilities have taken this view in the past and tried to make it difficult for end users to set up distributed generation systems.

It's important to realize, however, that much of the policy framework is already in place to enable more use of distributed generation. Across the U.S., federal law requires that anyone be able to sell excess power to their utility at no less than wholesale prices (the price a utility pays for power). As this chart shows, in 37 states some form of "net metering" is allowed by law; net metering lets anyone who generates more power than they themselves need sell their excess back onto the grid at retail prices (the price you and I pay).
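To make the retail-vs-wholesale distinction concrete, here's a small sketch with assumed rates and usage figures (none of them from an actual tariff): netting exports against consumption at the retail rate is meaningfully more valuable to the generator than getting a wholesale credit.

# All rates and volumes below are assumptions for illustration.
retail_rate = 0.12      # $/kWh you pay (assumed)
wholesale_rate = 0.05   # $/kWh the utility pays for power (assumed)

consumed = 900          # kWh drawn from the grid this month (assumed)
generated = 400         # kWh exported by your system (assumed)

net_metered_bill = (consumed - generated) * retail_rate
wholesale_credit_bill = consumed * retail_rate - generated * wholesale_rate

print(f"net metering:     ${net_metered_bill:.2f}")        # $60.00
print(f"wholesale credit: ${wholesale_credit_bill:.2f}")   # $88.00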

Now, in a lot of these states there are a lot of exclusions and limits, and some states don't allow net metering at all, so there's a lot of room for further change. But importantly, net metering is already on the books in major markets like California (solar and wind net metering allowed for all customers with generation under 1 MW). Readers who want more detail on the state-by-state policy landscape can consult IssueAlerts from Utilipoint (http://www.utilipoint.com/issuealert/article.asp?id=2569), among other sources. The above should help illuminate that, while there's certainly room for some policy shifts to encourage the intelligent grid, that's not the whole story.

Costs and awareness

What has really held back distributed generation and distributed "demand response" activities are two things: the cost of the energy produced, and a lack of awareness.

Most distributed generation technologies are only now becoming price competitive with retail electricity rates, and only in the higher-priced areas. As the chart on this page shows (scroll down), retail prices vary widely across the U.S. Whatever region you're in, the ability to sell excess power back to the grid certainly helps ease concerns about over-building a system and wasting capacity, but you're really only going to build significant distributed generation capacity to sell back into the grid if the price you'll receive makes it worthwhile. So, for instance, in California you need to be able to make the electricity for a net cost of around 12 cents per kilowatt hour or less (yes, this is an oversimplification, but please allow the bigger point...). As this site shows, renewables are now barely reaching this level:
  • Solar is between 20-40 cents per kWh
  • Microturbines and some fuel cells are between 10-15 cents per kWh
  • Wind is between 5-10 cents per kWh, but with large-scale turbines (that few would park in their back yard)
For much of the country, these costs are still higher than retail electricity prices.
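Here's a quick sketch that simply restates the comparison above in code, using the cost ranges from the list and an assumed ~12 cent retail benchmark for a high-priced market like California.

# Cost ranges from the list above; the 12-cent benchmark is the oversimplified
# California figure mentioned earlier.
retail_benchmark = 12  # cents/kWh (assumed)

generation_cost_ranges = {            # cents/kWh
    "solar PV": (20, 40),
    "microturbines / some fuel cells": (10, 15),
    "large-scale wind": (5, 10),
}

for tech, (low, high) in generation_cost_ranges.items():
    if high <= retail_benchmark:
        verdict = "competitive in a 12-cent market"
    elif low <= retail_benchmark:
        verdict = "competitive only at the low end of the range"
    else:
        verdict = "still above retail"
    print(f"{tech}: {low}-{high} cents/kWh -> {verdict}")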

Furthermore, while net metering policies often allow such distributed generation to be sold back onto the grid at retail rates, if everyone starts doing it, utilities are going to clamor for some relief -- they do, after all, maintain the transmission and distribution grid for everyone else, and if they buy electricity at the retail rate and sell it at the retail rate, it gets pretty tough to pay expenses. This then bogs down in a policy and regulatory debate best left to venues other than this site. But suffice it to say, the price distributed generators can get for their excess power is likely to go down over time if distributed generation rises in usage.

But of course, the cost of power from emerging distributed generation technologies is going down rapidly, too, and in many places policies are in place to subsidize them further. Which is why the market for solar, wind, fuel cells, etc. is growing so quickly.

The other real issue is awareness. Companies don't realize that there are vendors out there now who have developed systems to help them significantly reduce their energy usage at peak-demand times, without them even being able to tell that a change took place. Homeowners simply don't realize that, by taking advantage of subsidies and net metering, they can often save money in the long run by filling their roof with solar panels (e.g., 10 year payback or better, compared to a 20 year system lifetime), or by putting in another type of grid-tied generation system. Or they may feel that it is an aesthetically unappealing proposition. But this lack of awareness, too, is changing rapidly.
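For the curious, the payback parenthetical above is just simple division. Here's a minimal sketch where every input is an assumption chosen for illustration, not a quote for any real system or tariff.

# All inputs are illustrative assumptions.
def simple_payback_years(net_cost, annual_kwh, avoided_rate_per_kwh):
    """Years until avoided electricity purchases repay the net installed cost."""
    return net_cost / (annual_kwh * avoided_rate_per_kwh)

net_cost = 9000          # $ after subsidies (assumed)
annual_kwh = 4500        # kWh generated per year (assumed)
avoided_rate = 0.20      # $/kWh avoided at upper-tier retail rates (assumed)

payback = simple_payback_years(net_cost, annual_kwh, avoided_rate)
print(f"{payback:.0f} year payback vs. ~20 year system lifetime")  # 10 years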

So with policies already somewhat in place, attitudes and awareness changing, and costs coming down rapidly, it's easy to understand why emerging distributed energy technologies are getting a lot of interest from the venture investing community. And why the intelligent grid is coming, albeit slowly.

Tuesday, September 20, 2005

A few quick items

  • Plasco Energy, which has developed plasma arc technology for waste-to-energy and coal treatment applications, announced they raised a C$7.3M round (including C$4.5M from lead funder Killick Capital) to fund facilities in Ottawa and Barcelona. A pdf of the announcement is available here.
  • Today's fun read: This article from the Montreal Gazette about attempts to create add-ons to the internal combustion engine to make it burn more efficiently. Certainly, the internal combustion engine isn't going away anytime soon.
Free registrations required for many of the above links...

The intelligent grid is alive and well... but maturing very slowly

Matt Marshall, of the San Jose Mercury News and SiliconBeat, asks the question (directly of us, it must be pointed out), "is [the intelligent grid] really just politically dead in the water?"

In this post and a few to follow over the next couple of days, I hope to share with readers my viewpoint that the intelligent grid is indeed alive, and that there are innovative cleantech VC-backed companies poised to make a big push forward as part of the grid's evolution, with big market growth potential -- but we should also expect it to develop very slowly, and this has significant implications for venture investors as well.

The background: Why things move slowly

In this post, let's just discuss a little background. First of all, what is the intelligent grid? EPRI's IntelliGrid effort provides a good scenario-based overview, and Matt's post discusses a vision for the intelligent grid that places more emphasis on distributed generation. Essentially, when we talk about an intelligent grid, we're talking about an electricity transmission and distribution (and to an extent, generation) system that is "smart" enough to recognize potential problems, communicate such conditions to a central decision-maker (ie: computer), and automatically correct for the problems. In the EPRI scenario, a major transmission line outage is detected, corrected for by a number of means, and adapted to until a permanent fix to the transmission line can be arranged.

Many readers may assume that this is the way our electricity grid already works. After all, we expect that when we flick a switch, our lights will come on, and (with rare but sometimes spectacular exceptions) they generally do. We know it's a complex network; therefore, it must be controlled by a smart, centralized, automated decision-making system. And computers are everywhere these days, right? But the simple fact is that our electric system wasn't designed for this level of complexity and interconnectedness, it is not very well automated, and it's frankly amazing (and a credit to the transmission engineers out there) that this somewhat archaic network continues to work while being held together by bubble gum and baling wire (metaphorically speaking, of course... well, kind of).

Just to give a small tangible sense of the magnitude of the problem: For many neighborhoods around the U.S., do you know how a utility knows you have a power outage on your block? No, they don't get a signal back at "home base", much less an automated "fixit" message to work crews. Instead, they depend upon getting a telephone call from you, complaining that your lights are out. Then they know to begin looking for a problem.

Oh, and then there's the fact that the system is outdated and starting to fall apart. It's worth repeating Nancy Floyd's quote from Red Herring, also quoted by Matt:
The big infrastructure problem is the aging grid, and the whole automation area. The average age of transformers is 38 years, and their design life is 40. Almost every week transformers explode, causing outages, costing money, even killing people. And how do they find out if a transformer is going to fail? They send someone out to take a sample of oil from the transformer and send it to a lab to get results a week later.
So you can see why many have called for the development of an intelligent electrical grid. As we'll discuss in a later post, today's grid isn't set up for tomorrow's power supply. And it's just a matter of time before another bad blackout happens. And then another after that. A smart grid could help address this.

Matt posits that utilities seem to be resistant to the smart grid concept. But here's the first important thing to know about our electricity network:

Utilities are profit-maximizing, but they don't make money in the way you would expect

How does a for-profit electric utility make money? First of all, you have to understand that they are regulated by a state's public utility commission (PUC). The PUC is either directly elected by the populace, or its members are appointed by politicians -- so the PUC is highly sensitive to political forces. The PUC usually grants the utility a rate of return on assets. Then, they negotiate with the utility to see how many assets there should be. Then, they negotiate with the utility to see what the expenses should be. Finally, they determine a rate "schedule" (ie: a set of different rates for different types of customers) that allows the utility, with an established set of assets and an agreed-upon estimate of costs, to achieve their rate of return.
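In other words, the regulated-utility math boils down to a revenue requirement (approved expenses plus the allowed return on the approved asset base) spread over expected sales via the rate schedule. Here's a stylized sketch with invented figures; real rate cases are far messier.

# Invented numbers, purely to show the structure of the calculation.
rate_base = 1.0e9              # $ of approved assets (assumed)
allowed_return = 0.10          # allowed rate of return (assumed)
approved_expenses = 4.0e8      # $ of agreed-upon annual expenses (assumed)
expected_sales_kwh = 8.0e9     # kWh the utility expects to sell (assumed)

revenue_requirement = approved_expenses + rate_base * allowed_return
average_rate = revenue_requirement / expected_sales_kwh   # $/kWh

print(f"revenue requirement: ${revenue_requirement:,.0f}")  # $500,000,000
print(f"average rate: {average_rate * 100:.2f} cents/kWh")  # 6.25 cents/kWh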

As you might imagine, both the PUC and the utility have a lot of various factors to consider. The PUC's job is to a) make sure that the lights stay on; but b) keep rates low. This sets up the strong incentive to keep assets and expenses as low as possible, and to make sure that there is very little "risky" activity going on that could lead to a problem. I.e.: "We don't want the utility to experiment with new technology."

At the same time, the utility's profit motive to push costs down as much as possible is somewhat undermined. Once a "rate case" has been established with the PUC, it's usually a couple of years before the next one. So if the utility can lower their expenses during that time, they might be able to sneak in some extra profits in the short term. But the next time around, the PUC will just adjust their expectations of expenses down, and lower the rates accordingly, so any profit to the utility from lowering expenses is short-lived. And those lowered expenses usually mean layoffs -- for instance, if you put in an automated meter reading system so that you don't need people physically walking from meter to meter to read each household's electricity usage each quarter, that could mean big savings... because you get rid of the meter readers. Such is the price of progress in a normal industry, but in an industry that is so politically managed, it's not a very popular move. So why invest in a technology to improve reliability or otherwise reduce costs, if it's not going to gain much in terms of long-term earnings, and it's going to make you some enemies? Thus, the utility becomes reluctant to try new technology, even when it would save some money.

So as you can see, both the utility and the PUC have an incentive to save costs, but mostly only in the very short term, and certainly not to the extent that most commodity industries (hey, electrons are electrons) would experience. At the same time, they're reluctant to invest in new capacity and backup plans unless and until absolutely necessary. There are very real reasons for such a system to be in place, based around the difficult economic concept of the natural monopoly, and this site is not going to address the public, political debate around the design of the rate-setting system. But from the above, you can see that it is an important factor in how things work, and how fast things change.

At the level of the transmission grid (so: larger power lines, not the neighborhood-level distribution network), it's even more complex. There are Regional Transmission Organizations (RTOs), quasi-governmental organizations running their own transmission networks (e.g., the Bonneville Power Administration), energy traders, utilities both for-profit and not-for-profit, Independent System Operators (ISOs), etc., all making investment and usage decisions. You have spot markets for power, but only at critical junctions, and then you have to pay additional rates for the right to transport that power to wherever it's actually needed. And on top of all that, the lines have to maintain a fairly precise balance at all times -- you can't let the load get too high on any single link of the transmission network, or it could go out. And then you get a Northeast Blackout scenario. To get just a sense of the complexity, see how the Western Area Power Administration describes their role... Essentially, our transmission grid is an inherently chaotic, finely-balanced system, which a disparate set of decision-makers are managing (barely) to keep under control.

Which brings us to the second important piece of context:

Engineers at utilities are very slow to adopt new technology

And for good reason. Utilities are supposed to provide power when and where it's required. That's their mandate, and the reason they are granted the ability to operate as local monopolies. Therefore, the best way to screw up a perfectly good 40+ year career as a dedicated utility engineer is to install a system, or otherwise allow a change in the system, that causes the lights to go out.

So put yourself in the shoes of an electricity transmission engineer, and imagine a small startup approaches you about putting an automated decision-making unit out in the field, controlling a critical relay point, in response to market signals or some such. Your desktop PC crashes once a week (utilities also don't invest much in IT), the startup doesn't have any other big customers yet, and they're asking you to hand over a part of your network to a computer that will be doing things out of your control. No thank you. Similarly, in a topic we'll discuss more in a later post, these engineers are going to be naturally reluctant to enable a bunch of random small generators (ie: solar panels on houses, fuel cells at Cisco, etc.) to feed power into the grid in ways out of the control of the centralized engineers, no matter how much sense it makes at a high level.

When considered in light of the larger economic incentive problems described above, it's easy to see why a) selling a new technology to utilities is a chicken and egg problem -- they all want to be the 3rd utility to use a product, definitely not the first; and b) even when you do get a utility interested, they're going to perform a pilot project, and then another pilot project, and then another pilot project, and then a limited roll-out, etc. And thus, the sales cycle can easily take years.

Which brings us to the final point to address in this already too-long post:

Everyone admits the system has to change, but no one agrees on exactly how

The obstacle to fixing all of this is not necessarily political feasibility. Yes, PUCs don't want to have to raise rates. But they also recognize that the system is getting to be unsustainable. Governmental groups from the Western Governors' Association to the Federal Energy Regulatory Commission (FERC) to state legislators to the U.S. Congress have all recognized that we need to upgrade our network. There are a lot of competing ideas about how to do this. People are committed to making it happen. There have been some really promising leadership efforts by some ISOs and utilities.

But there is also a lot of regulatory uncertainty. Attempts were made to deregulate the network in the hope that it would provide better incentives for utilities and operators to make investments. Ask any Californian how that worked out. So now such deregulation efforts have been scaled back. Laws passed by Congress have created new types of quasi-governmental organizations, which are still being formed and developed. In such an uncertain regulatory and political environment, I've worked with utilities that didn't want to make investments in transmission infrastructure simply because they weren't sure the state wouldn't end up taking it from them. Will utilities be forced to sell their transmission lines to the RTOs? At what price? Again, not a topic for this site to discuss, but readers can get a sense of the somewhat paralyzing uncertainties.

At the same time, there's also a lot of technology uncertainty. Granted, it's easy to say that any improvement is a worthwhile improvement. But think back on Nancy Floyd's powerful quote from above -- these utilities are making 40 year investments in as-yet unproven technologies. And even if the technology seems proven, will another technology that comes along a year later blow it away? For example, it's relatively easy to design an electricity meter that can communicate with the utility, so that you can get automated meter reading (AMR). But what communication link are you going to use? Cellular technology? The existing phone line to the home? Broadband over powerline? If you choose the wrong one, there could be trouble both with your customers and with the PUC. And then you have to integrate your meter reading systems with your IT infrastructure. And also, you have to make sure that the system you use to roll out AMR to the first third of your territory today will work with the system you choose when you roll out meter reading technology to the last third in five years.

In such an environment of uncertainty, of course utilities are going to move very slowly. But the political will may be there to force faster changes. For example, in Ontario (the US and Canadian power grids are interconnected and similarly managed, so it's a relevant example) they are requiring smart metering across the entire province by 2010, with smart grid applications in mind. California is moving similarly. And new incentives are being put in place to encourage more investment in transmission capacity.

It's just going to take a long time.

I hope this first post on the topic provides some useful context. In following posts over the next couple of days/weeks (things are very busy, sorry), I'll try to more directly address the topics of distributed generation and cleantech and the smart grid, and also what VC-backed companies are already doing to move the idea of the smart grid forward. But given the question that Matt asked about the political feasibility of the smart grid, I hope the above provides some context for thinking about the question and its implications for investors. This post may appear a bit discouraging -- but there are a lot of exciting things happening in smart grid technology that cleantech investors should be aware of, and we'll discuss some of that. It's not a question of "if", just "when" and "how".

[Update: Here are links to parts two and three of this series of posts]

Sunday, September 18, 2005

Q&A with Nancy Floyd

Those interested in cleantech investing will be interested in this interview with Nancy Floyd, one of the founders of the well-known clean energy-focused VC firm Nth Power.

SuperProtonic raises new round

Hydro Venture Partners, the US Army's OnPoint venture group, CMEA, Nth Power, Innovation Valley Partners and Battelle Ventures have invested a new round into SuperProtonic, which is developing solid acid fuel cell technology. Details of the funding don't appear to have been disclosed. The company is still at a pre-prototype stage. Those curious about this form of fuel cell technology can read more about it here.

Friday, September 16, 2005

Cleantech investing = "whiteboarding"?

See this interesting article from BusinessWeek which describes how mainstream VCs are now searching for new industries in which to invest, as they look to avoid focusing too narrowly on IT and telecom. And so, naturally, in an era of high energy costs they have been drawn to emerging energy generation technologies and related investment areas. As the article describes (and as regular readers of this site will agree), it's difficult to find any large traditional VC firm that hasn't at least considered an investment in solar power recently. (Which may make one read the quote about "contrarian investments" a bit differently...)

This raises a number of questions that dedicated cleantech investors should be asking themselves:
  • Are the traditional telecom/IT/biotech VCs moving into cleantech permanently, or is this a temporary trend driven by a few timely factors such as the high price of oil?
  • Are the traditional telecom/IT/biotech VCs focused primarily on solar right now because it is best-positioned for a near term breakout among the many different cleantech investment areas, or is it a bit of a "herd mentality" effect at work? If the latter, will these traditional telecom/IT/biotech VCs eventually look more deeply into the other potentially lucrative cleantech investment segments, and if so, when?
  • Is this a threat or an opportunity for dedicated cleantech investors?
My personal belief is that the trend of very large VCs looking to bet broadly across a wide range of investment areas is a permanent one. Unless LPs begin to invest less money into traditional venture capital (and that doesn't appear to be something that will happen anytime soon), there will be a need to look for new places in which to deploy that capital. The recent news about rising valuations is a testament to that. And the cleantech investment thesis makes too much sense to be ignored. So I do think that the traditional telecom/IT/biotech VCs are in cleantech to stay. However, the depth to which they are able to engage in cleantech will always be somewhat limited -- such is the nature of being a generalist vs. being a specialist. Thus, these investors will tend to invest in areas where their generalist brethren are investing (ie: solar today), and only slowly expand into other areas.

However, as Patrick Ennis' quote smartly points out in the BW article, "All the great, innovative investments are always in areas that the herd is avoiding." Thus, visionary telecom/IT/biotech VCs will look to partner with, learn from, and co-invest with dedicated cleantech VCs who can point them in the direction of "unsexy" investment areas with strong near-term potential, and who have the industry knowledge and connections to be able to both identify well-positioned companies and help them grow. The broad-based VCs will bring the deep pockets, and the dedicated VCs will bring the expertise. And in the meantime, LPs will also continue to seek further diversification by investing with both broad-based and dedicated cleantech venture firms.

Thus, to answer the last question, this trend probably is a terrific opportunity and legitimization for the dedicated cleantech investing community. ...But not without some hiccups, undoubtedly.

Thursday, September 15, 2005

Some administrative notes

  • http://cleantechvc.blogspot.com has always been a bit of a mouthful as a web address -- hard to say, and harder to remember. As interest in this site has grown significantly over the past five and a half months, it began to be a bother, so starting yesterday www.cleantechvc.com now gets you here as well. Hope that makes things easier for everyone...
  • Even easier to get to: Very happy to report that this column is now being syndicated on the Cleantech Venture Network's site, along with Tyler Hamilton's insightful Clean Break blog and columns by Nick Parker and Keith Raab of the Cleantech Venture Network. The team at the Cleantech Venture Network does terrific work, and my firm Expansion Capital Partners has had a great relationship with the group since their formation, so it's nice to have this personal link as well.
  • Semi-regular reminder: Please donate to Katrina relief efforts, such as those of the Red Cross and others...

Chrysalix announces three investments: Cyrium, SiM Composites, and Ardica

Haven't been able to find it posted anywhere on the web yet, but today the clean energy VC firm Chrysalix Energy Management announced three new investments:
  • SiM Composites, which the Chrysalix release describes as "creating unprecedented proton exchange materials based upon multifunctional silica and polymer." Mike Walkinshaw of Chrysalix is quoted as predicting that,
"We expect to see SiM's membranes revolutionize many proton exchange membrane markets due to their improved performance and lower cost. Their technology could have a significant impact on fuel cells for stationary and automotive applications as well as membranes for water desalination."
No information on the financial details of these rounds was provided in the release.

[Note: edited 9/19 to correct some information from the companies]

Wednesday, September 14, 2005

AgION raises $3M Series D

AgION, which was recently named one of Red Herring's "100 Hottest Private Companies In North America" (and we also discussed them here), announced a $3M raise from strategics BASF Venture Capital and HB Fuller Ventures. The company's proprietary technology for impregnating materials with ionic silver is used to fight microbial growth in a number of applications, including water filtration, food packaging, medical devices, etc.

In all, the company has raised $40.5M since 1997.

Altela raises $750k

Altela, an Albuquerque-based company whose proprietary technology helps to clean the waste water from oil and gas extraction, announced a first funding round of $750k.

Cleantech investors and observers may reasonably disagree about the inclusion of any oil and gas extraction-related technology under the "Clean Technology" umbrella. Some may feel that any technology which makes oil and gas extraction more efficient isn't "clean". Others (and I lean this way myself -- but please note the disclaimers all over this site) would note that oil and gas extraction is not going to go away anytime soon, so as long as we're going to be extracting it out of the ground, doing it in such ways as to minimize the negative ecosystem impacts is at least a "cleaner technology" that fits well within the cleantech investment thesis. Domestic oil and gas extraction produced 756 billion gallons of waste water in 1995...

Certainly it is a potentially lucrative investment area, with current disposal costs at $5/barrel (each barrel is 42 gallons). Investors in this round include EnerTech, Verge venture fund, and local angel group Sandia Capital Partners LLC.
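As a very rough back-of-envelope -- and treating the quoted $5/barrel as if it applied to every barrel of produced water, which surely overstates actual spending -- the figures above imply a large ceiling on the disposal market:

# Uses only the figures quoted above; a ceiling, not an estimate of actual spending.
gallons_per_year = 756e9        # produced water, 1995
gallons_per_barrel = 42
disposal_cost_per_barrel = 5    # $/barrel

barrels = gallons_per_year / gallons_per_barrel
print(round(barrels / 1e9))                              # ~18 billion barrels/year
print(round(barrels * disposal_cost_per_barrel / 1e9))   # ~$90B/year implied ceiling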

Tuesday, September 13, 2005

Cleantech Venture Network: Cleantech VC investments up 32% in Q2

Cleantech VC investments grew 32% year on year, according to an announcement released today by the Cleantech Venture Network. In all, $369M was invested in 55 identified deals in North America and Europe.

Energy investments, unsurprisingly, made up the majority of the total.

It should also be noted, following up on my post from back during the slow period in August, that indeed cleantech dealflow has picked up again as we've moved into September, as regular readers of this site will have already observed...

[ed. note: changed as of 9/13 to correct an inaccuracy in my original post... rd]

Friday, September 09, 2005

Friday Linkin' Around

  • And here's another piece that explicitly links the Katrina events to an upsurge in interest in clean energy. [Daily reminder: Give to Katrina relief efforts, such as those being undertaken by the Red Cross here]
  • Are we nearing a new era of building coal-fired power plants? That could have significant implications for a wide range of companies in the cleantech universe: Positive, for those that address coal-fired power technologies specifically; and perhaps Negative, for those in other technology areas who are counting on the cost of wholesale grid electricity rising significantly.
  • Finally, I stumbled upon the BlogShares site for Cleantech Investing. I have no idea what this is, how blogs are valued, or why anyone would be trading virtual shares of blogs to begin with, but it looks entertaining... If anyone out there is making money off this thing, please donate to the relief effort?

Thursday, September 08, 2005

Zero-point energy article in the SF Chronicle

Here's an entertaining article from yesterday's Chronicle, regarding local startup Magnetic Power (which, not coincidentally, is currently looking to raise capital).

In case you're interested in learning more about the concept of zero-point energy, here's the Wikipedia entry.

Wednesday, September 07, 2005

Advanced materials: NanoSteel

Advanced materials can be an interesting investment area for cleantech investors. Intersecting somewhat with nanotech and other cleantech segments, breakthroughs in advanced materials can yield products that have lower toxicity, lower costs, and/or greater efficiency in usage than existing materials.

One case in point is today's announcement that EnerTech and MILCOM Technologies have invested in a "midsized round" of financing for NanoSteel. NanoSteel's proprietary technology for steel fabrication results in lighter, stronger steel with less chromium exposure for those who work with the metal.

Here's a great quote from Vincenzo LaRuffa of EnerTech:

"I think a really interesting application in particular is reducing the amount of steel in an automobile, and the massive fuel savings that could lead to. While a bit on the horizon, it was very core to our investment thesis. In many ways it’s a far more compelling solution than hybrids or fuel cells."


  • Reminder: You can donate to the Red Cross in support of those affected by Katrina here.

Tuesday, September 06, 2005

Powerlight raises new funding

Dow Jones' Venture Reporter is reporting that Powerlight, the Bay Area solar installation company, has raised $3M from investors including JP Morgan's Bay Area Equity Fund. We'll provide more information on this round as we get it...

Saturday, September 03, 2005

Kleiner $3M investment in EEStor?

Here is a DealFlow column in Business Week that reports that Kleiner Perkins invested in a ceramic battery/ ultracapacitor company, EEStor, back in July, leading a $3M round. There appears to be very little information available on the web about the company, beyond what Business Week's Justin Hibbard has been able to dig up, but there's enough there to be pretty intriguing...

Friday, September 02, 2005

Ethanol, and solar IPOs

  • First of all, if you haven't read today's PE Week Wire, do so. Amen, Dan.
  • Q-Cells, a German solar PV manufacturer and a partner of US PV manufacturer Evergreen, is reportedly considering an IPO -- part of a broader trend of solar manufacturers looking to take advantage of the industry's recent strong growth and even stronger investor interest.
  • New Energy Capital, backed by Vantage Point and CalSTRS, announced the close of funding for the Iroquois Bio-Energy Company, an Indiana ethanol plant. The $70M project will build 40 million gallons per year of capacity and is being managed by New Energy Capital. A pdf of the press release is here.
  • Again, you can donate to the relief efforts here.