Posted by: nialljmcshane | July 7, 2010

Update on real time pricing in Chicago

Back in May I posted the second part of an article entitled “So What is Smart Grid Anyway: Smart Meters and Demand Response” in which I discussed Chicagoland utility Commonwealth Edison’s variable-rate pricing plan.  At that time the weather was relatively cool and, in the sample data that was available, peak demand was predicted to remain low, with no periods in which the price exceeded the fixed rate of 7c per kWh.

The last few days in Chicago have been much warmer so I have provided updated sample data that more clearly illustrates the significant price variation that can occur.

Sample ComEd Predicted and Actual Pricing – July 7 2010.

Note the deltas between predicted and actual pricing for several hours, the daily peak of 16c in the hour ending at 11:00 am and the extended peak pricing into the evening hours.

Posted by: nialljmcshane | July 4, 2010

Engaging the Consumer in Smart Grid

In a recent article in Smart Grid News, Robert Dolin, CTO of Echelon, looks back at a smart meter deployment project initiated in 2001 in Italy by Enel, the dominant utility in that country.  This meter-centric project included time of use pricing, integrated pre-pay and meter disconnect capability, support for in-home displays, power quality measurement and other features.  It involved deployment of 30 million smart meters and, although it cost €2.2 billion, it returned €500 million per year in operational savings to the utility so, at least in theory, it paid for itself in a little over four years.  This project underscores a key point in the debate about Smart Grid: this is good business for the utilities regardless of whether the customers are engaged.

Dolin goes on to ask the all-important question: what’s next?  In answering that question he points to the emergence of two-way distributed communications and increased intelligence within the grid and consumer premises in what he refers to as Smart Grid 2.0.  A key element of this Smart Grid 2.0, according to Dolin, is demand response, which generates a lot of animated debate in Smart Grid circles and in consumer groups.  Dolin, however, brings to the fore an important argument concerning demand response that is very often missed in those debates.  Utilities have a regulated responsibility to deliver secure, reliable power to meet demand at all times.  The effect of this is that utilities have to build generating capacity (or purchase contracts from independent generators) to accommodate peak demand.  As Dolin points out, utilities in the US experience peak demand just 2% of the year, but their ability to meet that demand accounts for a full 15% of their annual costs.  Regulated utilities are allowed to pass these costs on to consumers.  So when we think about the cost savings to consumers that result from demand response, we need to think not only of the possibly minimal savings that accrue directly from demand reduction at an individual consumer’s level, but also of the aggregated savings that accrue to all consumers by reducing the excess capital and operating costs associated with building or purchasing capacity to meet peak demand.

Phil Carson at Intelligent Utility Daily also jumped back into the debate over Smart Grid recently.  In an article entitled “Who ‘Believes’ in Smart Grid”, posted on July 1, he summarized a long discussion that occurred on LinkedIn’s Smart Grid Executive Forum.  You can read the article to see the comments that Carson found particularly insightful but I want to focus on just one of those.  One correspondent in the LinkedIn forum stated: “How about starting with a few simple things like giving the customer a rate and a bill they can understand?”

This is a key step in getting customer engagement.  How many utilities that are rolling out smart meters are using the data collected from those meters to show customers, on their existing bills, what their payments would be under the various rate plans available to them?  I envision a bill that shows the actual billed cost on the customer’s current fixed rate plan alongside one or two alternative billing options, showing what the same usage would have cost that individual customer under a time of use or critical peak pricing plan.

By doing this, the utility could educate customers about the true variable cost of electricity and either demonstrate the potential cost savings to them or prove to themselves that there is no cost incentive for the average customer in their service area, in which case they need to find an alternative marketing strategy.
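
As a rough illustration of what such a comparison might involve, here is a minimal Python sketch that prices a month of hourly smart-meter interval data under a flat rate and under a simple two-tier time of use rate.  The rates and the peak window are invented placeholders for illustration, not ComEd’s or any other utility’s actual tariff.

```python
# Hypothetical comparison of a flat rate vs. a simple time-of-use (TOU) rate,
# using hourly interval data from a smart meter. All prices are illustrative.

FLAT_RATE = 0.07            # $/kWh, assumed flat rate
TOU_PEAK_RATE = 0.14        # $/kWh during the assumed peak window
TOU_OFFPEAK_RATE = 0.05     # $/kWh outside the peak window
PEAK_HOURS = range(13, 19)  # assume 1 pm - 7 pm is "peak"

def bill_comparison(hourly_usage):
    """hourly_usage: list of (hour_of_day, kwh) tuples for the billing period."""
    flat_cost = sum(kwh * FLAT_RATE for _, kwh in hourly_usage)
    tou_cost = sum(
        kwh * (TOU_PEAK_RATE if hour in PEAK_HOURS else TOU_OFFPEAK_RATE)
        for hour, kwh in hourly_usage
    )
    return flat_cost, tou_cost

# Example: a day with 1 kWh per off-peak hour and 2 kWh per on-peak hour,
# repeated for a 30-day month
sample_day = [(h, 2.0 if h in PEAK_HOURS else 1.0) for h in range(24)]
flat, tou = bill_comparison(sample_day * 30)
print(f"Flat-rate bill: ${flat:.2f}, TOU bill: ${tou:.2f}")
```

Printing both numbers side by side on an existing bill would let a customer see, from their own usage, whether switching plans would actually save them money.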

I participated in another long and sometimes strident discussion on the merits of time of use pricing on LinkedIn’s Smart Grid, AMI, HAN group.  Kat Shoa initiated the discussion with a reference to her article on achieving consumer behavior change around Smart Grid dynamic pricing.  In her article she notes that electricity prices are lower in the US than in many other parts of the world.  As a consequence of this low-cost market, she predicts that engagement in demand response will be limited to high-use commercial and industrial entities and a few highly engaged consumers.  Price rises large enough to make demand response attractive to residential consumers would likely spark serious backlash against the utilities that implement them.

Some of the comments in the LinkedIn discussion ranged from unequivocal (but also unsubstantiated) statements that time of use and critical peak pricing models were already obsolete, to nuanced discussions of the differences between real-time dynamic pricing, pre-determined time of use pricing and critical peak pricing based on day-ahead projections.  A particularly informative comment came from a contributor who referred to himself as a Dumb Old Utility Guy (DOUG) who demonstrated the sort of keen understanding of the industry that comes with many years of experience.  In comments similar to those referred to above from Robert Dolin, he noted that

“while there are fluctuations of energy cost throughout the day, week, month and year – the extreme cost comes from trying to serve load over the course of about 40 hours per year for most utilities, occurring for 4 to 6 hours per day on peak days. The transmission and distribution system is built for these hours and generation resources are purchased or committed to serve these loads. There may be something in it for industrial customers to manage load against system marginal prices for the next 1000 hours, but the residential customer can only be enticed to respond for these 40 hours. “

“It’s easy enough to select a representative price for these peak hours at the beginning of each summer and then work with customers to allow limited load curtailment during these brief periods. It’s best to give the customers a tight window, perhaps no more than four hours. If necessary, customers across the system can have staggered response times with some called on from 2 pm to 6 pm and others from 4 pm to 8 pm. “

“Residential customers have neither the time nor the interest to follow the daily fluctuations in energy prices. They will respond to a limited number of calls from a utility – if the reward from the utility is sufficient.”

In the middle of this discussion came news that the regulatory authority in Maryland had rejected Baltimore Gas and Electric’s rate case for cost recovery associated with a Smart Grid initiative.  Some in the LinkedIn discussion seized on this news as evidence to support their view that demand response based on variable pricing was dead, but a closer reading of the decision reveals that, in fact, BGE’s application was seriously flawed and that this was not in any way a rejection of Smart Grid, demand response or variable pricing by the regulator.

In their announcement the regulator stated “The Proposal asks BGE’s ratepayers to take significant financial and technological risks and adapt to categorical changes in rate design, all in exchange for savings that are largely indirect, highly contingent and a long way off. We are not persuaded that this bargain is cost-effective or serves the public interest, at least not in its current form.”

The BGE proposal, which would have cost $482 million up front and a further $353 million over the life of the program, would have replaced all existing meters with smart meters, deployed a two-way communications network between the utility and the consumer premises via the meters, and instituted a mandatory time of use rate schedule for the summer months.  The regulator found, among other things, that:

  • The proposed plan would not, in and of itself, achieve the results that BGE was claiming; further expenditure would be required to automate the distribution infrastructure.
  • The scope and proposed price tag also did not include the integration of smart appliances within consumer premises.
  • BGE’s proposal for a customer surcharge to recover the costs of the program, as opposed to recovery in the rate base at a later date, unfairly transferred all of the risk to consumers.
  • The BGE proposal was motivated by the availability of stimulus funding but, despite an approved stimulus grant of $136 million from the federal government, the proposal did not demonstrate that it was in the best interests of BGE customers.

More significantly, however, the regulator noted that they were unwilling to approve any proposal that imposed mandatory time of use rates on all customers.  In this they quoted Maryland Energy Administration witness Fred Jennings in saying that, before transitioning to time of use rates, it is critical that customers are provided:

  • sufficient education so as to understand the new tariff and how their behavior and decisions will affect their energy bill, and
  • the equipment and technology, such as in-home displays, orbs, electronic messaging, etc. to receive the requisite information that triggers behavior changes.

So, the debate continues.  Various people are staking out their claims on either side based on what outcome is most beneficial to them.  As with any new technology, we will have to wait and see.  There will be successes and there will be failures.  Those failures may be the result of inappropriate technology choices but they are more likely to be the result of ill-conceived plans that fail to take the customers into account and secure buy-in from all stakeholders.  One thing is clear: with stimulus money on the table, we are going to see more applications for Smart Grid deployments from utilities that are anxious to get a piece of that action.  I just hope that those utilities take note of BGE’s failure, design their Smart Grid projects with the customer in mind, and start to tackle the very real issue of how to get the customer engagement that will decide whether their projects succeed or fail.

How to engage the consumer in the Smart Grid revolution is a topic that engenders much debate.  Consumers are suspicious of utilities’ motivations when it comes to smart meters, due to missteps by some utilities in Texas and California in managing customer buy-in and their failure to respond appropriately to complaints about errors, real or perceived, in the automated meter reads.  The concept of Home Area Networks and smart appliances also generates a lot of skepticism, and many observers point out that consumers will not agree to allow utilities to reach into their homes and businesses to control demand.  There will be those, motivated by a passion for technology or a desire to reduce their carbon footprint, who will embrace smart meters, HANs, smart appliances and so on but, at least initially, they will be in the minority.  Large commercial and industrial users are already working closely with the utilities to implement demand response or load control programs because the return on investment for these customers is significant, but the industry needs a killer app to help launch the Smart Grid in the minds of small-to-medium-size C&I customers and residential consumers.

I recently joined with some others in the Chicago area to launch the Green Technology Organization of Greater Chicago and we had a meeting on Thursday night at which Jay Marhoefer, CEO of Intelligent Generation, gave a presentation entitled Intelligent Generation: How Green Technology Will Democratize Electricity.  OK – Intelligent Generation is trying to sell a solution in this space, but the presentation focused on defining the reality of today’s electricity supply industry and creating a vision for the future; it was not a strong sales pitch for their particular solution.  The feedback from those who attended the meeting was universally positive.

In his abstract, Marhoefer stated: “Smart grid is just the starting point of the coming revolution in the electricity sector. What kind of role will other technologies like renewables, fuel cells and plug-in hybrids play? And is there a way for all of them to make electricity cheaper, cleaner, more efficient and more reliable?”  The presentation provided an excellent overview of some of the key factors that drive decisions in the electricity supply industry and helped to define the context in which Smart Grid is trying to revolutionize the industry.  Some key realities of the industry that tend to get overlooked in much of the debate about renewables vs fossil fuels and Smart Grid implementation were clearly addressed.  Finally, Marhoefer brought it all together in a vision of a democratized future in which consumers actively partner with the utilities to manage electricity more effectively, unlocking the potential for real cost savings to consumers, reductions in demand and more efficient utilization of resources.

To understand the context in which Intelligent Generation works, it is necessary to have some basic understanding of how electricity supply works in the US today.  Many readers of this blog may be familiar with some of this context but I have not seen a more comprehensive explanation than that which Jay Marhoefer presented and so I would like to summarize that here with reference to some of the key slides from his presentation.

For most consumers, power is a commodity that they take for granted.  When they flick a switch, they expect lights to come on and appliances to start up, and they give little thought to the system that exists on the other side of that switch.  At a high level, the electricity supply industry is relatively simple.  Large generating plants produce electricity from a variety of sources including coal, oil, natural gas, nuclear, wind, solar and biomass.  The resulting power is carried over high voltage transmission lines to centers of population, where it is stepped down to lower voltages and carried on distribution grids to local homes and businesses where it is consumed.  A key challenge for the grid is to keep supply matched to varying demand so that voltage remains consistent.  For this reason, utilities have baseload generation that runs 24/7 and provides completely reliable, high quality power that meets a major portion of the demand.  During periods of higher demand, utilities need to acquire additional power to supplement this baseload generation capacity.  They do this by bringing on additional peaker plants, by purchasing power under contract from wind and solar providers, or by buying power on the wholesale market, which may be very expensive.  The high cost of these peak supplies is offset, in most cases, by charging consumers a fixed rate for every kWh used, regardless of when the consumption occurs and regardless of whether the power being consumed is cheap baseload power or much more expensive peak demand power.

There are many factors that make the electricity supply market significantly more complicated than this simple model would suggest.  First of all, there is the patchwork of utilities (some 3,000+ in the US alone) and the corresponding regulatory environments at national, state and local levels under which those utilities operate.  Secondly, there is the issue of demand variability.  Demand varies significantly by season and by time of day.  Not surprisingly, in most areas, peak demand occurs on hot summer afternoons; the second highest demand tends to be in the winter.  In some cases, the utility may own and operate the generation as well as transmission and distribution.  In other cases, utilities may be a wires-and-meters operation with no generating capacity, purchasing all of the power they sell from generation utilities or other providers.  As demand rises above baseload capacity, the utility needs to be able to bring on additional sources of power to meet demand and keep the voltage on the grid steady so that consumers do not experience any shortage of power.  Nuclear and coal fired plants are used for baseload generation but are not suitable for responding to peak demand since they have startup times on the order of several days.  For this same reason, these plants are designed to run at full capacity at all times, even when demand falls below baseload capacity.  This is important, as we will see later.  Load-following natural gas plants can deliver power to the grid within 30-90 minutes and pure peaker plants can respond in as little as 15 minutes.  Wind power can be accessed in as little as 10-30 minutes (if the wind is blowing) and solar is instantaneous (assuming the sun is shining).  The intermittency of renewable sources such as wind and solar explains why these technologies will never be suitable for baseload generation, but they are very well suited to providing peak demand capacity.  A corollary of this is that, when comparing the costs of wind and solar power, it is not appropriate to compare them to baseload generation.  A true economic analysis of wind and solar power must compare them with other peaking capacity sources, which tend to be more expensive than baseload generation.
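
To make the ramp-time discussion above a little more concrete, the toy sketch below covers demand above baseload using only resources that can come online within a given lead time, cheapest first.  The startup times are roughly the figures quoted above, but the capacities and costs are invented for illustration; this is not a model of how any real system operator actually commits generation.

```python
# Toy illustration of covering demand above baseload with peaking resources.
# Startup times are roughly the figures quoted above; capacities and costs
# are invented placeholders.

BASELOAD_MW = 800

# (name, capacity_mw, startup_minutes, cost_per_mwh) -- illustrative numbers
PEAKING_RESOURCES = [
    ("solar (if sunny)",    100,  0,  40),
    ("wind (if blowing)",   150, 20,  45),
    ("gas peaker",          200, 15, 120),
    ("load-following gas",  300, 60,  90),
    ("wholesale purchase",  500,  0, 200),
]

def cover_demand(demand_mw, lead_time_min):
    """Cover demand above baseload, cheapest first, using only resources
    that can come online within the available lead time (toy rule)."""
    shortfall = max(0, demand_mw - BASELOAD_MW)
    available = [r for r in PEAKING_RESOURCES if r[2] <= lead_time_min]
    dispatched = []
    for name, capacity, _, cost in sorted(available, key=lambda r: r[3]):
        if shortfall <= 0:
            break
        used = min(capacity, shortfall)
        dispatched.append((name, used, cost))
        shortfall -= used
    return dispatched, shortfall

# Example: demand of 1,400 MW with only 30 minutes of warning
plan, unmet = cover_demand(demand_mw=1400, lead_time_min=30)
for name, mw, cost in plan:
    print(f"{name}: {mw} MW at ~${cost}/MWh")
print(f"Unmet demand: {unmet} MW")
```

Even in this toy version, the pattern is visible: the less notice the system has, the more it must lean on the fast but expensive resources, which is exactly why peak power costs so much more than baseload.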

Looking in more detail at wind power, Marhoefer presented data showing average wind availability in the US.  These clearly show that, while there is good wind in the plains states, parts of the North East and the West, other parts of the country, including Arizona and the South East, are not suitable for wind generation.  Looking at this data seasonally, however, we see that wind availability is much greater in winter and much lower in summer.  Wind also tends to be strongest at night and lightest during the day.  Herein lies one of the major problems with wind power: wind blows least when you need it the most and it blows most when you need it the least.

Another problem with wind is that most of the places where wind is most readily available are located far away from major urban centers where the demand for power is highest. There are  transmission lines that link most of the US into three major interconnects covering the East, West and Texas, but the price of electricity varies enormously from one part of the US to another and the regulations governing the movement of power within these interconnects are complex.  There is more than a 3:1 difference in price between the most expensive mainland state (CT) and the cheapest (WV).  Calls for a completely free and open market in electricity are likely to be blocked by politicians from those states where energy remains cheap because their constituents stand to lose in such an open market as power gets siphoned off to states where current energy costs are much higher.

For residential and small commercial and industrial users, solar is more attractive than wind for a variety of reasons.  Wind has more stringent zoning issues and, because wind power output scales with the swept area of the rotor, small rotors produce disproportionately little power, making wind far less viable for small-scale residential installation.  Data abstracted from AWEA’s Wind Energy Tutorial illustrate the relationship between rotor size and power output.  Unlike wind power, solar tends to track peak demand pretty well: hot summer afternoons, when air conditioning use is high, tend to be sunny.  However, the payback time for solar can be very long.  Even with a 30% federal tax credit, the payback period can be as much as 15 years.  In states that have mandatory renewable portfolio standards, consumers can sell Renewable Energy Certificates (RECs) to their utility to further offset the costs but, even with a REC value of $300/MWh, the payback period remains above 10 years.
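
The structure of that payback arithmetic is easy to sketch.  The inputs below (installed cost, annual output, retail rate) are my own placeholder assumptions, not figures from the presentation, so the resulting years differ from the payback periods quoted above; the point is the shape of the calculation, not the specific result.

```python
# Rough solar payback estimate. All inputs are illustrative assumptions,
# not figures from the presentation.

system_cost = 25000.0        # $ installed cost (assumed)
federal_tax_credit = 0.30    # 30% federal tax credit
annual_output_mwh = 10.0     # MWh generated per year (assumed)
retail_rate = 120.0          # $/MWh of avoided retail purchases (assumed)
rec_price = 300.0            # $/MWh Renewable Energy Certificate value

net_cost = system_cost * (1 - federal_tax_credit)
annual_savings = annual_output_mwh * retail_rate
annual_rec_revenue = annual_output_mwh * rec_price

print(f"Payback without RECs: {net_cost / annual_savings:.1f} years")
print(f"Payback with RECs:    {net_cost / (annual_savings + annual_rec_revenue):.1f} years")
```

Actual payback depends heavily on local electricity rates, insolation, installed cost and how much of the output is actually eligible for RECs, which is why quoted payback periods vary so widely.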

So, what is the answer and what does Intelligent Generation mean when they talk about democratization of energy?  The industry solution is demand side management where consumers sign up for time of use programs in which the higher cost of peak electricity is passed on to the consumer who is thereby incentivized to adjust their consumption patterns and move demand into non-peak periods either manually or via a Home Area Network and smart appliances.  Utilities also promote load control programs in which consumers receive financial incentives in return for allowing the utility to directly control major appliances during critical demand periods.  Intelligent Generation proposes a supply side management alternative which involves residential solar, community wind generation, storage and distributed intelligence.

The first step in this model is to install distributed solar generation to take advantage of its peak-demand-following capabilities.  Remember that constant baseload capacity we discussed earlier that runs 24/7 regardless of demand?  The next step is to install battery storage to allow the consumer to purchase low cost power overnight, or when excess wind power is available, and store it for later use.  Finally, the IG Optimizer is a software solution that coordinates decisions at the consumer level.  It decides when to buy power for storage and, in response to peak pricing signals from the utility, rather than adjusting consumption, it switches to battery storage, thereby increasing the consumer’s savings and reducing peak demand on the utility.  It may even sell power from the storage system back into the grid at a significant premium over the price at which it was purchased.  The combined effect of this solution is to reduce the payback period for the system installation by 50%.  In a state where RECs are available, the REC revenue represents an ongoing income stream for the consumer even after the system has been paid for.  The appeal of this solution is apparent at the individual consumer level but, scaled up to a community or regional level, it offers the promise of a distributed network of renewable generation and storage owned and operated by consumers working in partnership with the utilities instead of being enslaved to them.  Intelligent Generation estimates that 100,000 networked buildings represent the equivalent of being able to bring a small nuclear power plant online instantaneously to meet peak demand.
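
For flavor only, here is a minimal sketch of the kind of hour-by-hour decision such a coordinator has to make: charge the battery when power is cheap, serve the home from the battery when the price is high, and otherwise just buy from the grid.  This is my own simplification, not Intelligent Generation’s actual algorithm, and the prices and thresholds are invented.

```python
# Simplified hourly dispatch rule for a home battery + solar system.
# An illustrative sketch only -- not Intelligent Generation's product logic.

CHARGE_BELOW = 0.05     # $/kWh: buy and store when grid power is cheaper than this
DISCHARGE_ABOVE = 0.12  # $/kWh: serve load from the battery above this price
BATTERY_CAPACITY_KWH = 20.0

def dispatch(price, load_kwh, solar_kwh, soc_kwh):
    """Return (grid_purchase_kwh, new_state_of_charge) for one hour."""
    net_load = max(0.0, load_kwh - solar_kwh)  # load remaining after solar
    # Any excess solar goes into the battery first
    soc_kwh = min(BATTERY_CAPACITY_KWH, soc_kwh + max(0.0, solar_kwh - load_kwh))

    if price >= DISCHARGE_ABOVE and soc_kwh > 0:
        from_battery = min(net_load, soc_kwh)  # ride out the peak on storage
        return net_load - from_battery, soc_kwh - from_battery
    if price <= CHARGE_BELOW:
        room = BATTERY_CAPACITY_KWH - soc_kwh  # top up on cheap overnight power
        return net_load + room, BATTERY_CAPACITY_KWH
    return net_load, soc_kwh                   # otherwise buy only what is needed

# Example: a 16c peak hour with 2 kWh of load, no sun, and a half-full battery
print(dispatch(price=0.16, load_kwh=2.0, solar_kwh=0.0, soc_kwh=10.0))
```

A real optimizer would also look ahead at price forecasts, battery wear and the option of selling back to the grid, but even this crude rule shows how storage lets the consumer respond to a price signal without changing their consumption at all.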

Is this the killer app that the Smart Grid needs to capture the public’s interest and effect a transformation of our national and global relationship with energy?  Intelligent Generation clearly believes that it is.  It offers strong incentives to residential consumers and an ability to cap our baseload generating capacity, and therefore our carbon footprint.  It maximizes the value of existing utility assets by providing a way to store excess baseload generating capacity during periods of low demand, and it addresses the need for instantaneous response to peak demand events.  In conclusion, Marhoefer noted that, as with the PC revolution and the Internet revolution, those who democratize energy will be the major winners in the revolution that is happening within the energy supply industry.

A complete copy of Jay Marhoefer’s slides is available for download.

As noted in Part one of this article, I invited some of Smartsynch’s competitors and others within the industry to comment on the claims made by Stephen Johnston, Smartsynch’s CEO, concerning the case for utilities to choose public cellular networks over privately built and managed networks to implement the communications portion of their Smart Grid solutions.  Most declined to comment or told me that they were working on their own white papers to address these claims.

I did speak with Scott Propp at Motorola, who believes that we will see a network of networks, with technologies being selected based on very specific topologies since each technology will work best in a particular environment.  Propp feels that public cellular is very attractive for pilots because utilities can implement a small-scale AMI solution to examine the impact on consumers and business processes, but that when they need to scale up, the utilities will look at all technologies that are available to them.  A major issue that he sees arising between utilities and public carriers is service level agreements; in his experience, negotiations over such agreements are difficult.

I also spoke with Tom Hulsebosch at West Monroe Partners, a Chicago based consulting firm, who stated:

It is WMP’s opinion that typical smart grid communication solutions will be a combination of technologies. There are many things that a utility must consider when they are deciding on an AMI communication technology, including:

  1. What applications will be supported (e.g. Electric/Water/Gas Metering, Direct Load Control (DLC), Home Area Network (HAN), Distribution Automation (DA)/Sensors)?
  2. What are the latency and protocol requirements of the applications?
  3. What are the meter/device density and coverage issues for the projected deployment?
  4. What does the utility currently have in their network?

Mesh solutions can provide great coverage in urban and suburban areas and their coverage/capacity can be customized for any area or application by adding more relays and collectors.  Yet, they are challenged to provide complete coverage in rural applications when the meter density does not support enough mesh paths from the meters to the AMI collectors.  Mesh systems are not typically going to be deployed by IOUs as their only AMI solution due to coverage challenges in the rural areas.  One of the advantages of the Mesh solution is that they can be optimized to meet the coverage and application needs of the utilities and there are multiple options to support water, gas, and electric meters.

The commercial carrier solutions have a couple of key advantages that make them a very interesting solution for many deployments: relatively low latency, good bandwidth and no requirement for a high meter/device density.  Today we are seeing utilities using commercial carrier based AMI solutions for deployment of C&I smart meters, or strategic deployment of Smart Meters for hard to read meters, or selectively for customers requesting HAN technologies and TOU rates.  We expect that these scenarios will continue, especially for utilities that have deployed AMR solutions recently or AMI solutions that don’t have HAN capability or the bandwidth to support the new service offerings.  Utilities also need to consider that many commercial carrier solutions can find themselves challenged when they need to provide coverage in very rural areas and in indoor locations like meter closets.  Many commercial carriers’ solutions don’t have an ability to repeat the signal through a neighboring meter, which will increase the number of stranded devices that cannot be serviced by the commercial carrier coverage.  There are limited cost effective ways to extend a commercial carrier’s coverage to meet the coverage requirement of a utility.  Another consideration is whether the utility wants to cover gas and water meters with the AMI solution, which require very low power communication solutions on these meters.

Other key AMI solutions include tower based solutions that use private licensed frequencies and power line carrier (PLC). The tower based solutions typically have the advantage of higher power and cleaner spectrum leading to fewer AMI backhaul locations and the ability to support hard to reach devices (water meters, in building meter closets, basement meters) and low density applications (rural).  Yet, the tower based solutions do require the creation of new towers (increased capital) or using existing commercial towers (increased O&M).  PLC solutions have been used in rural applications successfully for a long time and really represent some of the first successful AMI solutions deployed by utilities.  The PLC based solution is typically the solution of choice for the Electric Cooperatives that have a need for covering very low density areas where mesh and commercial carrier based solutions struggle to meet the needs of the utility.

At WMP, we do not see any one type of solution being able to meet the needs of all utilities.  We believe that the industry will continue to see a mix of different AMI solutions deployed in the utility market place for the foreseeable future.

I see some support for Smartsynch’s view in these comments but they fall short of a full endorsement.  WMP’s view is congruent with what Rick Thompson, President of Greentech Media, stated in his opening remarks to the Networked Grid conference: “Despite the focus on physical layer networking ‘religion’ arguments, they are misguiding the industry, said Thompson. ‘It’s not going to be one or the other; it’s going to be all of them,’ depending on applications and service area requirements.”

Reader comments both here on the blog and in various LinkedIn fora support this view of a non-homogeneous technology future for the Smart Grid.  While some readers felt that Smartsynch made a compelling case, others cited personal experience of the unreliability of cellular in rural areas and argued for mesh technologies. Several readers noted that one technology cannot cover all topologies and requirements and that in all probability multiple solutions will be deployed even within a single service area.  It was noted that even if public cellular is used for meter data collection, utilities may still wish to maintain their existing investments in fiber or other networks for substation automation and may leverage those networks for expanded distribution automation which has stricter latency requirements than meter data.  Mixed technology networks were also suggested as possible AMI solutions with mesh or other technologies being used for data collection and the data then being aggregated and backhauled over cellular as is done by several utilities today.  Still others suggested that cellular technologies on private networks were the way to go.  And, of course, readers noted that the picture changes for gas and water meters where the meter cannot tap into the power supply as with electric meters.  Readers also noted that the underlying meter data are agnostic to the technology that is used to transport the data, as are the applications that act upon the data.

Undoubtedly, the debate will continue over which technologies are best suited to the communications requirements of the Smart Grid but, ultimately, the success or failure of an individual utility’s Smart Grid deployment and indeed the broader Smart Grid initiative will depend less on the particular technologies that are selected for the physical network and more on the development of applications, the so-called Soft Grid, that ride on top of that network and deliver meaningful value to both utilities and consumers.  Stephen Johnston and Smartsynch make a compelling case for why public cellular networks need to be seriously considered by any utility that is embarking on Smart Grid deployment but those utilities also need to consider the broader ecosystem in which their eventual network selection will operate.  As I have noted in earlier articles, the development of standards for interoperability between utilities and between vendors within a given utility, and the integration of applications into an overall business process framework that enables services that we cannot even imagine today, are the key factors that will lead to success.

Posted by: nialljmcshane | June 12, 2010

Technology Wars: Will Public Cellular Win in Smart Grid?

In a webinar organized by Energy Central on Wednesday, June 9, Stephen Johnston, CEO of Smartsynch, reprised the comments that he made at Greentech Media’s Networked Grid conference, held in Palm Springs, CA on May 18-19, concerning his belief that public cellular networks would dominate the smart grid space.  This topic clearly attracts a lot of interest and it was reported that over 800 people were on the webinar, representing public cellular carriers, utilities, regulators and Smart Grid network suppliers.

Johnston noted that utilities have been using public cellular for a long time to connect to commercial and industrial customers for demand response, and that cellular is also used for backhaul of AMI data from aggregation points in mesh networks.  He then addressed four main historical objections to the use of public cellular networks:

  • Cost: When Smartsynch started working with utilities in 2000, the cost per meter per month was around $15.  While this was acceptable for large commercial and industrial customers, who typically represent only 10% of customers but as much as 60-70% of the demand, it was prohibitive for the residential consumer market.  This price declined steadily over the years but was still sitting around $8 in 2009 when a dramatic change took place and the cellular operators realized that there was a large market to be accessed for little or no incremental network cost.  Prices dropped dramatically to around 50c per meter per month and are continuing to trend downwards.
  • Coverage: Smartsynch claim that 97% of the service area in the US is covered by cellular and, in a recent trial they were able to demonstrate 99.96% meter reading success rates.  This number actually reflects the percentage of the population within the service areas that are covered and also reflects an aggregate of all cellular providers rather than any specific network.  Another important factor when considering coverage metrics is that, unlike handheld cell phones that typically have a maximum transmit power of 0.6 W, fixed wireless devices such as smart meters can transmit at up to 2W which significantly extends the coverage area of any network that they are connected to.
  • Bandwidth: Concerns about missed data due to network congestion are a common reason cited for why utilities should build private networks.  However, Smartsynch quote a statistic that, if all the water, gas and electric meters in the US, about 300 million in total, were to transmit a day’s worth of 15 minute interval data every day, this would represent only a 1/500th of 1% increase in the data volumes carried daily by a network such as AT&T’s (a rough back-of-the-envelope check of this claim appears after this list).  They also note that, using quality of service settings, the cellular operators can prioritize utility traffic to guarantee delivery even when the network does become congested.  Johnston also pointed out that the cellular operators will continue to improve network bandwidth and reliability at no incremental cost to the utilities.
  • Security: Finally, on the topic of security, Smartsynch point out that the data transmitted over public cellular networks is not travelling over the public internet.  A significant amount of personal and financial data is already transmitted over these networks, which are NIST and DoD compliant, and cellular operators continue to spend large sums of money on network security improvements.  Johnston claimed that the cellular architecture eliminates the multi-point vulnerability of many mesh network technologies and also pointed out that the newer IP security standards require higher bandwidth, which the cellular networks can provide.
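
As referenced above, here is a back-of-the-envelope check of the bandwidth claim.  The bytes-per-reading figure is an assumption on my part (Smartsynch did not quote one), so treat the result as an order-of-magnitude sanity check rather than a verification of the exact 1/500th-of-1% number.

```python
# Back-of-the-envelope check of the AMI data volume claim.
# The bytes-per-reading figure is an assumption; the meter count and the
# 15-minute interval come from the webinar.

meters = 300_000_000          # ~300 million water, gas and electric meters
readings_per_day = 24 * 4     # one reading every 15 minutes
bytes_per_reading = 100       # assumed payload plus protocol overhead

ami_bytes_per_day = meters * readings_per_day * bytes_per_reading
print(f"AMI traffic: ~{ami_bytes_per_day / 1e12:.1f} TB per day")

# If that is really 1/500th of 1% of a carrier's daily volume, the carrier moves:
implied_carrier_volume = ami_bytes_per_day / (0.01 / 500)
print(f"Implied carrier volume: ~{implied_carrier_volume / 1e15:.0f} PB per day")
```

Whether the implied carrier volume is realistic depends on the assumed message size and on which of the carrier’s networks is being counted, but the first number makes the underlying point: national AMI interval data is measured in terabytes per day, which is small by carrier standards.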

On the issue of total cost of ownership, Johnston acknowledged that the residential endpoints for connecting to the cellular network are approximately 10-30% more expensive than for competing technologies but noted that these endpoints have higher capabilities and intelligence than the endpoints used by other technologies and, unlike many competing technologies, there is no requirement for aggregation points or repeaters to complete the network connections.

In summary, Johnston claimed that

  • Cellular operators have a strong incentive to leverage their existing networks for Smart Grid communications.
  • Cost, coverage, bandwidth and security attributes of these networks are better than the alternatives and are getting better every day.
  • Service level agreements with multi-billion dollar network companies represent reduced risk for utilities whose core competence is not in managing communications networks.
  • Embracing open high bandwidth cellular networks will spur innovation by many vendors and not just a single vendor tied to a proprietary private network technology.

During the Q&A session that followed, a number of additional concerns were brought up and addressed:

  • In response to a question about the risk of technology rollover and meters being stranded, Johnston suggested that this could be mitigated by having a strong relationship between the utility and the chosen cellular operator and also noted that new technologies generally support backward compatibility with existing devices.  This is certainly true for devices that have a 10 year depreciation cycle but may not be true for equipment that is depreciated on a longer cycle as is often the case in the utility industry.
  • On the important question of the fact that utilities in the US are allowed, by their regulators, to earn a rate of return on capital investment, such as that required to build a private network, but not on operating costs, such as those paid to a cellular operator, Johnston pointed out that the recent FCC broadband plan recommended that FERC look at this rule so that utilities are not forced to make sub-optimal network choices.  The implication is that there could be a rule change in this area.  However, I would also note that even with a private network, the utility still incurs operational costs to maintain that network, and those operational costs would also be subject to the rule that prohibits any earned rate of return.  At some point, the operational cost of paying the cellular bill will be lower than the cost of owning and maintaining a private network.  The point at which this occurs will vary depending on the price per meter per month for the cellular connection and the size of the utility’s private network; a simple break-even sketch follows this list.
  • In response to a question about reliability of cellular networks, especially during emergency situations, Johnston noted that, during hurricane Katrina in New Orleans, the public cellular networks were among the last to fail and the first to be recovered.  He also pointed out that private network infrastructure is susceptible to destruction in such an event and would have to be replaced at the utility’s expense, whereas any destroyed cellular network equipment would be replaced at the operator’s expense.
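
As a simple illustration of that break-even point, the sketch below compares a recurring per-meter cellular fee against the amortized build cost plus annual maintenance of a private network.  Every figure is an invented placeholder; a real comparison would use utility-specific numbers and would also have to account for the rate-of-return treatment of capital discussed above.

```python
# Illustrative annual-cost comparison: public cellular fee vs. private network.
# All numbers are placeholder assumptions, not data from any utility.

meters = 500_000
cellular_fee_per_meter_month = 0.50   # $ per meter per month (assumed)

private_network_capex = 60_000_000    # $ up-front build cost (assumed)
private_network_life_years = 15       # amortization period (assumed)
private_annual_om = 4_000_000         # $ per year to operate and maintain (assumed)

annual_cellular = meters * cellular_fee_per_meter_month * 12
annual_private = (private_network_capex / private_network_life_years
                  + private_annual_om)

print(f"Cellular: ${annual_cellular / 1e6:.1f}M per year")
print(f"Private:  ${annual_private / 1e6:.1f}M per year")
```

Change the per-meter fee or the size of the utility and the comparison can easily flip, which is exactly why the answer will vary from one utility to the next.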

I invited representatives of several of Smartsynch’s competitors to comment on the claims that were being made in favor of public cellular networks but they either declined or stated that they were working on their own white papers to respond to these issues.  In part two of this article, I will provide feedback that I received from some parties who were willing to comment and will also discuss what this all means in terms of the overall Smart Grid deployment.

In the meantime, what are your views on this issue?  Do you agree with the arguments in favor of public cellular networks?  Are there other issues that were not addressed in this discussion?

Posted by: nialljmcshane | June 5, 2010

The Smart Grid Data Tsunami: Myth or Monster?

Continuing the exploration of key themes from Greentech Media’s Networked Grid conference, held in Palm Springs, CA on May 18-19, this column looks at the issue of exploding data volumes and the impact on utility business processes.

As I reported in my summary of the conference, Andy Tang of PG&E stated that the Smart Grid was not a specific thing or project but rather it was about how the utilities leverage technology to enhance their entire portfolio of business processes.  Linda Jackman, Group VP of Product Strategy and Management at Oracle Utilities, gave a very engaging and entertaining keynote address on day 2 of the conference in which she echoed this view.  She noted the need for utilities to think about their application portfolio in a different way: moving from a silo-based operational view to a true enterprise view of data needs and value.

Jackman also quoted a forecast that says Smart Grid will increase the data volumes handled by utilities by 700% to around 800 TBytes.  With this increase in data volumes, comes a series of challenges for the utilities:

  • What to keep, what to discard and how long to retain each piece of data
  • Where each piece of data is required across the organization
  • Interoperability of applications to avoid data duplication
  • Overarching authority for key systems
  • Security and privacy of customer data

While this increase in data volume is significant, we must also recognize that other industries including telecom and financial services routinely deal with comparable data volumes today and have evolved mechanisms for handling the challenges that go along with these volumes of data.

In his workshop on Power Layer Infrastructure Technologies and Network Communications Layer Architectures, Erich Gunther, Chairman and CTO of Enernex, expanded on this topic, noting that, historically, utilities have built backoffice applications in isolation.  As a result, when the need arises to use data from multiple applications in a combined way, the integration is typically done as a custom project at considerable expense to the utility.  When applications are built in this way, they tend to have significant structural issues including redundancy and inconsistency of data across the system, inability to scale the system as demand grows and inconsistent approaches to user interfaces, network management capabilities, security and data protection.  Implementing these functions in legacy systems as an afterthought is always more difficult and more costly than having them designed in from the start and, in many cases, implementation barriers prevent a fully integrated solution from ever emerging in legacy systems.

The applications that make up a Smart Grid enabled utility’s portfolio will include:

  • AMI/AMR and meter data management systems: automated meter reading capabilities and the associated backoffice systems to manage the meter data for billing and customer awareness purposes.
  • Demand Management: Applications aimed at increasing the efficiency of the transmission and distribution grid, responding to peak demand events via increased supply or demand response programs, increased integration of renewable energy sources, including distributed, non-utility owned sources, energy storage etc.
  • Grid reliability and Stability: Automation of configuration management, monitoring, fault detection, isolation and recovery of the transmission and distribution network via both hardware and software means
  • Microgrids: monitoring and controlling the microgrid infrastructure and its interface to the macrogrid.
  • EV Supervisory Systems:  monitoring vehicle and battery information, controlling supply of electricity and providing real-time pricing and load information.
  • Spatial Analytics: Integration of network status and events in relation to network topology in a geographical context.
  • Mobile workforce management: real time scheduling and crew optimization

What is notable about these applications is that they cut across many operational silos and functional disciplines and they include hardware, software and service components.  Even among these applications, there are interface points and shared data.  For example, smart meters that are part of an AMI deployment can signal distribution faults and power quality issues which can be used to trigger automated fault recovery actions or be integrated with spatial analytics and mobile workforce management to expedite dispatching of a repair crew for faults that cannot be automatically recovered.

To be effective, data needs to flow freely and easily within and among these applications, and the applications need well-defined APIs so that separate application modules can be integrated in a cost effective manner.

If successfully implemented, these applications will enable an energy information economy and home energy services market that spurs innovation in the sector.

An interesting analogy is Amazon’s Kindle e-reader technology.  For many of us, the Kindle is a pretty neat device that allows users to download and store multiple books, periodicals, newspapers etc and read them on the go, even in bright sunlight.  While the Smart Grid industry is struggling to define the killer app that will create consumer demand for smart meters, the Kindle has that problem licked.  But extend the analogy a little further and a more significant difference emerges.  To Amazon, the Kindle isn’t merely a device, it’s a service and the entire service has been architected from the top down to deliver maximum customer value and return on investment to the business.  The Kindle device is merely the customer interface point for a vast array of technology including cellular network connections to download content, marketing, customer account management and backoffice systems to manage users’ purchases, subscriptions, preferences etc.

Newer software architectures have been developed that define standard interfaces incorporating security, network management, standardized data models and other considerations from the outset.  These architectures are designed to be modular and scalable.  New functionality can be added easily by designing (or purchasing) a module that plugs into the standard interfaces.  Existing functionality can be upgraded by replacing a module with a newer module from the same, or a different vendor.

One particular model that is worth considering is the Frameworx model from the TM Forum, an industry association with over 700 members including service providers, network operators, network equipment suppliers, software suppliers, system integrators, content providers and consumer electronics manufacturers.  The Frameworx model provides a service oriented approach using standard, reusable components to enable service providers to rapidly build and deploy new services, reduce operational costs and improve business agility.  The Frameworx model comprises:

  • Business Process Framework (eTOM) is the industry’s common process architecture for both business and functional processes.  This component provides a standard set of business processes that provide architectural and design components to fully define organizational structures and product solutions. This can facilitate the purchase and integration of commercial solution packages by providing a reference model against which to evaluate the functionality of those products.  It also provides a reference to guide the generation of requirements for in-house development of specific business process components.
  • Information Framework (SID) provides a common reference model for enterprise information that service providers, software providers, and integrators use to describe management information consistently across the enterprise.  This takes the form of definitions of common entities, such as customer, service and network, and the attributes that describe those entities, thereby helping to normalize data representations across the enterprise and simplifying the integration of components into an end to end system (a minimal sketch of this idea follows this list).
  • Application Framework (TAM) provides a common language between service providers and their suppliers to describe systems and their functions in terms of the business processes that they implement and the information components that they operate on, as well as a common way of grouping them.
  • Integration Framework provides a service-oriented integration approach with standardized interfaces and support tools.
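
To give a sense of what a shared information model buys you, here is a small sketch of common entities (customer, meter, meter reading) defined once and consumed by every application.  The field names are my own illustration, not the actual SID definitions.

```python
# Illustrative shared entity definitions in the spirit of a common information
# model. Field names are invented for illustration; this is not the SID schema.

from dataclasses import dataclass

@dataclass
class Customer:
    customer_id: str
    name: str
    premise_address: str

@dataclass
class Meter:
    meter_id: str
    customer_id: str          # same identifier used by billing and outage systems
    network_technology: str   # e.g. "mesh", "cellular", "PLC"

@dataclass
class MeterReading:
    meter_id: str
    timestamp: str            # ISO 8601; one format shared by every application
    kwh: float

# Billing, outage management and analytics can all consume the same record
# instead of each keeping its own copy with its own field names and formats.
reading = MeterReading(meter_id="M-1001", timestamp="2010-07-07T10:00:00", kwh=1.25)
print(reading)
```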

To add further value, TM Forum is also working on a service generation toolkit based on the Frameworx model which can be used to generate a web services implementation of a service along with Java code stubs for the application.

There is a range of commercial products that support the Frameworx architecture and TM Forum has been working with other industries including the digital media industry and financial services industry to expand the model beyond its origins in telecom.  There is an opportunity to leverage the decades of work that TM Forum has done to benefit the utility industry and at least one Australian utility has already used this model to implement an operational support system for parts of their business.

Without doubt, the utility industry has a challenge ahead as it enters a new networked, intelligent environment.  There will be false starts and unexpected successes.  There will be winners and losers.  But there is no need to start from scratch.  The data volumes that utilities need to handle are not uniquely large, and most of the data management problems that the utility industry is facing have already been solved, to a greater or lesser extent, elsewhere.  Yes, there are unique aspects to the utility environment and these will need to be identified and addressed, but we cannot afford to reinvent any wheels if we are going to realize the benefits of the Smart Grid in a timeframe that allows us to cap the ever increasing demand for energy and the consequent environmental impact of that demand.  Our economy and our environment demand that we leverage the tools that already exist to the greatest extent possible and evolve them to meet the needs of the Smart Grid.

I would like to acknowledge the assistance of John Sheehy, Juan Nodar and Dave Milham of the TM Forum in reviewing and providing input for this article.

Posted by: nialljmcshane | May 30, 2010

Electric Vehicles: Challenges and Opportunities for the Grid.


Electric Vehicles (EVs) were another of the key themes of the Networked Grid conference that was held in Palm Springs CA on May 18-19.  In this column I will discuss the emergence of a credible electric vehicle market and some of the opportunities and challenges that this presents for the industry.

Electric vehicles are not a new idea.  The earliest electric cars emerged in the mid 19th century and, in the early years of the 20th century, they competed effectively with gasoline powered cars.  Over time, however, advances in internal combustion technology, reduction in noise and vibration, and other improvements led to the pre-eminence of gasoline powered cars, which offered greater range and reduced fueling time compared to their electric counterparts.  EVs continued in the form of electric powered trains and trams and in niche markets such as fork lift trucks.  There have been many attempts over the years to re-introduce the concept of electric cars but, in each case, issues with charge times, battery technology and “range anxiety” – the fear of running out of power with no available charging station nearby – have contributed to the failure to significantly penetrate the mass market.

Now, due to increased oil costs, concerns about the link between carbon based fuels and climate change, the environmental impact of oil and gas exploration as evidenced by the current oil spill in the Gulf of Mexico, and improvements in battery technology, charging infrastructure and new business models, we appear to be on the brink of a breakthrough in mass market adoption of EV’s.

The Nissan Leaf, which will launch later this year, is expected to retail for around $25,000 after tax incentives.  It claims a range of 100 miles, although heater or AC use will reduce this.  However, Nissan suggests that the range reduction can be minimized by pre-heating or pre-cooling the vehicle while it is still charging.  Charging options include a 110/120V 20A trickle charge cable that plugs into a regular US household electrical outlet.  For faster charging, Nissan recommends a custom installed 220/240V 40A charging dock that reduces the charge time to 8 hours.  This model works well for single-family homes in the affluent suburbs.  It is less clear how it will work in apartment buildings and condominiums or in inner-city areas where few people have home garages in which to install the charging stations.  This may be another factor that leads to uneven uptake of EVs across the territory of a utility.  A 480V quick charging capability will eventually be available, but the cost of this type of charging station means that it will likely be confined to commercial charging stations and will not be suitable for home use.  Autoblog has more info on the Nissan Leaf.

Chevrolet’s research indicates that 75% of people in the US commute 40 miles or less each day and their 2010 entry into the EV market, the Chevy Volt, has a range of 40 miles but also has an onboard gas generator that can extend the range to hundreds of miles on a full tank of gas.  Charging time is quoted as 10 hours or, with a 240V charging station, as little as four hours.  Autoblog has also reviewed the Chevy Volt.

In addition to these all-electric cars, there is also a range of Plug-in Hybrid Electric Vehicles (PHEVs) including converted versions of standard hybrids such as the Toyota Prius, Honda Civic etc.  The Leaf, Volt and the PHEV’s are basic family cars.  At the upper end of the spectrum, for those EV enthusiasts with over $100,000 to spare, there is the Tesla Roadster which claims a range of 245 miles and a motor providing 288 HP, 0-60 mph in 3.9 seconds and a top speed of 125 mph (electronically limited).  Those are Tesla’s parentheses, not mine!

So with all these exciting developments, it looks like EVs may be here to stay.  They offer lower operating costs, reduced dependence on foreign oil, environmental benefits and a certain cachet that comes with being an early adopter of the next big thing.  But what does this mean for the electric industry?

One major concern for the utilities is the impact of the additional current draw that will be required to charge these vehicles.  It is assumed that most EV owners will opt for the 240V chargers to provide a guaranteed overnight charge.  These chargers represent a draw on the grid that is equal to or greater than that of an average home: the Nissan Leaf, drawing 40A at 240V, represents 9.6 kW of demand.  The grid is architected with local transformers for every 6-10 homes on average.  One EV on a transformer probably won’t create any problems, but the fear is that these vehicles will be adopted in clusters.  If we start to see 3 or 4 EVs on a single transformer, there is a very real concern that the additional load will cause those transformers to fail, especially if the vehicles are allowed to charge during peak hours.  Demand response is a key Smart Grid solution that needs to be in place to enable the expected growth in EVs, but there are unique aspects of demand response that come into play here.
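
The arithmetic behind the transformer concern is easy to sketch.  The 9.6 kW charger figure comes from the 240V/40A numbers above; the transformer rating and per-home load below are assumptions for illustration, since actual distribution transformer sizes vary widely.

```python
# Illustrative loading of a local distribution transformer as EV chargers are added.
# Transformer rating and per-home evening load are assumptions; the charger
# figure is 240 V x 40 A from the Leaf charging dock described above.

TRANSFORMER_KVA = 50.0             # assumed rating (treating kW ~ kVA for simplicity)
HOMES = 8                          # homes served, within the 6-10 range quoted above
AVG_HOME_KW = 2.0                  # assumed average evening load per home
EV_CHARGER_KW = 240 * 40 / 1000.0  # 9.6 kW per Level 2 charger

for evs in range(0, 5):
    load_kw = HOMES * AVG_HOME_KW + evs * EV_CHARGER_KW
    utilization = 100 * load_kw / TRANSFORMER_KVA
    print(f"{evs} EVs charging: {load_kw:.1f} kW ({utilization:.0f}% of the transformer)")
```

With these placeholder numbers, a handful of EVs charging simultaneously on one transformer pushes it past its rating, which is exactly the clustering scenario that worries the utilities.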

During a panel discussion on Smart Grid and Electric Vehicles at the Networked Grid event, Matthew Crosby, Regulatory Analyst at the California Public Utilities Commission noted that CPUC is working on ways to link charging of EV’s to available wind capacity.  As wind capacity ramps up, EV’s would be allowed to charge faster but when the renewable power on the grid drops, the EV charging would slow down.  This concept can be extended to integrate charging stations with residential or commercial generation capacity from wind, solar or other sources.  Community microgrids with their localized ability to control supply and demand and to shape power supply based on priority may offer a more comprehensive solution to this issue, especially if adoption of EV’s is unevenly distributed throughout the macrogrid.

New companies are emerging that are building out an infrastructure of charging stations to address the issue of range anxiety.  Some of the leaders in this field who were part of the Networked Grid panel on EVs include Better Place, AeroVironment and Coulomb Technologies.  More than just charging stations, however, these companies are offering vehicle charging services to their customers using advanced networking technologies.

  • Coulomb believes that the value of their company is in networking and software, not hardware.  They monitor their charging stations every 14 minutes and provide an iPhone app that will guide drivers to the nearest unused, in-service, charging station.
  • Better Place offers a battery swap service in addition to charging stations.  They believe that by owning and maintaining the batteries themselves, they can extend battery life by ensuring it is charged under ideal conditions, and they can eliminate the charge-time concern for their customers.

The custom charging stations offered by all of these companies include sophisticated technology to implement demand response based on signals from the utility.  They also contain technology to allow EVs to act as distributed storage for the grid.  This technology, known as Vehicle to Grid (V2G), could in theory be used to regulate voltage on the grid or to meet peak demand without the need for peaking generators.  However, Richard Lowenthal, CEO of Coulomb Technologies, stated categorically that he does not believe V2G is going to happen.  There are a number of concerns with this technology, starting with the fact that customers do not want to return to their vehicle to find the battery depleted as a result of V2G activity.  Other issues include the potential impact of additional charge-discharge cycles on battery life, and hence on manufacturers’ warranty obligations, and the fact that, in the current EVs coming to market, the charging stations do not have a direct connection to the battery and are constrained to work through an on-board charger within the vehicle.  Rob Bearman of Better Place believes that they will be able to lower the cost of charging if they can generate revenue via V2G and that, because of Better Place’s battery swap model, they would be in a position to provide battery storage for the grid using offline battery banks.

Matthew Crosby discussed a debate about whether CPUC and the other public utility commissions that currently regulate electric utilities have the jurisdiction to regulate the pricing structures for public, residential and commercial charging services.  Companies like Better Place and Coulomb don’t actually sell electricity by the kWh as utilities do: Coulomb charges by time and Better Place by mile.  They argue that they are no different from other businesses that sell a product that uses electricity as an input to deliver a service (e.g. laundromats).  The input (electricity) is regulated; the outputs (clean, dry clothes) are not.  CPUC actually ruled on this question on May 21 and sided with the charging infrastructure providers, agreeing that the sale of electricity as a motor vehicle fuel does not make the operator of that facility a utility.  The commission will now move on to look at distribution grid readiness and the associated costs and benefits of various options to enable a viable market for EVs in California.

Clearly there is a lot of work ahead for regulators, utilities, charging infrastructure providers and EV manufacturers, but it does appear that a new era of affordable, economical, clean-energy transportation is at hand, and that the industry is ready to embrace the technical and business challenges to ensure mass-market success for this new generation of EVs.

Posted by: nialljmcshane | May 27, 2010

Microgrids: A New Architecture for the Electricity Grid

May 26, 2010

In my last posting, I summarized the highlights and key themes of the Networked Grid conference that was held in Palm Springs CA on May 18-19.  For many people, the Smart Grid is about smart meters, advanced metering infrastructure and demand response, but the scope of the Smart Grid is much broader than this, so in this column I would like to dive a little deeper into the trend towards decentralization of generation, with the emergence of distributed generation and increased demand for microgrids.  This aspect of the Smart Grid is perhaps the most exciting because it challenges the entire architecture that we have established for electricity generation, transmission and distribution over the past 100 years.

Before discussing the role of microgrids in the Smart Grid, it is important to establish what a microgrid is.  It was clear from the panel discussion at the conference that there isn’t a single accepted definition.  Some participants were discussing microgrids in terms of residential-sized systems while others were discussing campus- or community-wide systems.  Some gave the impression that the presence of distributed generation was sufficient to qualify a project as a microgrid, but distributed generation alone does not define a microgrid.

Wikipedia defines a microgrid as a localized grouping of electricity sources and loads that normally operates connected to and synchronous with the traditional centralized grid (macrogrid) but can disconnect and function autonomously as physical and/or economic conditions dictate.

I’m sure there are other definitions and this one is not perfect.  As somebody on the panel suggested when this was discussed, by this definition, you could make an argument that the Texas interconnect is a microgrid.  Let’s accept this as a working definition, however, as it seems to be a reasonable place to start.

Pike Research forecasts that over 3 GW of new microgrid capacity will come on line globally by 2015, representing a cumulative investment of $7.8 billion.  North America will be the largest market for microgrids during that period, capturing 74% of total industry capacity.  In North America, the largest category will be institutional microgrids, followed by commercial/industrial and community grids.  In other regions, however, the story is different: Pike Research expects community microgrids to be the largest category in Europe and Asia Pacific.

It is important to note that a microgrid is not necessarily green: The generating plants in a microgrid may be renewable energy sources such as wind or solar energy or they could be fuel cells, biomass or gas turbine generators.

When power is cheap on the macrogrid, the microgrid can purchase power at market rates.  However, when demand spikes and the true cost of electricity rises, the microgrid has the capability to meet its own needs from local generating sources helping to reduce demand on the macrogrid.  Depending on local regulations and agreements with the local utilities, microgrids may sell power back into the macrogrid during periods of peak demand.  The sale of energy back into the macrogrid may occur at market rates or, in some regions, generous feed-in tariffs may apply for energy from renewable sources.
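One simple way to think about that decision is as an hour-by-hour comparison of the macrogrid price against the microgrid’s own marginal cost of generation.  The sketch below uses invented prices, costs and capacities purely to illustrate the buy/generate/export logic described above; a real microgrid controller would of course be far more sophisticated.

```python
# Illustrative microgrid dispatch rule: buy from the macrogrid when it is cheap,
# self-generate when the market price exceeds local marginal cost, and export
# any surplus when a feed-in tariff makes it worthwhile.
# All prices, costs and capacities are invented for illustration.

LOCAL_MARGINAL_COST = 0.09   # $/kWh to run local generation (assumed)
LOCAL_CAPACITY_KW = 500.0    # assumed local generating capacity
FEED_IN_TARIFF = 0.12        # $/kWh paid for exported energy (assumed)

def dispatch(demand_kw: float, market_price: float) -> str:
    if market_price <= LOCAL_MARGINAL_COST:
        return f"buy {demand_kw:.0f} kW from the macrogrid at ${market_price:.2f}/kWh"
    generated = min(demand_kw, LOCAL_CAPACITY_KW)
    surplus = LOCAL_CAPACITY_KW - demand_kw
    if surplus > 0 and FEED_IN_TARIFF > LOCAL_MARGINAL_COST:
        return f"self-generate {generated:.0f} kW and export {surplus:.0f} kW"
    return f"self-generate {generated:.0f} kW and buy the remainder from the macrogrid"

print(dispatch(demand_kw=400, market_price=0.07))   # cheap power: buy
print(dispatch(demand_kw=400, market_price=0.16))   # peak price: generate and export
```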

In August 2003, the great Northeast Blackout affected some 10 million people in Ontario, Canada, and up to 45 million people across eight states in the US for up to two days.  Five years later, Scientific American reported research by a team at Carnegie Mellon University in Pittsburgh which showed that, despite enhanced regulations designed to prevent such events, the number of blackouts affecting at least 50,000 people in the US had stayed relatively constant at 12 per year between 1984 and 2006.

Because of their ability to operate independently of the macrogrid, reducing the impact of large scale blackouts, microgrids have been promoted by researchers at UC Berkeley as a way to improve grid reliability and reduce dependence on the long distance transmission grid.  Another advantage that they cite for microgrids is the opportunity to recapture heat that is often a wasted byproduct of electricity generation and use it to heat buildings in the immediate vicinity of the generating station.  This co-generation of heat and power is particularly attractive because it helps to make the economics of microgrids comparable to large central generation models.

An essential element of what makes a microgrid is the capability to control the balance of generating capacity and demand within the confines of the microgrid itself.  This control is essential to ensuring a stable supply of energy to the power consumers served by the microgrid, regardless of whether the microgrid is connected to the macrogrid or “islanded” and operating independently.  Another unique aspect of a microgrid is that, because it is owned and operated by the customers it serves, it can differentiate between users with extremely high power quality requirements and those with lower needs, and it can prioritize the power supply to the essential services within its community.
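A minimal sketch of that prioritization logic, using invented loads and an arbitrary amount of islanded generating capacity, might look like this: serve the most critical loads first and shed whatever the local generation cannot cover.

```python
# Sketch of priority-based load management while a microgrid is islanded:
# serve the highest-priority loads first and shed the rest.
# Loads and capacity are illustrative assumptions only.

loads = [
    ("hospital wing",    1, 300.0),   # (name, priority: 1 = most critical, kW)
    ("data center",      1, 200.0),
    ("office lighting",  2, 150.0),
    ("EV charging",      3, 250.0),
]

def island_dispatch(available_kw: float):
    served, shed = [], []
    remaining = available_kw
    for name, _priority, kw in sorted(loads, key=lambda l: l[1]):
        if kw <= remaining:
            served.append(name)
            remaining -= kw
        else:
            shed.append(name)
    return served, shed

served, shed = island_dispatch(available_kw=700.0)
print("served:", served)   # hospital wing, data center, office lighting
print("shed:  ", shed)     # EV charging
```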

Microgrids can be implemented by different owners to meet different objectives.  Large industrial and commercial users might implement a microgrid to provide cheap combined power and heat generation.  Industrial users with very high power quality requirements might design and implement a microgrid that assures them of a supply that meets their demanding specifications which cannot be realized on the macrogrid.  Government entities such as the DoD are especially interested in microgrids from a power security perspective.  In the event of a coordinated attack on the nation’s power infrastructure, the capability of a microgrid to island from the macrogrid provides resiliency and security for essential services.

In their book, PERFECT POWER: How the Microgrid Revolution Will Unleash Cleaner, Greener, More Abundant Energy, Bob Galvin and Kurt Yeager of the Galvin Electricity Initiative note that the idea of a decentralized model for generating and delivering electricity was what Thomas Edison had originally envisioned.  However, a lack of appropriate technologies to allow such a model to scale on a national basis led to the development of a centralized model in which economies of scale and large scale generation technology advances created a more profitable, manageable, accessible and affordable solution for what was, in the early part of the 20th century, relatively modest electricity demand.  With improvements in technology for small scale distributed generation, computerized controls that allow local grids to optimize their utilization of local or centralized generation and the capability to disconnect from the macrogrid and operate independently when needed, the time has come to re-think the existing model of generation, transmission and distribution.  Galvin and Yeager also note the potential of microgrids to bring electrification to the many parts of our world that do not have access to affordable, sustainable, reliable power today without the need for the costly transmission networks that exist in most western nations.

The Galvin Electricity Initiative has three core objectives:

  • Drive regulatory reform based on a set of Electricity Consumer Principles
  • Develop Perfect Power systems.  They have completed two systems including one at the Illinois Institute of Technology and another at Mesa del Sol, a sustainable community in New Mexico.
  • Raise Awareness through a media and advertising campaign, speaking engagements and personal outreach to key stakeholders.

At the Networked Grid event, a panel discussion on The Microgrid Emergence: Distributed, Intermittent Renewable Power and Storage featured Tom Bialek, Chief Engineer, Smart Grid at San Diego Gas & Electric, Andrew Bochman, Energy Security Lead at IBM, Jack McGowan, CEO, Energy Control Inc. and Terry Mohn, VP and Chief Innovation Officer, Balance Energy.  The panel was moderated by David Leeds, Smart Grid Analyst at Greentech Media, which hosted the conference.

Jack McGowan worked with the Galvin Electricity Initiative on the microgrid at IIT and his company is currently working on a project with the University of New Mexico.  Andy Bochman is working with DoD to reduce dependency on local power companies for US military bases.  Terry Mohn’s company develops commercial microgrids using only renewable sources of generation.

In response to a question about the biggest drivers for microgrid development in the US, the panel noted:

  • Utilities are interested in developing microgrids in order to meet their obligation to serve demand and to improve reliability.
  • Renewable Portfolio Standards in many states are forcing utilities to look at microgrids employing rooftop solar on customer premises, because they cannot obtain approval to build new transmission lines to bring renewable power from the Plains states or the desert southwest.
  • For the DoD, energy security is the overriding concern.  Vulnerabilities associated with fuel price and availability, as well as concerns about the fragility of the macrogrid, are forcing them to evaluate microgrid alternatives.  Scenario planning by the DoD calls for military bases to be able to run autonomously for multiple weeks.  Also, in response to the negative reaction during Hurricane Katrina, when the military bases around New Orleans were fully powered while the local community had no power, DoD wants to look for ways to provide power to essential services off base.
  • There is also a strong economic driver for customers who have the option to sell power back into the grid.
  • Another factor that is important for some utilities is the regulatory practice of decoupling utility profits from increased electricity sales.  Decoupling is a mechanism that regulators can use to incentivize utilities to implement energy conservation and decentralized generation programs that benefit the wider community but may result in lower overall electricity sales by the utility.  Note that some consumer advocates take issue with decoupling because it guarantees utility profits while reducing the utilities’ risk exposure to the lower sales that may result from these initiatives.

One major challenge that the panel noted in the widespread adoption of microgrids was the need to standardize the design of the microgrid so as to avoid costs associated with customizing the installation for each customer.

Another key challenge is a regulatory issue.  Today, in the US and in many other markets, only a public utility or a government entity is authorized to run wires that cross a public street.  This necessarily limits the scope of a microgrid installation to a campus or facility scale.

However, perhaps the biggest challenge to widespread adoption of microgrid technology, at least in modern, industrialized nations, is the need to change the incentives for the utilities so that they embrace this technology as a way to improve grid reliability, even though it may result in lower electricity sales and lower profits for the utilities themselves.  If this technology really takes off in the first world, it will help to drive down the cost of adoption in emerging markets, where the potential to bring cheap, reliable electricity would provide a huge improvement in the living standards of so many people.

May 16, 2010

The Networked Grid conference in Palm Springs exceeded all of my expectations in terms of providing a forum to learn more about the Smart Grid and an opportunity for networking.  The event was very professionally organized and was neither so big that it was impossible to connect with people nor so small that the major players were not in attendance.  I was able to interact with people from many firms in different segments of the industry including utilities, meter manufacturers, AMI networking companies, regulators, data management firms, security providers, transmission and distribution equipment manufacturers and the analysts from Greentech Media who hosted the conference.

In the opening keynote address, Rick Thompson, President of Greentech Media, and David Leeds, Smart Grid Analyst at Greentech Media, presented what they see as the top five trends in the industry at present.

Rather than diving into detail in this posting, I would like to highlight a couple of particularly notable observations that were made over the course of the conference:

  • It was reported that Duke Energy canvassed 70,000 customers for adoption of a new variable rate pricing plan and only 20 signed up.  In a later panel, the 70,000 number was challenged and an alternative number of 10,000 was suggested.  Either way, the uptake was negligible, and that tells us something about customer engagement in Smart Grid.
  • Another interesting statistic that was quoted was that at least 80% of the US public has no idea what the Smart Grid is.  Certainly among people that I talk to, that number doesn’t seem out of line with my experience.
  • It was encouraging to hear Andy Tang, an executive at the PG&E utility in California, state that the Smart Grid is not a thing or a specific project: it is about how the utilities leverage technology to enhance their entire portfolio of business processes.  In fact, there were numerous comments about the need for a system-of-systems approach to implementing the Smart Grid and making sure that the integration points that will allow intelligence to be shared across the enterprise are actually realized.
  • On the other hand, another utility executive expressed concern that the emergence of distributed generation technologies that are affordable at the upper echelon of the homeowners’ marketplace could lead to more affluent homeowners and communities separating from the grid, leaving the utilities to serve only the less affluent and creating a divide between the haves and the have-nots in energy.
  • Terry Vardell of Duke Energy made a statement to the effect that demand response was a temporary solution to manage peak demand for a couple of years until the utilities got their generation portfolios to a more sustainable level.  Frankly, I think this idea is very far-fetched.  With demand continuing to increase, issues with permitting for new generation plants and transmission lines, and the reality that nobody is about to go through a wholesale decommissioning of the coal and gas fired plants that we have today, I believe that demand response in some form is a critical technology for the long term.  I should note that other utility execs viewed demand response as the cheapest solution for peak demand issues, so the view expressed above appears to be an outlier even within the utility community.
  • A discussion on the Smart Grid’s killer app was notable for its failure to identify any new ideas about what that killer app might be.  There was some discussion of demand response and electric vehicle charging as the killer apps but these are known potential applications of Smart Grid technology already and each has some issues around consumer uptake which would be necessary for them to be truly killer apps.  This shouldn’t be a surprise however.  The killer apps will emerge only when the infrastructure is in place and innovators can begin to see the possibilities to create those applications.
  • Stephen Johnston, CEO of SmartSynch, an AMI solutions provider that partners with commercial cellular operators, sparked some debate with a bold prediction that commercial cellular is going to be the technology of choice for all utilities going forward, due to new and more competitive pricing now being offered by the major cellular operators at the rate of pennies per meter per month, compared to $15 per meter per month 10 years ago.  He makes a strong case, but the underlying issue is that utilities, at least in the US, are regulated in such a way that they can earn a return on investment for the capital costs of building private networks but not for the operating cost of paying a cellular provider to use its public network.  Of course, if the operational cost is low enough, it can undercut the operational cost of running the private network, so there is clearly a tipping point (see the cost sketch after this list).  The exact price at which that tipping point occurs will vary based on the size of the utility, and the commercial cellular approach may be especially attractive to small and mid-sized utilities.
  • Erich Gunther, CTO of EnerNex, gave a wonderful workshop presenting an overview of the Smart Grid based on the NIST Smart Grid Conceptual Model.  I highly recommend this model to anybody seeking to learn more about the scope of the Smart Grid.
  • Richard Lowenthal is CEO of Coulomb Technologies, which manufactures EV charging stations that include Vehicle to Grid (V2G) technology allowing EVs to act as a distributed storage asset for the grid.  Despite this, he stated unequivocally that he does not believe V2G will ever become a reality, both because consumers will not want to come back to find their vehicle battery holding less charge than when they left it, and because of concerns about the effect of additional charge/discharge cycles on battery life and hence on manufacturers’ warranty obligations.
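For illustration, here is the rough cost comparison behind the cellular-versus-private-network tipping point mentioned above.  Every figure is an invented assumption, and the sketch deliberately ignores the regulatory rate-of-return treatment that Johnston identified as the real sticking point, but it shows how a low enough per-meter fee can tip the economics toward commercial cellular.

```python
# Back-of-the-envelope comparison of a utility-owned AMI network versus paying a
# commercial cellular carrier per meter per month.  Every figure here is an
# illustrative assumption, not a quoted price.

METERS = 500_000
YEARS = 10

PRIVATE_CAPEX_PER_METER = 60.0     # assumed build-out cost per meter
PRIVATE_OPEX_PER_METER_YR = 6.0    # assumed annual O&M per meter

def private_network_cost() -> float:
    return METERS * (PRIVATE_CAPEX_PER_METER + PRIVATE_OPEX_PER_METER_YR * YEARS)

def cellular_cost(monthly_fee_per_meter: float) -> float:
    return METERS * monthly_fee_per_meter * 12 * YEARS

for fee in (0.25, 0.50, 1.00, 2.00):
    cheaper = "cellular" if cellular_cost(fee) < private_network_cost() else "private"
    print(f"${fee:.2f}/meter/month over {YEARS} years -> {cheaper} network is cheaper")
```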

Several threads ran through the event which I will expand on in future postings.  These include:

  • The battle over technology choices, standards and the reality that, in the end, multiple technologies will be successful as the networks segment into tiers based on different latency, bandwidth, availability and coverage requirements.
  • The impact of the anticipated arrival of affordable electric vehicles not only on demand but also on grid reliability and regulatory policy.
  • The trend towards decentralization of generation, with the emergence of distributed generation and increased demand for microgrids.  These are self-contained entities, ranging from small rural installations to large campus facilities, that contain generation, distribution and possibly storage assets, together with the intelligence to function independently of the main grid, either at the behest of the utility or at the desire of the owner of that infrastructure.
  • The impact of regulation on the emergence of the Smart Grid and the various technologies that surround it and the steps that regulators are taking to move the industry towards a smarter, cleaner, more sustainable future.
  • Issues around customer engagement including market segmentation, the value of data vs information vs a managed service that provides real intelligence and guidance to consumers, concerns about privacy and security and the role of technology in driving higher ROI for consumers.
  • The expected surge in data volumes that are produced by the Smart Grid, the technical and business challenges this presents in terms of managing those data and the potential value that can be unlocked from integrating the data across the business processes and developing analytics around the resulting information.

Finally, an irony that was not lost on me: we were in Palm Springs, where the daytime high temperature on Wednesday was 97°F (about 36°C), talking about the importance of reducing peak electricity demand, yet the conference center was so over-air-conditioned that many people were shivering in their seats.  When we stop and think about how many of the world’s 6 billion people have limited or no access to electricity, and contrast that with the perception of electric power as a basic right and expectation, not only in the US but in the western world in general, it really helps to put the importance of the Smart Grid in perspective.  Think for a moment about the cost of fuel and the impact on the environment if the emerging nations and third world countries were to emulate the western world’s consumption patterns for electric power with the same mix of generating technologies.  And why shouldn’t they seek to emulate what they see in the developed world?  Electric power, and all the electronic gadgets that demand more and more of it every year, have added enormously to our productivity, comfort and quality of life.

With a few exceptions, the conference focused on the impact that the Smart Grid will have in the US and, to a lesser extent, other developed nations.  But, as with so many other technologies, what we develop to tackle problems that are a by-product of our affluence can also be used to help address problems that result from poverty around the world.  In the US and other developed nations, we built large-scale transmission and distribution grids because that was what was required to transport power from large centralized generating stations to the demand centers where it would be consumed, in an era when localized generation, and particularly renewable generation, was not a viable option.  The capital cost of building transmission and distribution in rural parts of India or Africa is overwhelming, but microgrids offer an alternative.  Just as many rural areas in India and Africa have completely bypassed landline telephone systems and gone directly to cellular, so too they will bypass the centralized generation, transmission and distribution model and leverage new, lower-cost wind and solar generating technologies, together with storage to offset intermittency, in order to provide electric power to their communities.

Posted by: nialljmcshane | May 16, 2010

Chicago’s Clean Energy Economy

May 16, 2010

This week, I am travelling to Palm Springs CA to attend the Networked Grid 2010 conference.  In addition to being a great opportunity to network with some of the top names in the smart grid industry, I am also looking forward to a very full and exciting agenda of discussions and presentations that will serve to enhance my knowledge and understanding of the smart grid.  That should provide plenty of material for updates to the blog in the coming weeks.

Last Tuesday, May 11, I attended an event entitled Growing Chicago’s Clean Energy Economy co-presented by the Environmental Law & Policy Center, Chicagoland Entrepreneurial Center and Chicagoland Chamber of Commerce and sponsored by Crain’s Chicago Business.  The event included several panel discussions featuring representatives of the City of Chicago, the State of Illinois, the host organizations and several clean energy firms based in the Chicago area.  Refer to the end of this article for a list of the key participants and the organizations and firms that they represent.

In his opening remarks, Howard Learner of the Environmental Law & Policy Center noted that the state of Illinois has one of the best renewable power standards in the country.  This includes a state mandate for 20% renewable generation by 2020 with a specific carve-out of 6% for solar power.  This solar power carve-out is especially important because solar generation peaks during the afternoon hours and actually leads the peak demand cycle, thereby offering an effective way to reduce demand for coal- and oil-fired peaking plants.  State and federal policies as well as economic changes are also expected to reduce electricity demand by 15% by 2020.

Suzanne Malec-McKenna, Commissioner, Department of Environment, City of Chicago, spoke about the Chicago Climate Action Plan, which drives the city’s approach to energy-efficient building codes, clean and renewable energy sources, improved transportation options, reduced waste and industrial pollution, and adaptation to changes that are already occurring in our environment.  She noted that this is a living, dynamic document that must be constantly updated to respond to changes in state and federal regulations and initiatives.

Manny Flores, Acting Chairman of the Illinois Commerce Commission, the main regulatory authority in Illinois spoke about a number of exciting green developments occurring in Chicago.  These included the Chicago Green Exchange, a sustainable retail and office development and the 35 acre Lathrop Homes site which is being redeveloped following the new LEED for Neighborhood Development guidelines from the US Green Building Council.

I had an opportunity to talk privately with Chairman Flores during the break.  He referred me to the Illinois Statewide Smart Grid Collaborative and urged that representatives of industry and communities become more actively involved in contributing to the discussions taking place around the smart grid strategies for the state.  Among other assets available at this site is the February 22, 2010 Facilitator’s Progress Report to the Illinois Commerce Commission, which details the progress made by working groups dealing with Smart Grid Applications and Technologies, Consumer Policy Issues, and Technical Characteristics and Requirements.  Two additional working groups, dealing with the Cost-Benefit Framework and Utility Smart Grid Filing Requirements, had yet to start work at the time of the February report.  The collaborative is scheduled to complete its work by October 2010, so there is still an opportunity for interested parties to get involved and help to shape the outcome.

Jay Marhoefer, founder and CEO of Intelligent Generation, made a number of interesting contributions to the discussion from the floor, urging the participants to think less conventionally in terms of large centralized generation owned by the utilities and to focus instead on the parallels with the PC and internet revolutions and the potential for democratizing the generation of power.  Intelligent Generation has developed a hardware and software solution that optimizes energy cost savings and reduces the payback time for wind and solar generation capacity by determining the most cost-effective time to acquire energy from distributed generation sources or the grid and storing that energy for use when it is most valuable.  This is a great example of the type of innovation that is occurring around the smart grid.
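As a very rough sketch of the storage-arbitrage idea behind this kind of product (the hourly prices and battery size below are invented, and this is not Intelligent Generation’s actual algorithm), the value comes from charging in the cheapest hours and discharging in the most expensive ones.

```python
# Minimal sketch of storage arbitrage: charge the battery in the cheapest hours
# and discharge it in the most expensive ones.  Prices and battery size are
# invented; a real system would also model forecasts, efficiency losses and
# solar/wind output.

hourly_price = [0.05, 0.04, 0.04, 0.05, 0.07, 0.10,
                0.13, 0.15, 0.16, 0.14, 0.11, 0.08]  # $/kWh, assumed
BATTERY_KWH = 4  # shift 1 kWh in each of 4 charge hours and 4 discharge hours

cheapest = sorted(range(len(hourly_price)), key=lambda h: hourly_price[h])[:BATTERY_KWH]
priciest = sorted(range(len(hourly_price)), key=lambda h: -hourly_price[h])[:BATTERY_KWH]

cost = sum(hourly_price[h] for h in cheapest)      # buy 1 kWh in each cheap hour
revenue = sum(hourly_price[h] for h in priciest)   # avoid buying 1 kWh in each peak hour
print(f"charge hours {sorted(cheapest)}, discharge hours {sorted(priciest)}")
print(f"value of shifting {BATTERY_KWH} kWh: ${revenue - cost:.2f}")
```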

In response to a question about the key technologies that universities should be teaching and researching in order to drive the wind power industry forward, Andy Cukurs, CEO, North America, Suzlon Wind Energy Corporation, replied that he would like to see more emphasis on remote monitoring and diagnostics and on better transmission and interconnection technologies.  These are all technologies associated with the smart grid and are not unique to wind power.  He also mentioned the need for improved coating technology that would prevent ice build-up on turbine blades, which can cause inefficiency in operations.  Another great example of this type of innovation is the evolving technology for energy storage at substations or in local communities, as referenced in an article in Eco Periodicals, supplied by another Chicago-based company, S&C Electric.

There was a good deal of discussion about the regulatory environment.  In Europe and other parts of the world, generous feed-in tariffs that allow operators of solar or wind power plants to sell electricity back to the utilities at higher prices on long-term contracts have been used to stimulate the growth of the clean energy economy.  In the US, a system of tax credits is used, but there are several limitations with these credits.  For one thing, they cannot be directly used by a startup company that has no profit to pay taxes on, so a secondary market has emerged that allows such firms to sell the credits and monetize them to help fund their investment.  Another problem with the US approach is the lack of certainty for companies, because many of these tax credit schemes are temporary and subject to periodic renewal.  However, as Madeline Weil of the Environmental Law & Policy Center noted, overly generous feed-in tariffs boosted the manufacture of solar PV products, but some of those tariffs were later revised downwards, causing a drop in demand that has dramatically lowered the price of solar PV while putting manufacturers of this technology at some risk.

Perhaps the most well-received statement of the entire event came during the Q&A portion of the final panel, on the issue of clean energy financing.  Michael Eckhart, President of the American Council on Renewable Energy, noted that the biggest impediment to renewable energy financing is the fuel cost pass-through, which essentially absolves fossil fuel based generating plants from having to account for the cost of fuel in preparing their rate cases to the utility commissions.  Based on discussions with people in the industry, he believes that if gas-fired generating plants were required to lock in a price for 20 years, as is often the case with wind and solar plants, the price that they would have to bid would rise to around 15c/kWh, which would be on a par with or higher than current wind and solar rates in many markets.

Among the community and policy leaders participating were:

Industry representation included:

  • Sean Casten: President and CEO of Recycled Energy Development, a firm that specializes in capturing waste energy from industrial processes and turning that into heat and power.
  • Andris Cukurs: CEO North America of Suzlon Energy Corporation, a wind turbine manufacturer and operator of wind farms.
  • Peter Duprey: CEO of Acciona Energy North America, who are involved in building sustainable infrastructure, desalination and all forms of renewable power.
  • Declan Flanagan: CEO of Lincoln Renewable Energy, a developer of solar and wind power projects.
  • Amy Francetic: Managing Director & Founder of Invention Bridge, who provide innovation consulting services and investment capital to help commercialize science based research.
  • Pete Kadens: President of SoCore Energy, an innovator and pioneer in the solar industry providing solar power solutions to municipal and commercial clients.
  • Lois Scott: President of Scott Balice Strategies, a provider of public and corporate finance expertise.
