Posted by: nialljmcshane | June 5, 2010

The Smart Grid Data Tsunami: Myth or Monster?

Continuing the exploration of key themes from Greentech Media’s Networked Grid conference, held in Palm Springs, CA on May 18-19, this column looks at the issue of exploding data volumes and their impact on utility business processes.

As I reported in my summary of the conference, Andy Tang of PG&E stated that the Smart Grid was not a specific thing or project but rather it was about how the utilities leverage technology to enhance their entire portfolio of business processes.  Linda Jackman, Group VP of Product Strategy and Management at Oracle Utilities gave a very engaging and entertaining keynote address on day 2 of the conference in which she echoed this view.  She noted the need for utilities to think about their application portfolio in a different way: moving from a silo-based operational view to a true enterprise view of data needs and value.

Jackman also quoted a forecast that says Smart Grid will increase the data volumes handled by utilities by 700%, to around 800 TBytes.  With this increase in data volume comes a series of challenges for the utilities:

  • What to keep, what to discard and how long to retain each piece of data
  • Where each piece of data is required across the organization
  • Interoperability of applications to avoid data duplication
  • Overarching authority for key systems
  • Security and privacy of customer data
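The first of those challenges, what to keep and for how long, can be sketched as a simple tiered retention policy. The data categories and retention periods below are illustrative assumptions for the sketch, not regulatory requirements or utility standards.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative retention tiers -- the categories and periods here are
# assumptions for the sketch, not actual utility or regulatory rules.
RETENTION_DAYS = {
    "interval_reads": 90,         # raw 15-minute meter reads
    "daily_aggregates": 365 * 2,  # billing-grade daily totals
    "billing_records": 365 * 7,   # customer billing history
    "grid_events": 365 * 5,       # fault and power-quality events
}

@dataclass
class DataRecord:
    category: str
    created: date

def should_retain(record: DataRecord, today: date) -> bool:
    """Keep a record only while its category's retention window is open."""
    limit = RETENTION_DAYS.get(record.category)
    if limit is None:
        return True  # unknown categories are kept pending classification
    return (today - record.created) <= timedelta(days=limit)
```

The point of making the policy explicit and centralized, rather than letting each silo decide on its own, is exactly the enterprise-wide view of data that Jackman called for.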

While this increase in data volume is significant, we must also recognize that other industries including telecom and financial services routinely deal with comparable data volumes today and have evolved mechanisms for handling the challenges that go along with these volumes of data.

In his workshop on Power Layer Infrastructure Technologies and Network Communications Layer Architectures, Erich Gunther, Chairman and CTO of EnerNex, expanded on this topic, noting that, historically, utilities have built backoffice applications in isolation.  As a result, when the need arises to use data from multiple applications in a combined way, the integration is typically done as a custom project at considerable expense to the utility.  When applications are built in this way, they tend to have significant structural issues including redundancy and inconsistency of data across the system, inability to scale the system as demand grows, and inconsistent approaches to user interfaces, network management capabilities, security and data protection.  Implementing these functions in legacy systems as an afterthought is always more difficult and more costly than having them designed in from the start and, in many cases, implementation barriers prevent a fully integrated solution from ever emerging in legacy systems.

The applications that make up a Smart Grid enabled utility’s portfolio will include:

  • AMI/AMR and meter data management systems: automated meter reading capabilities and the associated backoffice systems to manage the meter data for billing and customer awareness purposes.
  • Demand Management: applications aimed at increasing the efficiency of the transmission and distribution grid, responding to peak demand events via increased supply or demand response programs, and increasing the integration of renewable energy sources (including distributed, non-utility-owned sources), energy storage, etc.
  • Grid reliability and Stability: Automation of configuration management, monitoring, fault detection, isolation and recovery of the transmission and distribution network via both hardware and software means
  • Microgrids: monitoring and controlling the microgrid infrastructure and its interface to the macrogrid.
  • EV Supervisory Systems:  monitoring vehicle and battery information, controlling supply of electricity and providing real-time pricing and load information.
  • Spatial Analytics: Integration of network status and events in relation to network topology in a geographical context.
  • Mobile workforce management: real time scheduling and crew optimization

What is notable about these applications is that they cut across many operational silos and functional disciplines and they include hardware, software and service components.  Even among these applications, there are interface points and shared data.  For example, smart meters that are part of an AMI deployment can signal distribution faults and power quality issues which can be used to trigger automated fault recovery actions or be integrated with spatial analytics and mobile workforce management to expedite dispatching of a repair crew for faults that cannot be automatically recovered.
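The cross-application flow just described, where a meter fault either triggers automated recovery or falls through to crew dispatch, can be sketched as a small event router. The event fields and handler names below are illustrative assumptions, not any particular vendor's API.

```python
from dataclasses import dataclass, field

@dataclass
class MeterEvent:
    meter_id: str
    kind: str             # e.g. "distribution_fault", "power_quality"
    auto_recoverable: bool

@dataclass
class EventRouter:
    """Routes meter events to the application that should handle them."""
    recovered: list = field(default_factory=list)
    dispatched_crews: list = field(default_factory=list)

    def handle(self, event: MeterEvent) -> str:
        if event.auto_recoverable:
            # The grid reliability application attempts automated recovery.
            self.recovered.append(event.meter_id)
            return "auto_recovery"
        # Otherwise, integrate with spatial analytics and mobile workforce
        # management to dispatch a repair crew to the fault location.
        self.dispatched_crews.append(event.meter_id)
        return "crew_dispatch"
```

Note that the router touches three of the applications listed above (AMI, grid reliability, mobile workforce management), which is precisely why a silo-by-silo architecture makes this kind of flow expensive to build.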

To be effective, data needs to be able to flow freely and easily within and among these applications, and the applications need well-defined APIs so that separate application modules can be integrated in a cost-effective manner.

If successfully implemented, these applications will enable an energy information economy and home energy services market that spurs innovation in the sector.

An interesting analogy is Amazon’s Kindle e-reader technology.  For many of us, the Kindle is a pretty neat device that allows users to download and store multiple books, periodicals, newspapers, etc., and read them on the go, even in bright sunlight.  While the Smart Grid industry is struggling to define the killer app that will create consumer demand for smart meters, the Kindle has that problem licked.  But extend the analogy a little further and a more significant difference emerges.  To Amazon, the Kindle isn’t merely a device, it’s a service, and the entire service has been architected from the top down to deliver maximum customer value and return on investment to the business.  The Kindle device is merely the customer interface point for a vast array of technology including cellular network connections to download content, marketing, customer account management, and backoffice systems to manage users’ purchases, subscriptions, preferences, etc.

Newer software architectures have been developed that define standard interfaces incorporating security, network management, standardized data models and other considerations from the outset.  These architectures are designed to be modular and scalable.  New functionality can be added easily by designing (or purchasing) a module that plugs into the standard interfaces.  Existing functionality can be upgraded by replacing a module with a newer module from the same, or a different vendor.
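The modular, swappable architecture described above boils down to applications depending on a standard interface rather than on any one vendor's implementation. Here is a minimal sketch; the interface name, methods and vendor modules are hypothetical, invented for illustration.

```python
from abc import ABC, abstractmethod

class MeterDataModule(ABC):
    """Standard interface that any meter-data module must implement,
    so modules from different vendors can be swapped freely."""

    @abstractmethod
    def read_usage(self, meter_id: str) -> float: ...

class VendorAModule(MeterDataModule):
    def read_usage(self, meter_id: str) -> float:
        # Stubbed reading; a real module would query the AMI head-end.
        return 12.5

class VendorBModule(MeterDataModule):
    def read_usage(self, meter_id: str) -> float:
        # Same contract, different internal implementation.
        return 12.5

def monthly_report(module: MeterDataModule, meter_id: str) -> str:
    # Application code depends only on the interface, so either vendor's
    # module can be plugged in without changing anything here.
    return f"{meter_id}: {module.read_usage(meter_id)} kWh"
```

Because `monthly_report` knows nothing about either vendor, upgrading or replacing a module requires no change to the applications that consume it, which is exactly the property these newer architectures are designed to guarantee.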

One particular model that is worth considering is the Frameworx model from the TM Forum, an industry association with over 700 members including service providers, network operators, network equipment suppliers, software suppliers, system integrators, content providers and consumer electronics manufacturers.  The Frameworx model provides a service oriented approach using standard, reusable components to enable service providers to rapidly build and deploy new services, reduce operational costs and improve business agility.  The Frameworx model comprises:

  • Business Process Framework (eTOM) is the industry’s common process architecture for both business and functional processes.  This component provides a standard set of business processes, with architectural and design components to fully define organizational structures and product solutions.  This can facilitate the purchase and integration of commercial solution packages by providing a reference model against which to evaluate the functionality of those products.  It also provides a reference to guide the generation of requirements for in-house development of specific business process components.
  • Information Framework (SID) provides a common reference model for Enterprise information that service providers, software providers, and integrators use to describe management information consistently across the enterprise.  This takes the form of definitions of common entities such as customer, service, network and the attributes that describe such entities thereby helping to normalize data representations across the enterprise and simplifying the integration of components into an end to end system.
  • Application Framework (TAM) provides a common language between service providers and their suppliers to describe systems and their functions in terms of the business processes that they implement and the information components that they operate on, as well as a common way of grouping them.
  • Integration Framework provides a service-oriented integration approach with standardized interfaces and support tools.
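To make the Information Framework (SID) idea concrete: every application references one canonical definition of common entities such as customer and service, rather than carrying its own copy. The sketch below uses invented fields for illustration; the real SID model is far richer than this.

```python
from dataclasses import dataclass

# Shared entity definitions -- one canonical "Customer" and "Service"
# used by billing, outage management and analytics alike.  The fields
# are illustrative assumptions, not the actual SID definitions.

@dataclass(frozen=True)
class Customer:
    customer_id: str
    name: str

@dataclass(frozen=True)
class Service:
    service_id: str
    customer_id: str   # key into the shared Customer entity
    service_type: str  # e.g. "electricity_supply"

def services_for(customer: Customer, services: list) -> list:
    """Any application can join on the common key without bespoke mapping."""
    return [s for s in services if s.customer_id == customer.customer_id]
```

With a common model like this, the expensive custom integration projects Gunther described become simple joins on shared keys, which is the normalization benefit the SID is meant to deliver.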

To add further value, TM Forum is also working on a service generation toolkit based on the Frameworx model, which can be used to generate a web services implementation of a service along with Java code stubs for the application.

There is a range of commercial products that support the Frameworx architecture and TM Forum has been working with other industries including the digital media industry and financial services industry to expand the model beyond its origins in telecom.  There is an opportunity to leverage the decades of work that TM Forum has done to benefit the utility industry and at least one Australian utility has already used this model to implement an operational support system for parts of their business.

Without doubt, the utility industry has a challenge ahead as it enters a new networked, intelligent environment.  There will be false starts and unexpected successes.  There will be winners and losers.  But there is no need to start from scratch.  The data volumes that utilities need to handle are not uniquely large, and most of the data management problems that the utility industry is facing have already been solved to a greater or lesser extent.  Yes, there are unique aspects to the utility environment, and these will need to be identified and addressed, but we cannot afford to reinvent any wheels if we are going to realize the benefits of the Smart Grid in a timeframe that allows us to cap the ever-increasing demand for energy and the consequent environmental impact of that demand.  Our economy and our environment demand that we leverage the tools that already exist to the greatest extent possible and evolve them to meet the needs of the Smart Grid.

I would like to acknowledge the assistance of John Sheehy, Juan Nodar and Dave Milham of the TM Forum in reviewing and providing input for this article.



  1. The applicability and usability of the TM Forum Frameworx model to energy utilities is spot on. We have successfully utilized the billing operations and Revenue Assurance KPI models to develop utility-specific professional services offerings for our company. To me, as a telco industry veteran just getting to know the utilities world, it looks and smells very much like telco did 10 years ago post-deregulation. I can only hope that folks will pay attention to the lessons learned as documented by folks like TMF and GRAPA. Utilities do not need to go through the same back-office mistakes that have severely impacted the industry, and sent many telco start-ups into bankruptcy.

    • Eric – thanks for the positive response. I will contact you offline to see if you are interested in a project to help leverage the Frameworx model for utilities.
