An Enterprise Architecture Blueprint for Utilities – Part 5:

Technology, Dev Tales, Blog posts

“Strategy is figuring out what not to do.”

(Steve Jobs, Co-Founder and CEO of Apple)

In the last post (part 4) of my Enterprise Architecture Blueprint series, I proposed a slightly modified version of Gartner’s “Pace Layered Architecture” for the Utility 4.0.

Gartner defines its approach as follows: A pace-layered application strategy is a methodology for categorizing, selecting, managing, and governing applications to support business change, differentiation, and innovation.

Let us quickly recap:

Utilities operate critical infrastructure and provide vital services to all people and society as a whole. They must, at least from an IT perspective, balance security, resilience, reliability, and stability with innovation, agility, and speed of delivery. Therefore, the “Pace-Layered Architecture for Utility 4.0” proposes the following layers:

  • System of Record (SoR): These are packaged applications, typically Commercial off-the-Shelf products (COTS) or legacy homegrown systems. They support your organization’s “commodity” business capabilities, handling core transaction processing or managing your reference and master data.
  • System of Connectivity (SoC): A modern data integration layer based on the concept of an Event Driven Architecture (EDA) and reactive microservices. These systems handle all data flows and data exchange between all the Systems of Record and also between the layers to establish the entirely digitized operational and business processes that define the Utility 4.0.

    This layer highlights the value of a modern data management approach and turns the spotlight on the importance of the “dirty job of data integration”.
  • System of Intelligence (SoI): A big energy data management and analytics infrastructure layer used to deliver the cognitive capabilities for the Utility 4.0. These capabilities include smart grid edge analytics to create insight and transparency in the network. Further examples of cognitive use cases include intelligent grid operations, asset health, predictive maintenance, demand / load forecasting, EV charging optimizations and identification of abnormalities to name just a few.

    This layer fosters the cognitive capabilities the future utility needs in order to achieve the Sustainable Development Goals and its own business objectives.
  • System of Engagement: This is a digital user experience layer. It provides the services and digital excellence the digital native consumers expect, but also the utilities’ employees need so they can handle the huge amount of data now flooding into their organizations.
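The System of Connectivity described above rests on publish/subscribe event flows between systems. As an illustration only, here is a minimal in-memory sketch of that idea in Python; the topic name, event fields, and subscribing systems are hypothetical stand-ins for a real broker such as Kafka and real SoR/SoI applications.

```python
from collections import defaultdict
from typing import Any, Callable


class EventBus:
    """A tiny in-memory publish/subscribe broker (a stand-in for a real
    event backbone such as Kafka in an Event-Driven Architecture)."""

    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable[[dict], None]]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Every interested system reacts to the event; the publisher
        # does not need to know who consumes it, which decouples the layers.
        for handler in self._subscribers[topic]:
            handler(event)


bus = EventBus()
received: list[tuple[str, Any]] = []

# Hypothetical consumers: a System of Record (billing) and a System of
# Intelligence (analytics) both react to the same meter-reading event.
bus.subscribe("meter.reading", lambda e: received.append(("billing", e["kwh"])))
bus.subscribe("meter.reading", lambda e: received.append(("analytics", e["kwh"])))

bus.publish("meter.reading", {"meter_id": "M-001", "kwh": 42.5})
```

The point of the sketch is the decoupling: the meter data publisher has no knowledge of the billing or analytics consumers, which is what lets the SoC connect all layers without hard-wiring point-to-point integrations.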

Utilities must balance stability and security with speed of delivery to foster innovation and to accelerate the digital transformation required to operate the future distributed energy system.

Architecture Strategies

Now I would like to discuss some architecture ideas or strategies building further on this architecture concept.

To use the words of Steve Jobs, Co-Founder and CEO of Apple:

“Strategy is figuring out what not to do.” Let us start by discussing the “Make vs. Buy” strategy.

Many utilities I work with have begun to understand the value of having their own IT personnel in-house: (enterprise) architects, data scientists, or development resources. Many organizations are building up both competency and capacity. And this is great.

Where should utilities put their focus? Should they spend their efforts and resources on developing core operational support systems or business applications? In other words, on applications that are typically “commodity” or “non-differentiating”? And then spend even more resources on operating, supporting, and maintaining those systems?

Or should utilities instead focus their own people and resources on delivering innovative and differentiating services?

I think it is obvious that it would not make sense for your utility to invest in the development (plus operations, plus maintenance, plus support, plus security updates, plus technology upgrades) of a homegrown MDM / VEE, CC&B / CRM, asset management, or GIS system. Applications in the SoR layer are rarely differentiating and provide widely used and similar capabilities across many utilities.

SoR layer: I would recommend a “Buy” strategy for the Systems of Record. Utilities can operate “Commercial off-the-Shelf” solutions with minimal customization, provided that open APIs and interfaces enable automated, digitized processes and data integrations. I would even suggest a “SaaS-first” strategy to minimize investment and to foster the “vanilla” implementation concept.

SoE layer: I would suggest a “Make” strategy for the Systems of Engagement. Utilities can then build an entirely digitized user experience facing both consumers / clients and employees, supporting a utility-specific customer journey for external and internal users alike. And, since we need maximum flexibility and elastic scalability, I would further suggest a “Cloud-first” strategy, meaning the entire SoE operates in a cloud environment.

SoI layer: I would recommend a “Make” strategy for the Systems of Intelligence. Utilize your organization’s energy data by leveraging modern big data, data science, analytics, AI, and machine learning technologies to create the cognitive solutions that are specific and unique to your business, your region and geography, and the regulations you operate under. Utilities should consider a “PaaS” strategy. This offers the benefits of managed platform services and empowers your data scientists, analysts, and developers to use and integrate their preferred tools (e.g. Tableau, Power BI), libraries or frameworks (e.g. TensorFlow, Apache Spark), or programming languages (e.g. Python, R).
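To make the SoI idea concrete, here is a minimal, hedged sketch of one of the cognitive use cases mentioned earlier, demand / load forecasting. It implements only a seasonal-naive baseline (forecast each hour as the average of the same hour on previous days) on synthetic data; a production SoI would build on this with proper ML tooling such as Spark or TensorFlow.

```python
from statistics import mean


def seasonal_naive_forecast(hourly_load: list[float], horizon: int = 24) -> list[float]:
    """Forecast the next `horizon` hours as the average of the same
    hour-of-day across all previous days: a common baseline to beat
    before investing in heavier machine learning models."""
    period = 24  # hours in the daily seasonal cycle
    forecast = []
    for h in range(horizon):
        # All historical observations for this hour of day.
        same_hour = hourly_load[h % period::period]
        forecast.append(mean(same_hour))
    return forecast


# Two synthetic, identical 24-hour days of load data (illustration only).
history = list(range(24)) * 2
forecast = seasonal_naive_forecast(history)
```

The value of such a baseline is as a yardstick: a candidate model for the SoI layer only earns its complexity if it forecasts demand better than this trivially cheap seasonal average.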

SoC layer: I would recommend a “Buy” strategy for the System of Connectivity layer, coupled with a “Cloud-Native” platform strategy.

To be fair, I am biased. We at Greenbird provide the leading iPaaS purpose-built for utilities, our Utilihive solution. It offers both integration and data lake infrastructure. In addition, Utilihive comes with pre-configured utility-specific building blocks and data integration accelerators (e.g. connectors, data flows, orchestrations, data services, analytics dashboards, etc.).

“For every dollar you spend on an application, you spend 5 on integration. Why?” This really doesn’t have to be the case.

Should your utility really spend time on system integration, working hard to break up silos, cleanse data, and make information accessible? Or should you instead utilize the “good data” to develop the “System of Intelligence” services that drive the energy transition?

To me, the answer is clear. I think the SoC layer should be based on an enterprise iPaaS providing an “out-of-the-box” solution to the system integration and big data management challenge. In the future, many applications will still be operated in the utility’s data center or private cloud due to regulations and security concerns. This suggests the SoC should be “cloud-native” and support true hybrid integration scenarios.

Coming back to Steve Jobs and his quote about figuring out what not to do, I think he might have said: “Stay hungry. Stay foolish – but don’t make your own SoRs and don’t build the SoC yourself. Focus on what creates unique value for you - the SoIs and SoEs!”.

In my next post, I will continue to develop the concept of the cognitive utility. Are you ready to learn about this concept?