Enterprise Architecture Blueprint for Utilities – Part 3


“I am a Man of Principle”

“I am a man of fixed and unbending principles, the first of which is to be flexible at all times” – a quote from former US Senator Everett McKinley Dirksen. He most probably had politics in mind in his role as senator, but the statement would suit an enterprise architect equally well!

In part 1 – “The Good, the Bad and the Ugly” – of our Enterprise Architecture Blueprint, I focused on the technical debt utilities have built up over the years due to an “Accidental Architecture”. In part 2 – “Bad choices make good stories” – I identified some KPIs for a good architecture. Now, in this post, it’s time to move on and talk about architecture principles.

Architecture Principles

According to The Open Group Architecture Framework, better known as TOGAF, principles are “general rules and guidelines, intended to be enduring and seldom amended, that inform and support the way in which an organization sets about fulfilling its mission.” TOGAF adds: “Architecture Principles are a set of principles that relate to architecture work. They reflect a level of consensus across the enterprise, and embody the spirit and thinking of existing enterprise principles. Architecture Principles govern the architecture process, affecting the development, maintenance, and use of the Enterprise Architecture.”

TOGAF lays out a set of 21 example principles, of which some experts would single out eight as the ones you need to know. As a man of principle, let me introduce some of the most important ones that, at least in my opinion, are applicable for the future digital utility.

Interoperability and Integration-friendly: The digital utility of the future requires entirely digitized value chains and automated processes. To achieve this, all applications and systems should be built with integration and interoperability in mind and expose documented interfaces using modern integration methods.

Standardization of Data and Meta Data: The Common Information Model (CIM) has emerged as a de facto standard for the power industry and for transmission and distribution system operators. In addition, we see more and more vendors aligning their products with CIM. Implementing a unified information model inspired by CIM creates measurable benefits for utilities.
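To make this concrete, here is a minimal, hedged sketch of what a CIM-inspired model could look like in code. The class names echo real CIM concepts (IEC 61968/61970 define UsagePoint, MeterReading and the mRID identifier), but the attributes below are deliberately simplified and illustrative, not the normative CIM classes:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Simplified, CIM-inspired classes. Real CIM (IEC 61968/61970) defines
# these in far more detail; attribute names here are illustrative only.

@dataclass
class UsagePoint:
    mrid: str               # CIM-style master resource identifier (mRID)
    service_location: str   # where the meter is installed

@dataclass
class Reading:
    usage_point: UsagePoint
    timestamp: datetime
    value_kwh: float

# Every application speaks the same model, regardless of vendor:
reading = Reading(
    usage_point=UsagePoint(mrid="UP-0001", service_location="Substation A"),
    timestamp=datetime(2023, 1, 1, tzinfo=timezone.utc),
    value_kwh=12.5,
)
```

The benefit is not the dataclasses themselves but the shared vocabulary: two systems exchanging a `Reading` need no pairwise mapping layer.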

Loose coupling: Utilities should pay down their technical debt as fast as possible. To do so, dependencies on specific applications, vendors, or proprietary platform technologies must be minimized. Loose coupling between systems and services increases exchangeability and flexibility.
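A small sketch of what loose coupling means in practice: callers depend on an abstraction rather than on a concrete vendor system, so one implementation can be swapped for another without touching the consumers. The billing example and its tariffs are invented for illustration:

```python
from abc import ABC, abstractmethod

# Consumers depend on this abstraction, not on a concrete vendor product,
# so the billing engine can be replaced without changing any caller.
class BillingSystem(ABC):
    @abstractmethod
    def invoice_amount(self, kwh: float) -> float: ...

class LegacyBilling(BillingSystem):
    def invoice_amount(self, kwh: float) -> float:
        return round(kwh * 0.30, 2)  # illustrative tariff

class NewBilling(BillingSystem):
    def invoice_amount(self, kwh: float) -> float:
        return round(kwh * 0.28, 2)  # illustrative tariff

def monthly_invoice(system: BillingSystem, kwh: float) -> float:
    # The caller is agnostic to which implementation it receives.
    return system.invoice_amount(kwh)
```

Exchangeability here is literal: migrating from `LegacyBilling` to `NewBilling` changes one line at the composition root, not every call site.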

Modularity: The future utility must be able to quickly adapt to new opportunities, regulations, or market changes. As many utilities still operate huge monolithic applications, they often have to compromise: waiting several months for minor changes, or investing hundreds of thousands of euros for alterations, additional functionality, or extra fields in a user interface. A modular application landscape creates the foundation for future success.
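The contrast with a monolith can be sketched in a few lines: if capabilities are small, registered modules behind one dispatch point, adding a capability means adding a module, not modifying and re-testing the whole system. The registry mechanism and module names below are purely illustrative:

```python
# Each capability registers itself with the landscape; adding one means
# adding a module, not changing a monolith. All names are illustrative.
MODULES: dict = {}

def module(name: str):
    def register(fn):
        MODULES[name] = fn
        return fn
    return register

@module("metering")
def metering(payload: dict) -> dict:
    return {"kwh": payload["kwh"]}

@module("billing")
def billing(payload: dict) -> dict:
    return {"amount": round(payload["kwh"] * 0.30, 2)}  # illustrative tariff

def handle(name: str, payload: dict) -> dict:
    # The core only dispatches; it knows nothing about individual modules.
    return MODULES[name](payload)
```

A new regulation-driven capability would be one more `@module(...)` function, deployable and testable on its own.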

Event driven architecture: As more and more smart meters and sensors are deployed in the electrical network and at the grid edge, utilities must be able to handle a tsunami of big energy data in real time. The traditional process driven integration approach will fall short: orchestrations become too complex, error and exception handling gets extremely complicated, and performance and message throughput will lag. Integration becomes a nightmare for development, operations, and management alike. Implementing the concept of an event driven architecture is key to building a resilient and reactive integration system – a system designed and built to scale as needs change and demands grow.
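The core idea can be shown with a minimal in-process event bus. In production this role is played by a message broker such as Kafka or MQTT; the class, topic names, and threshold below are all illustrative:

```python
from collections import defaultdict

# Minimal in-process event bus, standing in for a real broker.
class EventBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Producers do not know, or wait for, their consumers: new
        # reactions can be added without touching the publisher.
        for handler in self._subscribers[topic]:
            handler(event)

bus = EventBus()
alerts = []

# A consumer reacting to high meter readings (illustrative threshold):
bus.subscribe("meter.reading",
              lambda e: alerts.append(e) if e["kwh"] > 10 else None)

bus.publish("meter.reading", {"meter": "M-1", "kwh": 12.5})
```

The resilience argument follows from the decoupling: a slow or failing consumer does not block the meter publishing its next reading, and scaling means adding consumers, not rewriting orchestrations.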

API driven development: The entire industry is moving towards a distributed energy system forcing utilities to operate a distributed and modular IT and data architecture. API driven or API-first development fosters a modern IT landscape with holistically digitized and automated integrations.
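API-first means the contract is published before any implementation is written, and every implementation is held to it. Real teams would express the contract in OpenAPI or JSON Schema; the hand-rolled contract check and field names below are a simplified stand-in:

```python
# API-first: the contract exists before the implementation.
# (In practice this would be an OpenAPI/JSON Schema document.)
READING_CONTRACT = {"meter_id": str, "kwh": float}

def validate(payload: dict, contract: dict) -> dict:
    for field_name, field_type in contract.items():
        if not isinstance(payload.get(field_name), field_type):
            raise ValueError(f"contract violation on '{field_name}'")
    return payload

def post_reading(payload: dict) -> dict:
    # The implementation is written against the published contract,
    # so consumers can be built in parallel against the same spec.
    reading = validate(payload, READING_CONTRACT)
    return {"status": "accepted", "meter_id": reading["meter_id"]}
```

Because the contract is the source of truth, producers and consumers across the distributed landscape can evolve independently as long as both honor it.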

Cloud native and containerized applications: Many utilities are on their cloud journey, most having set off with a single-cloud strategy. However, as every cloud provider (Microsoft Azure, Google Cloud Platform, Amazon AWS, or others) has its own strengths and drawbacks, utilities are recognizing the benefits of embracing a multi-cloud strategy with cloud native and containerized applications: leveraging multiple cloud or computing services (public cloud, private cloud), but in an integrated and distributed cloud-mesh architecture.

Data Mesh: The future utility is data and analytics driven. Many energy companies have recognized the value of data and identified the potential benefits of implementing a data lake. The “Enterprise Warehouse” approach came with well-known drawbacks: sizeable investments in expenditure and resources to build and manage a centralized, monolithic data warehouse platform. To avoid repeating them, utilities should instead focus on implementing a distributed data lake, also known as a data mesh.
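The organizational shift behind the data mesh can be hinted at in code: each domain team owns and serves its data as a product behind a standard interface, and the “mesh” is just the collection of those products – no central warehouse in the middle. The `DataProduct` class, domains, and rows below are invented for illustration:

```python
# In a data mesh, each domain owns its data and serves it as a product
# behind a shared interface. Class and domain names are illustrative.
class DataProduct:
    def __init__(self, domain: str, rows: list):
        self.domain = domain
        self._rows = rows

    def query(self, **filters) -> list:
        # A deliberately tiny query interface; the point is that every
        # product is consumed the same way, whoever owns it.
        return [r for r in self._rows
                if all(r.get(k) == v for k, v in filters.items())]

# Two independently owned domain products:
metering = DataProduct("metering", [{"meter": "M-1", "kwh": 12.5}])
billing = DataProduct("billing", [{"meter": "M-1", "amount": 3.75}])

# The "mesh" is simply the discoverable set of products:
mesh = {p.domain: p for p in (metering, billing)}
```

Analytics teams combine products across domains through the shared interface instead of waiting for a central team to load everything into one platform.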

I am sure there are several other useful architecture principles. So, I encourage getting a discussion started to establish a set of best-practice architecture principles for utilities.

In my next blog post, I’ll focus on a “Pace layered Architecture” for energy companies and introduce the concept of the “Cognitive Utility”. Ready to read the next post?