The Dirty Job of Data Provisioning in Energy Utilities

Somebody’s Gotta Do It

Dirty Jobs was a TV series on the Discovery Channel in which host Mike Rowe performed the difficult, strange, disgusting, or messy parts of other people’s occupations. Every episode since it first aired in 2005 has opened the same way:

“My name's Mike Rowe, and this is my job. I explore the country looking for people who aren't afraid to get dirty — hard-working men and women who earn an honest living doing the kinds of jobs that make civilized life possible for the rest of us. Now, get ready to get dirty.”

But no one really likes talking about, let alone doing, the boring job of data cleaning, making data accessible, and integrating data: the unappealing yet critical task of data provisioning.

Data provisioning is the process of making data available in an orderly and secure way to users, application developers, and applications. It is a significant challenge for most utilities. With so many widely varying demands for data, the job of data provisioning is more important than ever. Yet the task has many challenges, some unique to the utilities sector. For example, utilities have geographically distributed data sources and an equally distributed workforce. They operate mission-critical systems that must be insulated from uncontrolled access, and they hold large quantities of confidential data that must be kept secure.


The Dirty Job of Data Provisioning

The energy revolution is driven by a lot of cool and innovative developments happening right now: new business models; distributed generation and microgrids; renewables and prosumers; digital twins; digital transformation; artificial intelligence and machine learning.

Most ‘techies’, from big players and startups alike, prefer talking about hot topics such as AI, machine learning, or data analytics. Other popular topics include predictive maintenance, asset health, real-time energy flow analysis, EV charging impact monitoring, peer-to-peer trading, apps, dashboards, and visualizations. You name it; the list goes on.

All these innovations have one thing in common: they are all heavily dependent on data, and it has to be good data. So what do we mean by good data? It must be consistent, complete, current, and accurate. However, few people really talk about or focus on the boring “why” and “how-to” of good data.
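To make those four criteria concrete, here is a minimal sketch in Python of what such quality checks could look like for a single smart-meter reading. The record layout, field names, and thresholds are illustrative assumptions, not any standard or product API:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical meter-reading record; field names and values are illustrative.
reading = {
    "meter_id": "MTR-001",
    "timestamp": datetime.now(timezone.utc) - timedelta(minutes=15),
    "kwh": 42.7,
}

def is_good(record: dict) -> bool:
    """Toy checks mapping onto the four 'good data' criteria."""
    now = datetime.now(timezone.utc)
    # Complete: every expected field is present and populated.
    if any(record.get(k) is None for k in ("meter_id", "timestamp", "kwh")):
        return False
    # Current: recent enough to act on (the 24-hour window is an assumption).
    current = now - record["timestamp"] < timedelta(hours=24)
    # Accurate: within a physically plausible range (bounds are assumptions).
    accurate = 0 <= record["kwh"] < 1_000
    # Consistent: the type matches what downstream consumers expect.
    consistent = isinstance(record["kwh"], (int, float))
    return current and accurate and consistent

print(is_good(reading))  # True for the sample record above
```

In a real pipeline, the consistency check would also reconcile the reading against other systems of record, such as the meter register.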

There has never been a better time to explore the “dirty job” that makes the energy revolution possible. The dirty job that enables all these innovative, data-driven energy services. The dirty job not many, especially tech people, like to do: the dirty job of data provisioning.


Dirty Data strangles Innovation

Utilities are innovating and experimenting with emerging technologies. They are piloting these projects to learn and gain experience. Running a pilot with AI or Machine Learning is one thing. Getting those “first mover” projects into full-scale deployment and generating real business value is much more difficult.

So why is that? For a PoC (proof of concept), utilities typically use a comparatively tiny data set that is compiled, manually extracted, and prepared specifically for the pilot.

This is in stark contrast to the reality of the next phase: the rollout. To realize the promised operational benefits in a real production setting, “good data” is necessary: consistent, complete, current, and accurate data. And enough data to provide the diversity needed for advanced analytics.

And here’s the crux: many utilities struggle big time with “dirty data.” Wikipedia defines dirty data, also known as rogue data, as inaccurate, incomplete, or inconsistent data.

“Rubbish in, rubbish out.” Research shows that organizations typically spend more than 80% of all IT effort just on preparing data and making “good data” accessible. This means spending time and effort fixing the dirty data problem: integrating, cleansing, unifying, and structuring data, and managing access to it. The reality in most utilities today is that less than 20% of IT spending is left for innovation or delivering new functionality.
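To give a flavor of what that 80% looks like in practice, here is a small sketch of one cleansing step using pandas (my choice for illustration, not something the research prescribes): deduplicating records, unifying mixed timestamp formats, and dropping implausible values from a raw meter-reading export.

```python
import pandas as pd  # assumes pandas >= 2.0 for format="mixed"

# Illustrative raw export, as it might arrive via file-based exchange:
# a duplicate row, mixed timestamp formats, and a sensor glitch.
raw = pd.DataFrame({
    "meter_id": ["MTR-001", "MTR-001", "MTR-002", "MTR-002"],
    "timestamp": ["2023-05-01 12:00", "2023-05-01 12:00",
                  "01.05.2023 12:15", "2023-05-01 12:30"],
    "kwh": [42.7, 42.7, -5.0, 40.1],
})

cleaned = (
    raw
    # Unify: normalize the mixed timestamp formats into one representation.
    .assign(timestamp=pd.to_datetime(raw["timestamp"], format="mixed", dayfirst=True))
    # Cleanse: remove exact duplicates and physically implausible readings.
    .drop_duplicates(subset=["meter_id", "timestamp"])
    .query("kwh >= 0")
    # Structure: a predictable ordering for downstream consumers.
    .sort_values(["meter_id", "timestamp"])
    .reset_index(drop=True)
)
print(cleaned)
```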


Aging IT - One of the Causes of Dirty Data

Utility data has enormous potential value. It contains an abundance of information that can be harnessed for new service innovations. Too often, however, it is collected inefficiently, and much of that value falls through the cracks.

There are many causes of dirty data. User errors, too many manual routines, poor communication between departments, and skill shortages are all contributing factors. However, an outdated IT landscape and the typical ‘spaghetti’ integration approach are often the crucial issues that must be resolved.

Many energy companies operate monolithic, integration-unfriendly applications built on proprietary technology from the big vendors. Utilities also have a history of taking shortcuts, especially in the field of integration, for example using file-based data exchange or batch-style point-to-point integrations with a lot of manual data crunching.
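Part of the reason the spaghetti approach ages so badly is simple combinatorics: with point-to-point integration, every pair of systems can end up with its own bespoke interface, while a hub needs only one connection per system. A back-of-the-envelope sketch:

```python
def point_to_point(n: int) -> int:
    # Worst case: every pair of systems gets its own bespoke interface.
    return n * (n - 1) // 2

def hub_and_spoke(n: int) -> int:
    # Each system connects once, to the central hub.
    return n

for n in (5, 10, 20, 40):
    print(f"{n:>2} systems: up to {point_to_point(n):>3} point-to-point "
          f"interfaces vs {hub_and_spoke(n):>2} via a hub")
```

At 40 systems, not unusual for a utility application landscape, that is up to 780 interfaces to build and maintain versus 40.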

Aging IT landscapes and mindsets are the perfect breeding ground for dirty data. They also provide a hostile environment for any data-driven innovation.


Let’s make Data great again

Heading a Big Data iPaaS company, I have tried many different pitches over the last few years to explain how our Utilihive platform can accelerate our utility clients’ digital transformation. As CEO and Co-Founder, I’m a passionate believer in the benefits that a flexible, event-driven, cloud-native integration platform can bring to utilities. A digital integration hub enables utilities to integrate and access all of the data they need to innovate and be successful in the ongoing energy revolution.
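To show the pattern in its barest form (this is a toy, in-memory sketch of publish/subscribe, not Utilihive’s API), the core idea of an event-driven hub is that source systems publish an event once and any number of consumers subscribe, without the source knowing who they are:

```python
from collections import defaultdict

class EventHub:
    """Toy in-memory publish/subscribe hub; illustrative only."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, event):
        # The publisher is fully decoupled from whoever consumes the event.
        for handler in self._subscribers[topic]:
            handler(event)

hub = EventHub()
# Consumers (e.g., analytics and billing) subscribe independently.
hub.subscribe("meter.readings", lambda e: print("analytics received:", e))
hub.subscribe("meter.readings", lambda e: print("billing received:", e))
# The source system publishes once; both consumers are served.
hub.publish("meter.readings", {"meter_id": "MTR-001", "kwh": 42.7})
```

A production-grade hub adds durability, security, schema management, monitoring, and scale on top of this pattern, which is exactly where a cloud-native integration platform earns its keep.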

Most innovative utilities understand the value of “good data” and the importance of a modern digital platform that handles the data provisioning challenge effectively. That is why, whenever I meet new clients or partners, I open with:

“My name is Thorsten, and this is Greenbird’s job: Data Provisioning. We aren’t afraid to get dirty. We solve your integration challenges. We clean data, we structure data, and we make data easily accessible. We do the dirty 80% job. Now get ready to innovate!”

So now you might be thinking, “Yes, that all makes sense, but how do I do this and what steps should I take?” My next article will focus on the “how” and the “what.” I will share lessons learned from engagements with many utilities around the world, highlighting CIOs’ and CDOs’ top priorities, how they drive their digital transformation, and which capabilities they see as the core components of a future-proof digital platform.

Stay tuned.

About the Author

Thorsten Heller is the CEO, Co-Founder, and Chief Innovator at Greenbird Integration Technology AS, provider of Utilihive, the Operating System for Smart Energy Services and the utility-domain-specific iPaaS empowering the SmartGrid, SmartCity, and Industrial Internet of Things.

Follow Thorsten on LinkedIn and Twitter.
